The Strange Connection Between Germs and Sherlock Holmes

The Remedy author Thomas Goetz on big data, germs and Sherlock Holmes.

In his latest book, The Remedy, Thomas Goetz traces the rise of medical science in the late 19th century. Today, he says, we’re entering a new era of discovery. Photograph: Jason Madara

Sherlock Holmes was as much a scientist as a detective. Maybe that’s because his creator, Arthur Conan Doyle, was influenced by a detective of science: Robert Koch, a German doctor who helped prove the existence of germs. In his new book, The Remedy, Thomas Goetz traces connections between the two men. The former executive editor of WIRED, Goetz now runs Iodine, a company he cofounded with ex-Google engineer Matt Mohebbi. The startup’s goal is to help people make better decisions about medications by combining their personal experiences with clinical research—in other words, giving us the tools to be our own medical detectives. We sat down with Goetz to learn more.


Why was Robert Koch so important?
He was a particularly modern kind of scientist—his genius was both in making discoveries and in crafting the process and tools to enable discovery. His great claim to fame was discovering the bacterium that causes tuberculosis—the biggest killer of his time. But he also invented new microscope techniques that allowed him to do things like take photographs of what he saw, enabling scientists to actually prove they’d found the microbe. Even more important, Koch proposed a sequence of proofs known as Koch’s postulates, which created a process for proving causality. Those three things—the tools, the discoveries, and the method—pushed away the last doubts about the germ theory of disease and ushered in a new age of science.

Where does Arthur Conan Doyle come in?
As Koch’s discoveries began to gather the world’s attention, an ambitious doctor in rural England read about what Koch was doing and was inspired. This was Conan Doyle. Koch’s rigor, his method, his affinity for the microscope—these were all characteristics that Doyle would soon incorporate into the fictional detective he was cooking up, Sherlock Holmes. In the first stories, Holmes is very much a scientist at work, and you can see in his devotion and single-mindedness the same traits that drove Koch.

In The Remedy, I was struck by the fact that the stature of scientists was not so high in the late 1800s.
There were so many different methods for science that it was difficult to establish credibility. But the process of moving from discovery in a lab to something that actually convinces society—that was really pioneered by these germ theorists and bacteriologists.


The 3 Ages of Medicine
Just weeks after Robert Koch claimed he had discovered a remedy for TB in 1890, doctors all over Europe began testing the mystery substance on patients. That would be unthinkable in today’s world of randomized, double-blind, placebo-controlled trials, which often take years. But now, says Thomas Goetz, author of The Remedy, we’re due for a new age of medical research.
—Jason Kehe

The Free-for-All:
1850s–1940s
Although the first clinical trials (on citrus fruits and scurvy) date back to the 1740s, for centuries doctors acted without standards. Often that meant giving untested drugs to patients willy-nilly, based on the hope they just might work. The result was chaos—“closer to anarchy than experimentation,” Goetz writes.

The Rise of Trials:
1940s–2010s
If you believe a cure will work, as Koch did with his remedy, that bias can skew your results. So scientists began adopting a new protocol: the randomized, double-blind, placebo-controlled trial, in which neither investigators nor patients know who’s getting the treatment and who’s getting a placebo. But while this method remains the gold standard—its rigor and years-long review process ensure credibility—it isn’t designed to account for unique experiences.

Beyond the Lab:
Post-2010
A few hundred subjects is a good-size trial—but there are billions of people in the world. Advances in computing power now make it possible to analyze data at that scale. Goetz’s startup, Iodine, aims to combine traditional research findings with self-reported data it collects (say, a woman’s experience with a certain birth control pill). “It can be messy,” Goetz says, “but that information measures what’s happening in the real world, in real time.”

Koch’s ability to take photos of germs, to visualize them, was crucial to convincing people they existed. With Iodine, it seems like you’re up to something similar—visualizing health data to make it useful, right?
Whenever people face a decision about a drug—from finding the right dose to dealing with side effects—data can guide them to a good choice. There’s a great body of clinical research about medications: Pharmaceutical companies spend years and billions of dollars testing drugs to prove safety and efficacy. But ultimately, those experiments are only approximations of the truth. There are clinical numbers—say, 14 percent of people taking a drug had dizziness—and then there’s the reality of what that dizziness actually felt like. That pragmatic reality is what Iodine captures. And the combination of clinical evidence and user experience is something we believe will allow people to make better decisions for themselves.
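To make that pairing concrete, here is a minimal sketch in Python. It is purely illustrative: the drug name, the user reports, and the data structure are hypothetical, not Iodine's actual system. It simply places a clinical-trial incidence rate next to a rate computed from self-reported experiences of the same side effect.

from dataclasses import dataclass, field

# Hypothetical sketch: pair a clinical-trial incidence rate with
# self-reported user experiences so the two can be compared side by side.
# All names and figures here are illustrative, not real Iodine data.

@dataclass
class SideEffectSummary:
    drug: str
    effect: str
    clinical_rate: float   # e.g. 0.14 = 14% reported dizziness in trials
    self_reports: list = field(default_factory=list)  # True/False per user

    def real_world_rate(self) -> float:
        """Fraction of self-reporting users who experienced the effect."""
        if not self.self_reports:
            return float("nan")
        return sum(self.self_reports) / len(self.self_reports)

# Example: the interview's 14 percent dizziness figure next to eight
# made-up user reports for a placeholder drug.
summary = SideEffectSummary(
    drug="drug_x",
    effect="dizziness",
    clinical_rate=0.14,
    self_reports=[True, False, False, True, True, False, False, False],
)
print(f"Clinical trials: {summary.clinical_rate:.0%} reported {summary.effect}")
print(f"Self-reports: {summary.real_world_rate():.0%} of {len(summary.self_reports)} users")

The point is only the juxtaposition: a trial statistic and lived experience describe the same side effect from two different angles.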

How does that work in practice?
With something like birth control pills, it can take a woman a year or more to find something that really works well for her, without side effects like mood swings or decreased libido. These side effects can be very personal, and a woman trying to make a choice is often operating in isolation. If she could compare her experience to clinical trials across a range of other hormone treatments, and even read what other women have experienced, she could find a different method that might work better for her. That’s important information, but you’re so often in the dark.

But that seems in direct opposition to what Koch was doing—he got us away from using personal anecdotes as data.
Not exactly. Koch and other scientists knew there were these things, germs, but they just couldn’t convince people. But guys like Joseph Lister, who pioneered sterile surgery, were able to make the case through a pragmatic argument. They were able to say that whatever the mechanism, fewer people die in surgeries with Lister’s procedures than without. There still wasn’t an ironclad causal explanation. The hygienists were making this pragmatic argument for better sanitation, often on completely the wrong hypotheses. But once germ theory gave the hygienists some scientific credibility, hygiene began to take off, and that’s when TB cases, which are a good proxy for general hygiene, really started to drop. By the time antibiotics came along 60 years later, TB cases were a fraction of what they had been 50 years before.

Even so, it took decades for germ theory to change public behavior. You write about how public spitting was rampant and a source of disease spread.
I love the spit stuff: Spitting was normal, and laws had to be passed to prevent it—all these major US cities had antispitting societies.

Is there anything that we’ll look back on a hundred years from now and think, wow, why were we doing that?
Even 50 years from now we will be fascinated that we put so much effort into creating science without making much effort to target that science at the people who need it. The fact that we are not mapping evidence to the individual is staggering. That’s the big opportunity. It seems like common sense to me, but it barely exists yet.