Last November, researchers in China announced they’d made an alarming discovery: a new bacterial superbug lurking in the food chain.
Through routine animal testing, they found a high proportion of E. coli samples that were resistant to colistin, an antibiotic used as a last line of defense against the deadliest infections. The samples, which had come from a commercial pig farm near Shanghai, confirmed that after years of dumping tons of the drug into animal feed, a strain of the bacteria had emerged that could no longer be killed by one of the most effective, if toxic, antibiotics in existence.
That colistin was no longer effective against this strain of E. coli wasn't even the worst—or most surprising—part of the news. The researchers also found the bacteria had acquired a new gene, mcr-1, that allows the resistance to jump from cell to cell, strain to strain, and even between different species of bacteria. When paired with more aggressive microbes like Klebsiella, a genus of bacteria commonly found in hospitals that can cause pneumonia, this gene could help create organisms resistant to all known treatments.
"This is a watershed moment," says Dr. Yohei Doi, an expert on antimicrobial resistance at the University of Pittsburgh who assisted the researchers in China. Along with the larger infectious disease community, Doi has been warning for years that the overuse of antibiotics in both agriculture and medicine is pushing us toward a future in which routine infections become harder to fight and more often fatal. Unlike other types of drugs, antibiotics can lose their potency over time as the microbes they are designed to defeat mutate—and the mcr-1 gene is the latest example. "What's really different this time," says Doi, "is how quickly and easily this gene can transmit from one type of E. coli to another."
The emergence of mcr-1, which has already been detected on at least four continents and in 18 countries, underscores a larger issue confronting infectious disease experts and the medical community more broadly. For much of the 20th century, the development of new antibiotics dominated the field and kept pace with the illnesses they were designed to treat. Patients, for the most part, received the lifesaving care they needed, but there was also a downside: Other types of therapies—especially those that worked to boost the immune response—fell by the wayside.
As the risk of antibiotic resistance becomes more acute, a small but growing group of scientists, doctors and medical researchers is trying to shift some of the attention and funding back to projects that examine how human immunity can be harnessed to combat fatal infections. They aren't advocating against developing new antibiotics, which they readily acknowledge will be necessary to confront the superbugs that kill an estimated 700,000 people worldwide each year. However, it's shortsighted, they contend, to focus narrowly on destroying the bacteria with drugs while overlooking the fact that antibodies—the body's natural defense mechanism—have a critical role to play. "The bottom line is that the bacteria now develop resistance to anti-infectious agents faster than we can develop the anti-infectious agents," says Dr. Jean-Laurent Casanova, a professor at Rockefeller University who studies how genetic coding can make a person more susceptible to disease. "If we continue to rely solely on antibiotics, we are going to have a problem."
Different classes of antibiotics kill bacteria in different ways. Some destroy the cell wall, while others interfere with part of the metabolic process. Antibodies, on the other hand, are proteins that work in a number of ways to clear infections. They bind to bacteria and make it easier for white blood cells to ingest them. They can induce other proteins in the blood, known as complement proteins, to cover the surface of the foreign invaders and again make it easier for white blood cells to dispose of them. Once it's recruited, the complement system itself can eliminate certain organisms by punching holes in their cell membranes. These processes don't usually harm the antibodies, which can continue hunting for as long as necessary. Thus, the immune system attacks bacteria in multiple places, making it nearly impossible for them to evolve and become resistant.
How we get from having the bug to having the disease is not well understood. Since the 1860s, germ theory—the idea that ultimate responsibility for an illness lies with the pathogen—has monopolized the discussion. But while a microbe is clearly needed for an illness to take root, not everyone who carries a given disease-causing organism gets sick from it. "Our current way of thinking about it is outdated," says Dr. Liise-anne Pirofski, chief of the infectious diseases division at the Albert Einstein College of Medicine. "We know that there are many instances where the same microbe is totally innocuous in one person but kills another." For example, only 1 in 1,000 infected children develop life-threatening malaria. Fewer than 10 percent of people carrying the tuberculosis bacterium develop the full-blown disease. During the flu pandemic of 1918, more than 90 percent of those who got the virus lived. In addition, scientists know that sometimes the problem isn't the bacterium itself but the toxins it releases as it reproduces within its human host. Antibodies also have an advantage here, as they can help clear the poison from the body. Antibiotics can kill only the bug.
"The entire mindset has been, 'Kill the bug. Kill the bug. Kill the bug,'" says Dr. Arturo Casadevall, a microbiologist and immunologist at Johns Hopkins University. "The field of infectious disease is essentially stuck. Now we are paying the price." As drugs like colistin become less effective, the number of deaths attributed to drug-resistant organisms is expected to rise to 10 million a year by 2050.
And yet, the public health response to the growing problem of drug-resistant bacteria remains highly focused on speeding up the process of getting new antibiotics through the regulatory pipeline. In March 2015, the White House released a wide-ranging national action plan that emphasized the need for new drugs. The House of Representatives passed legislation in July that would allow pharmaceutical companies to conduct shorter and smaller clinical trials for antibiotics in the hope that new medications would get to market faster. In 2016, the National Institutes of Health (NIH) plans to spend $461 million studying antimicrobial resistance—an increase of $100 million over last year. Only a small fraction of this attention has been given to antibodies, vaccines and other potential treatments.
The good news is that in the last five years there has been a broader resurgence in antibody research, says Dr. Brad Spellberg, a leading expert in drug-resistant infections at the Los Angeles County-University of Southern California Medical Center. Spellberg's lab is trying to harness antibodies to defeat deadly pathogens, including Acinetobacter, a genus of bacteria found in hospitals that is resistant to most antibiotics available today. This NIH-funded project has already yielded several promising antibodies, at least one of which has protected mice from lethal bacteria. Then there's the groundbreaking research being done by SAB Biotherapeutics, a small firm in South Dakota that is breeding cows that produce human antibodies. Company officials say that when injected with pathogens like methicillin-resistant Staphylococcus aureus, more commonly known as MRSA, these cows make effective antibodies, which SAB's researchers believe could, in the future, be given to humans to fight off the infection. MedImmune, a biotech firm based in Maryland, is conducting clinical trials on an antibody that targets a toxin produced by Staphylococcus bacteria.
Two months after the discovery of the colistin-resistant E. coli was reported, the NIH announced it was spending $5 million on 24 programs aimed at developing what it is calling "nontraditional therapies" for antibiotic resistance. Spellberg estimates that there are a few dozen other university facilities and research firms—mostly in the biotech sphere—doing similar work. The large pharmaceutical companies have yet to see a big breakthrough that proves they can turn a hefty profit. That may soon be changing. "For a long time, there was little academic interest in antibodies for bacterial infections because every time resistance would catch up with the drugs we had, a company would come up with the next antibiotic," says Spellberg. "But the drug companies have decided antibiotics aren't profitable enough and have largely shut down their work in this area. So people are starting to think maybe it's time to take another look at immunotherapies."
There is nothing novel about this approach. Antibodies have been used since the late 19th century, when immune-boosting serums were developed to treat patients with diphtheria and tetanus, which at the time claimed thousands of lives annually. (The first Nobel Prize for medicine was awarded in 1901 to Emil von Behring for his work in this field.) By 1910, researchers in New York City had developed an antibody serum for pneumonia that would later catch the attention of the state's health commissioner, Thomas Parran Jr. After Parran became surgeon general in 1936, he funded a nationwide pneumonia control program that distributed the serum to people in roughly half of the states.
The interest in serums declined throughout the 1930s. Small-scale production of penicillin, the first widely used antibiotic, started in the early 1940s. As the U.S. prepared to enter World War II, the federal government partnered with the big pharmaceutical companies to expand manufacturing capacity. It was a Manhattan Project–sized effort. By the end of the war, antibiotics had entered the mainstream of American life, revolutionizing the practice of medicine. All of a sudden, doctors could cure patients of diseases that had often proved fatal. "They were a magic bullet," says medical historian Dr. Scott Podolsky. "Antibiotics ushered in the golden age of medicine."
It's this powerful legacy that the proponents of immunotherapies are up against. Antibiotics were one of the most important discoveries of the 20th century. Convincing the research community that a 19th-century idea is the future of medicine is a difficult task, but the discovery of the mcr-1 gene may be the warning sign that jump-starts the conversation.