From 2008 to 2012, laboratories conducting research on potential bioterrorism weapons logged more than 1,100 accidents. That figure, chronicled in government reports obtained by USA Today, includes a case where two animals were accidentally infected with hog cholera, a viral disease that hasn’t been found in the U.S. since 1978. In another case, an uninvolved cow residing on a nearby farm became infected with brucellosis, a bacterial disease that can be passed to humans through dairy products. In about half of the incidents, lab workers had to be medically screened or treated after accidental exposure to a mishandled toxin, and in five cases, lab workers were infected or sickened. (All lab workers recovered.)
The Centers for Disease Control reports don’t include many details; federal bioterrorism laws prohibited the names of the errant labs and most specific information about the mishaps from being disclosed. But the sheer number of accidents adds volumes to the genre of doomsday headlines chronicling high-stakes bio-error.
Just last month, live samples of the deadly and highly infectious smallpox virus were found, forgotten, in a federal storage room. Another 327 vials of viruses that cause dengue fever and influenza, as well as the bacterium that causes spotted fever, were found along with them.
The same month, a CDC lab accidentally shipped a sample that had been contaminated with H5N1 bird flu, the strain that has killed 386 people since 2003. A month prior, dozens of CDC lab workers were potentially exposed to live anthrax. The lab director resigned shortly after.
Lab accidents are nothing new--there is significant evidence linking a lab accident to the re-emergence of H1N1 flu in 1977, for example. But recent fumbles bring up questions of lab regulation in an era of “gain of function” research, where researchers try to tweak germs to give them some new functionality, like the ability to jump from animals to humans, or to make an airborne virus out of one that is normally transmitted via bodily fluids.
This year, several experts have called for a moratorium on gain of function research, citing the potential for an accidental release to produce catastrophic results.
Marc Lipsitch, of Harvard University, and Alison P. Galvani, of Yale, argued in May that experiments tweaking H5N1 bird flu to be more contagious in mammals were just too risky. They cited estimates that put the rate of lab-caused infections in the U.S. at two per 1,000 lab-years of research. At that rate, a 10-year gain of function study with 10 labs involved--100 lab-years of work in total--would have a roughly 20 percent chance of infecting a lab worker. That worker could set off a chain of transmission, potentially sparking a pandemic of a brand new strain of virus.
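That back-of-the-envelope figure can be checked in a few lines of Python, using the numbers quoted above and treating each lab-year as an independent trial (an assumption of this sketch, not a claim from the Lipsitch and Galvani analysis itself):

```python
# Probability of at least one lab-acquired infection over a study,
# assuming each lab-year carries an independent risk of infection.
rate_per_lab_year = 2 / 1000      # two infections per 1,000 lab-years
labs, years = 10, 10
lab_years = labs * years          # 100 lab-years of research

p_at_least_one = 1 - (1 - rate_per_lab_year) ** lab_years
print(f"{p_at_least_one:.1%}")    # prints 18.1% -- "roughly 20 percent"
```

The complement trick (one minus the chance of *no* infection in any lab-year) is what turns a tiny per-year rate into a substantial cumulative risk over a long, multi-lab study.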
Viruses mutate all the time. That’s why new influenza vaccines come out every year--they’re developed over and over to respond to new strains that emerge. Part of the rationale for gain of function research is that, if scientists can anticipate how a virus might mutate years in advance (or anticipate a mutated strain that could be used in a bioterrorism attack), they can be ready with a corresponding vaccine.
But “it is unclear whether that rationale really holds up to scrutiny,” says Nick Bostrom, a professor at Oxford who studies potential existential threats to humanity. Developing a vaccine requires the presence of an exact strain, and predicting what strain might emerge from nature is not an exact science, so “it is not so clear how useful an artificial mutation would be.”
“Lab accidents happen. It is almost a routine occurrence,” Bostrom says, adding that if published gain of function research were to end up in the wrong hands, the result could be a pandemic.
The question of whether to conduct or publish such research also worries Rebecca DuBois, a structural biologist studying viral proteins at the University of California Santa Cruz.
“I think it needs to be a question, always, in the back of [a researcher’s] mind: should they do this research, and would the research really benefit the health of the public, and is that benefit stronger than the potential risks of that research?”
But, she adds, regulatory procedures for this type of research are already quite stringent. “We do have very lengthy approval processes before this kind of research gets done. There’s peer review of funding for this kind of research, and there are all sorts of approvals required… Any people who use [high-risk] facilities have to go through major training and get clearance from the Department of Justice.”
DuBois notes that the highly controversial H5N1 bird flu research was actually done with a strain that could be warded off with Tamiflu, a powerful and readily available flu treatment, but that detail was lost in the public panic.
“I feel like sometimes the media skews this kind of research as scientists just doing crazy experiments to see what happens. That is not the case. They’re highly planned and done in the best facilities we have and they were already weighed to have great benefits,” DuBois says. “It is important to recognize that.”