Spilled smallpox, missing SARS, and rogue scientists with mutant H1N1. If you’re not scared, you should be.
In every one of the dozens of bioterrorism meetings I have attended over the last three decades, experts have stated unequivocally that the worst-case outbreak scenario would be smallpox in the hands of bad guys. And the other alarming microbial possibilities that followed? Anthrax always ranked somewhere in the top five.
And this is why, as political leaders were told repeatedly over the years, scientists needed to store these dangerous microbes in high-security labs, conducting their work with these germs under the tightest security possible to protect the public. The safest laboratories, we were repeatedly informed, were the Ivanovsky Institute of Virology in Moscow, the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID), the National Institutes of Health (NIH) in Bethesda, Maryland, and the Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia.
Safe there -- nothing to worry about.
But last week, while most Americans were taking time off to celebrate the July Fourth holiday, scientists at the NIH and CDC were secretly scrambling to identify, seal, and transport six vials of smallpox that had been discovered in an NIH storeroom, tipped over on their sides, cotton stoppers protruding.
Now that the security of all of these facilities has been proven -- to put it politely -- "flawed," it seems wise to rethink the larger notion of "biosafety" in our time of gain-of-function research, synthetic biology, and directed evolution. As I recently laid out during a TEDx talk, we are hard pressed to demonstrate that public safety is intact for the organisms we know, like smallpox and anthrax, much less for the new, previously unknown ones that are being created now in less secure facilities, like high school labs.
The revelation that smallpox vials have gone unnoticed inside the NIH for more than half a century is gobsmacking. After the anthrax and 9/11 attacks of 2001, every law enforcement agency and government laboratory was tasked by the Bush administration with tracking down a list of "special pathogens," with smallpox ranking No. 1 on that inventory. Labs across the nation were turned upside down with inspections and security provisions under the 2001 Patriot Act and subsequent anti-terrorism laws. Overall security was greatly increased at all government labs -- especially at the NIH. If any research facility in the entire United States should have passed muster, it was the NIH.
Apparently, not so.
Of course, USAMRIID would prove to be the first lab compromised, as its Maryland facility came under scrutiny for possible culpability in the anthrax mailings of 2001. As the FBI bore down upon the Army lab, it was revealed that dozens of dangerous pathogen samples were unaccounted for; researchers were in the habit of taking samples with them as they traveled or relocated to other facilities, and record-keeping was sloppy, at best. Eventually, the FBI concluded that the anthrax mailings were carried out by a USAMRIID biologist, Bruce Ivins, who committed suicide on July 29, 2008. As I detailed in my 2011 book, I Heard the Sirens Scream, controversy continues to shroud every aspect of the FBI investigation except, perhaps, one: the fact that USAMRIID was incapable of providing reasonable biosecurity.
The gaping holes in the USAMRIID safety net were all the more incredible given that the institution was involved throughout the 1990s in efforts to demilitarize the old Soviet Biopreparat program, the world's largest biological weapons effort, built largely between 1972 and 1988. In those facilities the Soviets weaponized smallpox and other microbes, making them deliverable to large populations inside missile warheads, sprayed from the back of airplanes, or in food and water supplies. Chief among this arsenal of life forms were smallpox and anthrax. The very existence of Biopreparat was unknown to the non-Soviet world until British intelligence discovered it, and then-Prime Minister Margaret Thatcher disclosed its existence to President George H.W. Bush. By then the Soviet Union had collapsed, Boris Yeltsin was running Russia, and the very existence of the Russian Federation was tenuous. American smallpox expert D.A. Henderson leapt to his phone to call the White House when he watched on television the tanks of an attempted coup roll towards Yeltsin's quarters, passing the Ivanovsky Institute. Under an agreement hatched in 1978 with the eradication of smallpox -- a pact Henderson had helped create -- all remaining samples of the smallpox virus were to be stored under maximum security in just two places: Russia's Ivanovsky Institute and the CDC's Biosafety Level 4 (BSL-4) laboratory in Atlanta, Georgia. Henderson saw those rebel tanks rolling past the Ivanovsky and thought, "Oh, my God, the smallpox!"
Only then did the World Health Organization and U.S. government learn that Russia's smallpox samples had long since been secretly moved out of the Ivanovsky, to a Siberian lab known as VECTOR -- a key research component of the Biopreparat weapons program. In that enormous facility located in the woods outside Novosibirsk, which I visited in 1997, Soviet scientists tried to make smallpox, which kills about a third of the unimmunized people it infects, into an even deadlier virus.
At least, government officials reassured themselves, the smallpox stored in the United States was safe. Locked inside the ultra-high security CDC Special Pathogens Laboratory, surely it was secure. By 2000, CDC officials were telling the Clinton administration that there was no problem with the safety of stored samples, but that the main defensive countermeasures in the event of a smallpox attack -- vaccines -- were an issue, amid evidence of deterioration of the nation's ancient smallpox vaccine supplies. So the old viruses were pulled out of the freezers, and a quiet program to generate new vaccine supplies -- a smallpox response plan -- was initiated. That program went into high gear after the 2001 anthrax mailings, when then-Secretary of Health and Human Services Tommy Thompson guaranteed on national television that every American would have a dose of smallpox vaccine with their name on it. Thompson was responding to "60 Minutes" interviewer Mike Wallace, who asked whether America was safe, and Thompson sought to reassure the nation with promises of new smallpox vaccines.
In the nearly 13 years since somebody stuffed anthrax spores into envelopes mailed to news organizations, U.S. Senators, and the Supreme Court, among others, the U.S. government has been at great pains to ensure the security of scientific research conducted on a list of dangerous pathogens. Moreover, U.S. officials have urged other nations to follow our example, building BSL-4 and slightly lower security BSL-3 labs, and limiting scientists' access to "special" microbes. American leaders have used numerous types of pressure, including export control laws and trade agreements, to push other nations into raising their biosafety standards to meet ours. And chief among the standard-bearers have been the NIH and CDC.
So it was shattering -- or ironically amusing, depending on one's point of view -- to learn that anthrax research at the CDC as recently as June may have resulted in lab slips that exposed about 85 employees to the microbe. The anthrax leak is still under investigation, and both the scale of exposures and the danger inherent in the biologically altered strain of anthrax have dropped considerably with each CDC pronouncement since the first on June 18. Nevertheless, the CDC -- the gold standard for lab safety -- had a breach involving the very organism that spawned our modern era of maximum lab security.
A month earlier, the premier French infectious diseases lab, Institut Pasteur, went on alert over missing tubes of SARS virus. That was preceded by a lab accident in India involving buffalopox -- smallpox's counterpart in buffaloes. A long list of lab accidents in high-security facilities inside the United States dates back to 2006. The latest revelations regarding long-forgotten vials of smallpox at the NIH only drive the final nail into the coffin of confidence in lab biosecurity.
The CDC anthrax leak and NIH smallpox incident come on the heels of an at least equally worrisome microbe effort from University of Wisconsin scientist Yoshihiro Kawaoka. His June announcement that he had created mutant forms of the H1N1 influenza virus that can evade the human immune system conjured immediate concerns about the safety of storing a pandemic-potential virus in an academic laboratory. The H1N1 flu spread widely around the world in 2009, causing the largest influenza pandemic since the 1968 so-called Hong Kong Flu. Fortunately, the 2009 flu wasn't terribly virulent. Kawaoka's experiments, executed in a BSL-3 lab over the last four years, resulted in a form of that flu that humans could not fight off. The same lab reconstructed the 1918 flu, which is conservatively estimated to have killed 60 million people worldwide in 18 months. These man-made microbes reside inside university freezers: Could they escape their laboratory confines? Kawaoka, of course, says no, but no doubt the CDC said the same of anthrax, and the NIH of smallpox.
From 2011 to 2012, Kawaoka was involved in another controversy, this one over a far more virulent form of bird flu, H5N1. Fortunately, the H5N1 virus rarely infects people, but when it does, some 60 percent of them die of the disease. The H1N1, in contrast, killed less than 0.5 percent of the people it infected in 2009. Two labs, working independently, did similar experiments, altering the bird virus to make it capable of infecting mammals (including, presumably, humans). The other lab was located at Erasmus Medical Center in Rotterdam, the Netherlands, run by Ron Fouchier. Those H5N1 experiments -- so-called "gain-of-function" research -- proved enormously controversial. But after some months of debate within the scientific and global health communities, the work was published, and Fouchier and Kawaoka carried on. And then a lab in Harbin, China, genetically manipulated 127 versions of the H5N1 flu, five of which spread airborne between guinea pigs. And as new strains of flu have naturally emerged, such as H7N9 in 2013, Fouchier, Kawaoka, and their flu colleagues have called for further gain-of-function research.
Gain-of-function research is becoming a norm in biology, as researchers, aided by synthetic biology, now ask "what if" questions all the time. For example, the plague microbes behind both the Black Death of the 14th century and the Justinian Plague of the sixth century have been sequenced, and it is now possible to swap gene sequences between them to see which might be the more dangerous or transmissible. The "Philadelphia cholera" that swept the world in 1849 with devastating consequences has been fully sequenced and is now being genetically manipulated to better understand why it was so deadly.
The first synthetic cell was made four years ago. It is now possible to change the genetics of microbes so swiftly and easily that the manufacture of novel micro-creatures is grist for high school, undergraduate, and amateur research mills. Indeed, in recent months the pace of such work has been breathtaking. One college student brought a long-frozen prehistoric plant to life. An entire yeast chromosome was man-made earlier this year, and it functioned in living, replicating cells. A team of researchers made two new DNA letters -- bases they dubbed X and Y, which pair with each other. Then they built DNA from the normal A, T, C, and G alphabet of genetic notes plus the new X and Y notes, creating a novel life form that self-replicated. Work is now underway on "genetic editing" of primates -- perhaps one day to include human beings -- as well as food crops. Living computers, based on "genetic circuits," are just around the corner.
Last year I proposed a series of measures governments ought to follow to improve biosafety, not only for the old standbys of smallpox and anthrax but for the new and man-made organisms. I described the revolutions now unfolding in biology, and the dangers of ignoring their inherent risks. While advocating in no uncertain terms that synthetic biology work ought to go forward, I called for less cavalier attitudes in both the scientific and government regulatory communities. USAMRIID has called for "meaningful standards" to guide all synthetic biology and gain-of-function work. Harvard University's Marc Lipsitch has proposed alternatives to gain-of-function methods and means for answering major public health questions about potentially dangerous naturally arising viruses.
The German government's Ethics Council has called for stronger guidelines for such research in Europe. The OECD has been trying to harmonize regulatory standards for biosecurity across Europe, North America, Japan, and other wealthy nations.
For more than 30 years, I have listened to the biosecurity debate in meetings large and small. Assurances have been made. Comfort was taken in the motion detectors surrounding the CDC's matryoshka doll-like Special Pathogens facility, a building-within-a-building that nests smallpox and its deadly ilk deepest inside. The wise men and women of the NIH calmed nerves with their own special security precautions.
My worries are not appeased, my nerves are not calmed, and yours ought not be, either.