Science is known to be fatal; it kills people — this is all but a cliché. World War I was the chemists’ war: chemists developed chlorine as a bleach and a disinfectant, then turned it into chlorine gas, which flooded (along with other gases) into enemy trenches. World War II was the physicists’ war: physicists studied atomic fission to understand the constituents of matter, then turned it into the atom bomb. World War III has several possible scientific sponsors, but the current frontrunners seem to be biologists. Among other things, while trying to fight infectious diseases, they’ve just published the mutations by which a lethal virus could be spread more widely. What is it that we haven’t learned? What’s so hard about this?
Last week, Science magazine, after months of deliberations and changed minds and variously expressed opinions, not only published the mutations made in the H5N1 virus, but also discussed the problems raised by humanitarian science that can be turned into military technologies. The jargon for these technologies is “dual use.” Dual use. Can be used for good, can be used to kill, either one. This choice is not new.
The discussion isn’t new either, or easy: for starters, scientific openness vs. national security, ease of replication vs. size of risk, the logistical swamp of previewing publications, the utter impossibility of predicting what science might end up dual-use, the near-impossibility of keeping scientists’ mouths shut. Biologists had in fact learned from the chemists’ and physicists’ wars and had created a board called the National Science Advisory Board for Biosecurity (NSABB) to weigh the issues of dual use research. But the issues truly are insanely complex and the board, though it recommended publication of the H5N1 data, has no settled policies. Biologists have no consensus on oversight of dual use research, said a lawyer at the end of an article in Nature magazine, and governments in the past “have simply kicked the can down the road.”
But surely, I thought, couldn’t biologists just look at how physicists had solved the problem? Hadn’t physicists formed something like the NSABB long since?
No, they hadn’t, said Alex Wellerstein, an historian at the American Institute of Physics. Nothing like it exists now, he said, though in the days before the Manhattan Project, physicists did have a kind of NSABB analog. Leo Szilard, who had figured out that fissioning atoms implied a bomb, argued that his community should censor itself. When his argument failed and the knowledge became public, he convinced a colleague at the National Academy of Sciences to create an oversight committee that could delay papers on, say, highly fissionable plutonium until after the war. The committee later morphed into government committees which got absorbed into the Manhattan Project and then into the military. Once the government and the military took over oversight from the scientists, all research in fission was – and still is – classified. “Technically there is no need for ‘self-censorship’ in nuclear physics,” Wellerstein said, “because there is already simply censorship.”
So imitating physicists won’t help: the biology of viral mutations is medically useful and can’t be classified as fission can. As Steven Aftergood, secrecy expert at the Federation of American Scientists, says, “there is no civilian use for nuclear explosives.”
Wellerstein, on his beautifully researched and written blog, sums up: “Within the next four decades or so, every university bio lab in the country (and soon after, the world) is probably going to have the capability to make customized flus, plagues, and other nasties.” He sees no reason to believe that all the biologists at all the labs will self-censor their capabilities. And for now, the NSABB remains stymied on policy. “Bottom line,” Wellerstein writes. “We’re used to thinking of the bomb as the hard case. It’s actually the easy case. Everything gets a lot harder from here on out.”
Quotes, except where noted, are from emails to me.