Dangerous Minds: Controlling the Publication of Biological Research

James G. Vanzant*

I. Introduction

In 1977, the greatest scourge of humanity finally lay defeated. Smallpox—a disease that “has afflicted human beings with more death, suffering, fear, and revulsion, over a longer period of time, and with wider geographic coverage” than any other—was at last eradicated by a concerted, world-wide effort spearheaded by the World Health Organization.[1] No other disease was quite so contagious, infecting up to half of those exposed to it by cough, sneeze, or contact with bodily fluids.[2] No other disease was quite so deadly, killing up to thirty percent of those infected. Estimates of the number who died of the disease in the past hundred years alone run as high as 500 million people.[3] Even today, there is no cure for the disease once it is contracted, and only the administration of an uncertain and costly vaccine that carries potentially life-threatening side effects can provide limited immunity.[4]

The World Health Organization’s ten-year effort, begun in 1967, relied on an extensive campaign of surveillance, containment, and vaccination and eventually resulted in the complete eradication of smallpox in the natural world.[5] Only two stores of the virus now exist, both under tight security.[6]

A smallpox outbreak today would be catastrophic—there are few, if any, individuals who possess a natural immunity from previous exposure to the virus and vaccination programs have largely ceased since 1977.[7] While there is a remote possibility of an accidental outbreak from the two laboratories that maintain the virus, the true danger is a deliberate attack by a terrorist group. The 2001 anthrax attacks in Washington, D.C. and the resulting social, political, and economic disruption demonstrate the effects of even a relatively limited biological attack.[8] A smallpox attack would be orders of magnitude worse. Experts consider smallpox a terrorist’s “ultimate weapon” and an “ideal tool for deviant non-state actors,” and terrorist groups have expressed an interest in obtaining and weaponizing the disease.[9] But if they cannot obtain smallpox from heavily guarded research facilities, where might a terrorist group get it?

The recent proliferation of biological engineering knowledge is the most likely pathway for a terrorist group to obtain, or even create, a sample of the smallpox virus. In 2005, researchers at the Centers for Disease Control (CDC) successfully re-created the spectacularly deadly Spanish Influenza, a disease that killed upwards of fifty million people during a pandemic outbreak in 1918.[10] The CDC researchers sequenced the virus’ genome, identifying the precise genetic characteristics that made it so deadly.[11] The results of the research were made publicly available in Science and Nature, two leading academic journals.[12] The CDC noted that publication was done in consultation with the National Science Advisory Board for Biodefense (NSABB) and was necessary for the improvement of “diagnostic tests, treatments, and preventative measures.”[13]

Publication of the Spanish Influenza genome raises a crucial question: could terrorist groups use the published genomes of dangerous viruses to build a virus from scratch? The answer is almost certainly yes, and it is only a matter of time. The methods of artificially replicating living organisms are widely known and easy to learn; basic courses in genetics are now common requirements of undergraduate biology programs. The Massachusetts Institute of Technology even sponsors an annual competition that challenges undergraduate students to build living organisms from kits of standardized DNA.[14] The cost of sequencing and combining DNA is falling rapidly; “[t]he cost of sequencing DNA has fallen from about $1 per base pair in the mid-1990s to a tenth of a cent today, and the cost of synthesizing the molecule has also fallen.”[15] Even the equipment needed to perform genetic manipulation is available off the shelf or can be constructed from readily available materials.[16]

The ominous result is that any individual who possesses some basic knowledge and has access to basic equipment can potentially re-create a virus, provided that the genetic information is available. This is a small concern in the context of diseases like the Spanish Influenza virus because it is susceptible to common flu treatments.[17] However, if a terrorist group acquired the genetic information for smallpox, the consequences would be disastrous.

One potential method of preventing a terrorist group from acquiring the smallpox virus is to allow the government to prevent outright the publication of dangerous information that has the potential for abuse, such as the smallpox genome. Prior restraints of the press are presumptively barred under the U.S. Constitution, but there is precedent for enjoining publication of highly dangerous information that threatens national security. In the classic case of United States v. Progressive, Inc., which addressed the so-called born secret doctrine—the concept that some information is so dangerous that it must be kept secret from its inception—the court observed that national security and academic freedom sometimes intersect, and one must give way to the other. “[I]t is possible for example, that a technology such as recombinant DNA could someday surface means of destruction that ought not be published, while at the same time, provoking crucial issues of public policy that badly need to be discussed.”[18] That day has come.

Should publication of potentially dangerous biological research be restricted, and if so, how should this be accomplished? This Article examines two models of partial restriction on publication, one based on self-governance by scientists and professional organizations, and the other modeled on the Invention Secrecy Act, which governs patents for inventions with national security implications. Each system deals in a different way with the tensions between national security and dissemination of information. On one side are the speech rights of scientists, academic freedom, scientific innovation, and the spread of knowledge. On the other is the obligation of the government to protect the nation from threats to its safety. The needs of both sides intersect at the point where scientific information can be used to threaten the health and welfare of a nation.

II. Voluntary Review

In 2004, the National Academy of Sciences (NAS)[19] published a report that confronted the potential of “dual use” research techniques that could be used as the basis of a bio-terror attack.[20] NAS recognized the potential for the misuse of life science research, noting “the capacity for advanced biological research activities to cause disruption or harm, potentially on a catastrophic scale.”[21] NAS noted that the danger comes in particular from “(1) the risk that dangerous agents that are the subject of research will be stolen or diverted for malevolent purposes; and (2) the risk that research results, knowledge, or techniques could facilitate the creation of ‘novel’ pathogens with unique properties or create entirely new classes of threat agents.”[22]

NAS proposed a system of prepublication review of experiments and their results, with the goal of “provid[ing] reassurance that advances in biotechnology with potential applications for bioterrorism or biological weapons development receive responsible oversight.”[23] The proposed system is based on seven recommendations by the review committee: (1) education of scientists “about the nature of the dual use dilemma in biotechnology and their responsibilities to mitigate its risks”; (2) implementation of a review system for various classes of experiments with the potential for misuse; (3) voluntary self-review of publications by scientists and journals for “potential national security risks”; (4) creation of a National Science Advisory Board for Biodefense (NSABB) to oversee the review system; (5) the use of only the current statutory framework for the regulation of biological materials and research personnel; (6) “new channels of sustained communication” between the “life sciences community” and national security policy makers; and (7) “harmonized international oversight” in the form of an International Forum on Biosecurity.[24]

The bottom line of the proposed system is self-governance, with little to no interference by the government in publication decisions. When discussing the possibility of prepublication review, the committee categorically rejected government review, instead recommending reliance on “self-governance by scientists and scientific journals to review publications for their potential national security risks.”[25] This type of system would lack any centralized reviewing authority, essentially making individual scientists and editorial boards responsible for the review of their publications. This reaction is understandable given the antipathy shown by the scientific community towards any attempt to regulate scientific publication. The NAS report concisely summarizes the sentiment of scientists:

Proposals to limit publication have caused great concern and controversy among both scientists and publishers. The norm of open communication is one of the most powerful in science. To limit the information available in the methods section of journal articles would violate the norm that all experimental results should be open to challenge by others.[26]

The benefits of a system of voluntary review are twofold. First, it would certainly be constitutional because there would be no government action infringing upon the rights of scientists to publish the results of their experiments. The role of the government would be limited to an advisory capacity through the NSABB, and the government would lack the power to prevent publication. The NSABB’s role would be limited to promulgating voluntary, unenforceable standards for best practices in the scientific publishing world. Professional organizations and individual scientists would bear the burden of reviewing their own work for national security implications, and presumably would be under only a moral obligation to redact potentially harmful information from their published materials. Professional sanctions could also be imposed against scientists who breached the professional obligations of review without implicating First Amendment freedoms, because there would be no government imprimatur on the sanction or publication denial.

Second, the scientists themselves would be involved in developing and implementing professional standards of review for dangerous publications. This would presumably encourage scientists to comply with the review process because they would see the system as part of their obligations as a member of their professional organization. This type of compliance has been seen before, notably in the long-standing agreement by the American Council on Education to submit cryptography research for prepublication review.[27] Although the academic cryptography research community is significantly smaller than the broader life sciences community, the success of this type of self-regulation is an encouraging sign. Yet NAS specifically rejected applying this model to biosciences publications, arguing that the significantly higher number of papers in the field makes prepublication review impractical.[28]

Still, there is a significant downside to a self-governing system: the lack of an enforceable compliance mechanism. Self-governance is appealing to scientists because it precludes government interference in publication decisions, but this must be balanced against the strong national security interest in preventing the use of bioscience research in a malicious attack. This is not a negligible possibility, and NAS itself recognizes the danger of allowing such methods to be published: “not to [limit publication of methods] is potentially to provide important information to biowarfare programs in other countries or to terrorist groups.”[29] The quandary of the “publisher’s veto”—the power that an author or publisher has to preempt or moot an attempted prior restraint by simply publishing—is a pressing concern in an age in which publication is as simple as a blog posting on the Internet. Without some system of regulation, backed by enforceable criminal or civil penalties, there is a significant risk that a publisher who simply disagrees with a restriction recommendation will publish highly dangerous material. Once the material is published, there would be little if anything that could be done under the system proposed by NAS.

Moreover, it is the government itself that is likely the most informed about precisely which research areas are most vulnerable to misuse, as it has extensive access to intelligence and information not widely available to the academic world. While there is no room for simply allowing the government to bar publication without explanation, neither can the government’s unique access to information and threat analyses be lightly discarded. A middle ground must be reached that acknowledges the interests of the scientific community by incorporating a bias toward publication, yet acknowledges the interests and expertise of the national security community by allowing for prepublication review.

The NAS self-government proposal leans too heavily in favor of an absolute right to publication. A completely self-regulating system that is not backed by some legal method of preventing publication of harmful material is insufficient to meet the threat posed by biological research.

III. The Invention Secrecy Act

In contrast to the relatively unregulated free-for-all of academic publication, regulation of intellectual property under the U.S. patent system incorporates an effective national security vetting process. Under the Invention Secrecy Act of 1951,[30] patent applications that impact national security may be subject to a “secrecy order” that prevents disclosure of the invention or information related to it while the order is in force. The Act balances the needs of inventors and national security by incorporating a review and appeal process for patents that are subject to a secrecy order.

In general, when an invention receives patent protection under the laws of the United States, the inventor is required to disclose publicly the manufacturing process used to make the invention.[31] While most inventions are reviewed by the Patent Office only for their qualifications as patentable inventions, the Invention Secrecy Act allows the Patent Office to withhold granting a patent for a limited time for an otherwise qualified invention by issuing a secrecy order, thus precluding public access to the design of the invention.[32] The secrecy order remains in effect for up to a year, although this period is renewable indefinitely.[33]

There are three types of secrecy orders, depending on the level of secrecy required. A “Secrecy Order and Permit for Filing in Certain Countries” allows for the filing of additional patent paperwork in friendly foreign countries but otherwise prohibits disclosure of the invention.[34] The next level of restriction is a “Secrecy Order and Permit for Disclosing Classified Information,” which treats the invention as if it were classified and only allows for dissemination of the underlying information to those with the appropriate security clearances.[35] The most restrictive order is simply a “Secrecy Order,” which precludes any disclosures without prior approval from the Patent Office.[36] Failure to comply with a secrecy order after notification of its issuance by the Patent Office carries hefty penalties—denial of the patent application at a minimum, or a maximum of two years’ incarceration and a $10,000 fine.[37]

The Invention Secrecy Act’s secrecy order regime may be a useful model for an effective control scheme for biological research. A statute could easily be constructed along the lines of the Act that would require the submission of all proposed publications to a board or agency that would review the work and be empowered to issue secrecy orders of varying severity depending on the content of the proposed publication.[38] The incorporation of criminal penalties for non-compliance with the secrecy orders could also track the provisions in the Invention Secrecy Act.

A statute similar to the Invention Secrecy Act has some advantages over a system of self-governance. First, establishing a legal duty to submit proposed biotechnology research articles for prepublication review ensures that the review agency has the “teeth” to compel review, which is a significant deficiency in a self-governance system. Without a legally enforceable mandate to ensure review, there is no guarantee that biotechnology researchers or journals will comply with reviewing standards and remove sensitive data or methods from their publications. From the perspective of the national security community, one of the principal deficiencies of a voluntary review system is that it is voluntary. When dealing with threats to national security, noncompliance is a serious threat to the integrity of the system, particularly when one is trying to prevent a potentially catastrophic terrorist attack.

Second, limiting the length of time that a secrecy order could remain in effect addresses scientists’ concerns that government interference with publication would preclude their research from ever being published. In the “publish or perish” world of academia and the highly competitive pharmaceuticals market, this is a serious concern. By limiting the period of a secrecy order to a year (or less), the national security community will have sufficient time to review the potential impact of the research and develop any necessary countermeasures or preparedness plans. While admittedly the researchers will pay a price for the delay in publication, this may be a necessary sacrifice to ensure that the national security apparatus is apprised of and prepared for the impact of such research. Any unnecessary delay, however, implicates significant First Amendment concerns. In particular, a system that allows for an indefinite restraint on publication runs a serious risk of running afoul of the First Amendment’s prohibition on prior restraints of publication.

Third, the ability to halt publication will force the national security and biotechnology communities to come together and candidly discuss the implications of potentially dangerous information. A strongly worded statute that allows for the issuance of a secrecy order in narrow, clearly specified situations, and that incorporates a bias toward publication and a strong, open review process, would go a long way toward ensuring that information that is a threat to national security remains under wraps and any information that is scientifically necessary is published in a timely manner.

Despite the positive aspects of a biotechnology review statute modeled on the Invention Secrecy Act, there are significant problems as well. First and foremost, it is far from clear that the federal government has the constitutional power to enact a statute mandating prepublication review of all biotechnology-related materials. The Invention Secrecy Act is grounded in the undeniable power of Congress, clearly articulated in the Constitution, “[t]o promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.”[39] This enumerated power is the basis of modern copyright and patent law, and Congress’ authority to regulate when and how patents are issued undergirds the secrecy orders of the Invention Secrecy Act. Fundamentally, no one is required to apply for a patent; an inventor is free to publicize an invention before the patent process is initiated. But if inventors want to gain the substantial benefits of patent protection, then they must play by the rules set down by Congress.

In contrast, there is no clear constitutional power that would allow Congress to require bioscientists to submit their publications for review to a federal agency. The government does not control private academic journals, nor does the government grant any benefits similar to patent protection to bioscientists who publish. Interference with the publication decisions of private academic journals and scientists is far outside of the scope of congressional power to regulate patents and copyrights, and it is therefore unclear whether a statute that restricted the publication of biotechnology research could be constitutionally enacted at all. Even if it were within the power of the federal government, such a statute would implicate significant free speech concerns and thus be subject to strict scrutiny. While it is possible that a statute could survive strict scrutiny because of the limited nature of the secrecy orders, that outcome is far from clear. At the very least, it is unlikely that indefinite secrecy orders would be upheld because such an order would amount to a prior restraint, which is presumptively unconstitutional.[40]

A second and related concern is whether such a statute would ensure compliance with the review process. As already noted, the “publisher’s veto” is a compelling weapon against censorship in the information age; a scientist who disagrees with the publication restrictions could simply publish anonymously on the Internet. While this is somewhat unlikely given the professional need of academics to have their work attributed to them, it is a potential issue given that the national security objective of secrecy would necessitate one hundred percent compliance. The threat of criminal sanctions for noncompliance may diminish the danger, but the danger of unreviewed publication remains. After-the-fact criminal sanctions that may or may not deter future violations will not prevent dissemination of information once it has been published. It is not clear why additional criminal sanctions are a better deterrent than other alternatives.

Third, the rapidly evolving nature of biotechnology raises two additional problems: volume and scope. In its 2004 report, the NAS rejected the model of prepublication review because of the vast number of articles, researchers, journals, and individuals that comprise the life sciences community at large.[41] The sheer volume of publications that would need to be reviewed by a small federal agency would result in either insufficient attention given to each publication or long publication delays. The first result would compromise the national security objectives of review, and the second would adversely impact the spread of legitimate, nonthreatening scientific information. Both results therefore undermine the policy objectives of a review process and counsel against formal review.

Moreover, the question of the scope of the statute is also implicated by the vast array of biotechnology disciplines and applications, more of which are discovered each day. A narrowly defined statute that requires review of only a limited subject matter—for example, genetic modification of certain viruses—would overlook other potentially dangerous areas of bioscience that remain undiscovered or do not have a known malicious use. Yet a broadly drawn statute risks unnecessarily requiring review of harmless research areas and thus needlessly incurring additional administrative and scientific costs.

Finally, there is the classic problem of restricting speech: the chilling effect such a restriction would have on scientific innovation. Sales of industrial biotechnology products alone generated about $140 billion in revenue in 2007,[42] and the decision of whether to invest in promising biotechnology is a crucial one in the business world. Leaders of modern pharmaceutical companies assert that biotechnology is the key to “‘a brand new revolution’ in personalized medicine,” but the “fundamental question is whether it is still worthwhile to invest in pharmaceutical science.”[43] Imposing new restrictions and long waiting periods for review on the publication of advances in biotechnology would severely hamper the spread of new innovations in the field. While this may not directly impact pharmaceutical giants with in-house research labs that do not publish, a lengthy review process would increase the time between the discovery of an innovation and its exploitation or use by other researchers in the field. Moreover, the possibility of criminal penalties for failing to comply with the review process may deter young and highly skilled scientists from entering the field in the first place, creating a long-term brain drain out of biosciences. In a rapidly evolving field that has the potential to revolutionize medicine, agriculture, and a plethora of other vital areas, the costs of such a system may greatly outweigh the benefits. This is particularly so when one considers the fact that a catastrophic biological attack has never occurred and the risk of one is perceived to be low.[44]

Although the Invention Secrecy Act is an excellent and effective system for limiting the publication of dangerous patents, it appears to be an economically inefficient model for restricting biotechnology publications because its costs seem to outweigh its potential benefits. Even aside from the general constitutional concerns, administrative and opportunity costs may be prohibitively high.

IV. Conclusion

In the end, what can be done about limiting the spread of information such as the smallpox genome that could potentially be used to launch a devastating terrorist attack? A limited compulsory review process faces significant challenges, not the least of which are constitutional objections, the hostility of scientists and professional organizations, and high social costs.

Preventing terrorist access to dangerous research by limiting publication can possibly be done, but perhaps it should not be. The lack of adverse results from the 2005 publication of the Spanish Influenza virus indicates that concerns about this aspect of bioterrorism may be misplaced, and the institution of the NSABB and the awareness of the risks within professional organizations may limit the danger of improper usage of biological research. Moreover, the spread of low-cost research materials and equipment has led to the rise of a large number of amateur bioscientists, creating a situation not unlike the state of the computer industry in its early days.[45] At the time, “there [was] no shortage of fools and criminals” who abused the new technology to cause harm. But over time the industry self-regulated to a remarkable extent and the long-term benefits of the technology are clear. Had the danger of misuse of nascent computer technology prompted widespread regulation, it is unlikely that the industry would have progressed as quickly as it did.

An additional concern is that prohibitions on publication may “have much the same effect as laws banning gun ownership—ordinary citizens will be discouraged, while criminals will still find what they want on black markets.”[46] This is not the result that should be desired, particularly in a field as promising as biotechnology. “The quandary we face is that we need the garage hackers, because that’s where the innovation is.”[47] The computer industry would certainly not be where it is today if computer technology were restricted to only approved government scientists; many highly successful companies such as Google and Hewlett-Packard had their beginnings in the garages of hobbyists.[48]

As a policy tool, controlling the publication of biosciences materials through a mandatory review process, backed by criminal sanctions, is too heavy-handed to be a desirable choice. The costs are simply too high when weighed against the unclear threat of biological terrorism and the uncertain protection against it that publication bans would engender. In conjunction with professional voluntary review systems, other policy tools, such as law enforcement and intelligence resources that directly target those individuals and groups who pose threats, are the appropriate response to the threat of bioterrorism. The restriction of legitimate scientific information is not.


*J.D. Candidate, 2010, DePaul University College of Law.

[1] David A. Koplow, That Wonderful Year: Smallpox, Genetic Engineering, and Bio-Terrorism, 62 Md. L. Rev. 417, 418–23 (2003).

[2] Id. at 426.

[3] Id. at 423.

[4] Id. at 429-31.

[5] Id. at 435-36.

[6] Id. at 438 n.139. The two locations are the Centers for Disease Control in Atlanta, Georgia and the Ivanovsky Institute for Viral Preparations in Moscow.

[7] Id. at 445.

[8] See id. at 468-69.

[9] Id. at 469-70.

[10] Press Release, Centers for Disease Control, Researchers Reconstruct 1918 Pandemic Influenza Virus; Effort Designed to Advance Preparedness (Oct. 5, 2005), available at http://www.cdc.gov/media/pressrel/r051005.htm.

[11] Id.

[12] Id.

[13] Id.

[14] Hacking Goes Squishy, Economist, Sept. 3, 2009, at 30.

[15] Id.

[16] Id. As part of the well-known International Genetically Engineered Machine (iGEM) competition, undergraduate teams can “spend a summer building an organism from a ‘kit’” of “standardized chunks of DNA known as BioBricks.” Id.; see also The BioBricks Foundation, http://bbf.openwetware.org (last visited Apr. 5, 2010) (providing an overview of the BioBrick concept and links to further information). The tools needed to manipulate DNA and create organisms are surprisingly accessible and are not prohibitively costly. New England BioLabs, a provider of biotechnology materials, sells a basic “BioBricks Assembly Kit” for only $235. See New England BioLabs Inc., http://www.neb.com/nebecomm/products/productE0546.asp (last visited Mar. 25, 2010). The equipment needed to perform DNA recombination and analysis is also relatively simple to construct or purchase off the shelf. See Hacking Goes Squishy, supra note 14 (noting that a gel-electrophoresis box, “a basic tool used in a wide range of experiments,” is in essence merely a “few panes of colored plastic over a heating element”).

[17] See Press Release, Centers for Disease Control, supra note 10.

[18] United States v. Progressive, Inc., 467 F. Supp. 990, 997 (W.D. Wis. 1979).

[19] The National Academy of Sciences is a semi-public advisory body that is tasked with providing “advice on the scientific and technological issues that frequently pervade policy decisions.” National Academy of Sciences, About the NAS, http://www.nasonline.org/site/PageServer?pagename=ABOUT_main_page (last visited Mar. 25, 2010).

[20] See Nat’l Research Council, Biotechnology Research in an Age of Terrorism (2004), available at http://www.nap.edu/catalog/10827.html. The term “dual use” refers to technology that can be used for either benign or malignant purposes. One recent example is the investigation into the shipment of industrial turbo compressors to Iran by a German company, which could be used for Iran’s missile program. See Judy Dempsey, In Response to Iran’s Nuclear Program, German Firms Are Slowly Pulling Out, N.Y. Times, Feb. 3, 2010, at B6.

[21] Nat’l Research Council, supra note 20, at 1.

[22] Id.

[23] Id. at 3.

[24] Id. at 4-12.

[25] Id. at 7.

[26] Id. at 8.

[27] See Brian J. Gorman, Balancing National Security and Open Science: A Proposal for Due Process Vetting, 7 Yale J.L. & Tech. 59 (2005). Unlike many other scientific disciplines, cryptography—the study of codes and ciphers—is intimately related to protection of secrets and national security. Indeed, the whole reason for the existence of the National Security Agency (NSA) is to protect U.S. communications and to decipher foreign communications. See National Security Agency, About NSA, http://www.nsa.gov/about/index.shtml (last visited Mar. 25, 2010).

[28] Nat’l Research Council, supra note 20, at 85.

[29] Id. at 25.

[30] 35 U.S.C. §§ 181-88 (2008). For an overview of the Invention Secrecy Act and its possible application to patentable biotechnology, see generally James W. Parrett, Jr., A Proactive Solution to the Inherent Dangers of Biotechnology: Using the Invention Secrecy Act to Restrict Disclosure of Threatening Biotechnology Patents, 26 Wm. & Mary Envtl. L. & Pol’y Rev. 145 (2001).

[31] See Parrett, supra note 30, at 163-64.

[32] Id. at 167 (citing 35 U.S.C. §§ 181-88). While secrecy orders are issued by the Patent Office, government agencies are responsible for monitoring patent applications and requesting secrecy orders for those patents that implicate national security. See Manual of Patent Examining Procedure § 115 (8th ed. Aug. 2001) (last revised July 2008), available at http://www.uspto.gov/web/offices/pac/mpep/index.htm (last visited Mar. 25, 2010). After secrecy orders have been imposed, specialized Technology Center Working Groups examine the patent application. See id. § 130.

[33] See Parrett, supra note 30, at 170.

[34] Id. at 169.

[35] Id. at 169-70.

[36] Id. at 170.

[37] See 35 U.S.C. § 186, cited in Parrett, supra note 30, at 170.

[38] One option is to use the NSABB as the body responsible for review and issuance of secrecy orders. This would have the advantage of avoiding the costs, in money and institutional knowledge, of starting a new agency from scratch.

[39] U.S. Const. art I, § 8, cl. 8.

[40] This argument is developed further in James G. Vanzant, The Born Secret Doctrine: National Security, Prior Restraints, and Scientific Publication (Mar. 2010) (unpublished manuscript on file with author).

[41] Nat’l Research Council, supra note 20, at 85.

[42] Third Time Lucky, Economist, June 4, 2009.

[43] Back to the Lab, Economist, Dec. 10, 2009.

[44] This is especially so when compared to the much higher likelihood of a low-grade, low-casualty attack like the 2001 anthrax attacks. In contrast to smallpox—which is highly contagious, difficult to create, obtain, and disseminate, and nearly impossible to control once delivered—agents such as anthrax are easily available and readily targeted. Consequently, while a smallpox attack is “a potential contingency . . . the probability of occurrence is very low.” Michael J. Powers & Jonathan Ban, Bridge, Mar. 2002, at 29, 30, available at http://www.nae.diamax.com/File.aspx?id=7345 (last visited Mar. 30, 2010).

[45] See Jon Mooallem, Do-It-Yourself Genetic Engineering, N.Y. Times Mag., Feb. 14, 2010, at 40; Hacking Goes Squishy, supra note 14.

[46] Hacking Goes Squishy, supra note 14.

[47] Id. (quoting Dr. Rob Carlson, founder of Biodesic).

[48] Id.
