Abstract
The SynBioSecurity argument says that synthetic biology introduces new risks of intentional misuse of synthetic pathogens and that, therefore, extra regulation and oversight are needed. This paper analyses the argument, sets forth a new version of it, and identifies three developments that raise biosecurity risks relative to the earlier situation: (1) a spread of the required know-how, (2) improved availability of the techniques, instruments and biological parts, and (3) new technical possibilities such as “resurrecting” disappeared pathogens. It is first shown that the general SynBioSecurity argument needs to be qualified and that many improvements to biosecurity have already been implemented, most notably in the United States. Second, I suggest a new strain of the argument: the fact that most branches of synthetic biology fall under gene technology regulation in the European Union, together with the fact that this regulation in its current form does not adequately address SynBioSecurity risks, provides a weighty reason to review and possibly refine the legislation as well as the supervisory practices. Ethically speaking, the rise in the relative risk of bioterrorism brings new extrinsic issues to the fore.
Three Theses
Synthetic biology (henceforth, SynBio) refers to a fast-developing multidisciplinary field in which engineering-based modelling and building are applied to biology. The European Union (EU) does not have specific SynBio regulation, but many laws and guidelines also concern the research and commercial use of SynBio. Notably, most branches of SynBio fall under the gene technology legislation. Its most central directives are Directive 2009/41/EC of the European Parliament and of the Council of 6 May 2009 on the contained use of genetically modified micro-organisms and Directive 2001/18/EC of the European Parliament and of the Council of 12 March 2001 on the deliberate release into the environment of genetically modified organisms and repealing Council Directive 90/220/EEC.
The SynBioSecurity argument, simply put, says that SynBio introduces new risks of the design, construction and use of synthetic pathogens for malicious purposes; therefore, there is a need for extra regulation and oversight. In what follows, I will consider this general argument in the form in which it has typically been presented in the relevant literature,Footnote 1 and suggest a new version of the argument, targeted specifically at the European context. This paper puts forward three main propositions. First, three developments related to SynBio and genome editing raise biosecurity risks relative to the earlier situation: (1) a spread of the required know-how, (2) improved availability of the techniques, instruments and biological parts, and (3) new technical possibilities such as “resurrecting” disappeared pathogens.
Second, most branches of SynBio fall under gene technology regulation in the EU, and this regulation in its current form does not adequately address SynBioSecurity risks. This situation provides a weighty reason to review and possibly refine the legislation as well as the supervisory practices. Notwithstanding, a recent extensive review of SynBio and the related possible regulatory gaps, which resulted in three opinions by the EU’s three non-food Scientific Committees (the Scientific Committee on Emerging and Newly Identified Health Risks, the Scientific Committee on Health and Environmental Risks, and the Scientific Committee on Consumer Safety), did not address biosecurity. Instead, the review focused on SynBioSafety, that is, the avoidance of possible unintentional harms (Scientific Committees 2014, 2015a, b).
Third, while ethical questions highly similar to those raised by SynBio have been extensively discussed before, the rise in the relative risk of bioterrorism calls for biosecurity considerations that are new. The pressing extrinsic issue is how to assess and manage situations in which there are possible but difficult-to-quantify harms and possible actions by rogue individuals or groups that are difficult to supervise.
I will begin with remarks on the demarcation of SynBio, its main branches and its potential (applications), after which I will briefly map out intrinsic and extrinsic concerns in this area. Following this I will analyse the SynBioSecurity argument and draw some comparisons to traditional genetic engineering and especially to genome editing.
Background
Demarcation and Potential of Synthetic Biology
There is no single generally agreed definition of SynBio, but a plethora of definitions has been formulated, in part reflecting the multidisciplinary nature of the field. Specifically, SynBio combines molecular biology, genetics, chemistry, physics, computation/information technology (IT) and engineering. The three Scientific Committees’ Opinion on Synthetic Biology I: Definition (2014) surveys 35 definitions. These definitions typically involve two aspects. The first is redesigning natural living systems to fulfil specific purposes, for example, to produce drugs (e.g. artemisinic acid, a precursor for the anti-malarial medicine artemisinin) or biofuel (isobutanol) in yeast, algae, or bacteria. Microbes are modified and to some extent constructed to function as living chemical factories. The second aspect is constructing new kinds of living (and xenobiological) systems and their parts, such as alternatives to the natural nucleic acids. These are not only unprecedented in nature, but take life back to its basics and also to its limits.
From the outset it is important to note that SynBio overlaps with both traditional genetic engineering (in which Agrobacterium tumefaciens-mediated transfer and the gene gun are being used) and genome editing techniques, such as CRISPR-Cas9 (Clustered Regularly Interspaced Short Palindromic Repeats), ODM (Oligonucleotide Directed MutagenesisFootnote 2), TALEN (Transcription Activator-like Effector Nucleases), and ZFN (Zinc Finger Nucleases) (for the techniques, see e.g. Lusser et al. 2012; Hsu et al. 2014). This overlap is encapsulated in the conclusion the Scientific Committees state in their report, entitled Opinion on Synthetic Biology II: Risk Assessment Methodologies and Safety Aspects: “it is difficult to accurately define the relationship between genetic modification and SynBio on the basis of quantifiable and currently measurable inclusion and exclusion criteria” (Scientific Committees 2015, 64).
SynBio involves a wide spectrum of research activities and projects slightly differently grouped by different authors. The Scientific Committees identify six branches of SynBio. They are:
1. Genetic part libraries and methods (where the first refers to genes or fragments of DNA with well-characterised properties and functions).
2. Minimal cells (containing only the genes without which a cell cannot survive even in ideal conditions) and designer chassis.
3. Protocells and artificial cells (where the first denotes non-living, self-organised constructs able to replicate, which may help us better understand the origin of life).
4. Xenobiology (constructing non-canonical forms of biochemistry and new genetic codes, such as XNA [xeno nucleic acid, in which a non-ATGCU nucleotide is used]).
5. DNA synthesis and genome editing (the latter equals the new techniques).
6. Citizen science (Do-It-Yourself biology [DIYbio], often also called biohacking) (Scientific Committees 2015; additions in brackets this author’s).
In their paper “A Brief History of Synthetic Biology”, Cameron et al. (2014) first discuss the origins of the field between 1961 and 1999 and then proceed to identify three distinct periods or phases of SynBio: (I) the foundational years 2000–2003, (II) expansion and growing pains 2004–2007 (see also Kwok 2010), and (III) increase in pace and scale 2008–2013 during which several development steps or breakthroughs took place. SynBio is considered to hold substantial promise for a number of practical applications in a variety of fields such as biotechnology, medicine, energy production, industrial chemistry, material technology and bioremediation (see e.g. Church et al. 2014; see also Scientific Committees 2015, 13–14).
This fast development and its promises have been accompanied by an emphasis on the welter of ethical concerns SynBio raises, which the expert community has been proactive in addressing both in academic research, beginning with Cho et al. (1999), and in reports by various governmental and independent bioethics centres (e.g. Presidential Commission 2010; Parens et al. 2009; EGE 2009).
Ethical Arguments in the SynBio Debate
As in other fields of biotechnology, it has become customary to group the ethical concerns into two categories (for SynBio, see e.g. Garfinkle and Knowles 2014; for genetic engineering of plants and animals, see e.g. Bovenkerk 2012).Footnote 3 Intrinsic concerns embody the idea that research and the practical applications of SynBio are morally questionable because of some feature of (the use of) the technology in itself, irrespective of their consequences. Questions of this type include, for example, the following: Does constructing new life forms cross the (alleged) moral strictures of playing God, unnaturalness or human hubris (for analysis, see e.g. Lustig 2013; Heavey 2013)?
According to extrinsic concerns, research and the practical applications of SynBio are morally questionable because of their known, predicted or possible consequences. Does constructing new kinds of organisms and species change the way we perceive nature and ourselves? Or does it result in the misjudgement of the status of synthetic organisms? (Douglas and Savulescu 2013). These issues draw on the so-called slippery slope argument. Does the use of SynBio result in unjust distributions in society, for example, in the form of expensive treatments available only for the privileged few at the expense of the general health care of the many?
Extrinsic concerns also involve worries about possible harmful consequences to human health, animals and the environment (Smith 2013). These have, in fact, received the most attention. Here it has become standard to talk about the management of two kinds of risks. On the one hand, biosafety refers to principles, practices and specific actions to prevent possible unintended and unexpected consequences. Laboratory facility requirements and protection measures in relation to four classes (i.e. risk groups) of pathogenic microorganisms provide an example. On the other hand, biosecurity refers to principles, practices and specific actions to prevent the use of SynBio for malicious purposes.Footnote 4 These kinds of risks form a continuum ranging from mere bionuisance to bioterrorism and biological warfare.
The SynBioSecurity Argument
SynBioSafety
SynBioSafety is mainly concerned with lab safety and, in the future, also with the deliberate release of synthetic organisms into the environment. While the former for the most part relates to the research personnel, the object of the latter–and in severe accidents also the former–is the general public in the vicinity of the company and research sites (such as field trial locations), and the environment. Risks pertaining to deliberate release may follow, for example, from the interaction of synthetic organisms with nature and, in the case of reproductive organisms, from evolution.
In their report on risk assessment methodologies and safety aspects, the Scientific Committees (2015a) conclude that although the current gene technology regulation and oversight in the EU are otherwise comprehensive, bionanoscience (i.e. the study of nanoscale phenomena of biological or similar structures or materials) and protocell development remain outside their scope. Some remarks are, however, in order. First, it has been suggested that minor revisions to the current regulations are not enough. Markus Schmidt argues that SynBio challenges the current biosafety framework. In his words,
[t]his knowledge gap can be closed by applying adequate and up-to-date biosafety risk assessment tools, which–in their majority–have yet to be developed for the major subfields of synthetic biology (DNA-based biological circuits, minimal genomes, protocells and unnatural biochemical systems). Avoiding risk is one part, the other one should be to make biotechnology even safer. (Schmidt 2009, 81).
Second and more specifically regarding the SynBio risk assessment, a natural comparator is not always available, as part of SynBio is concerned with new kinds of biological systems and pre-life forms. In other words, it will be more difficult–or even impossible–to find natural comparators than it has been in regard to genetic engineering. (Scientific Committees 2015a).
Third, owing to the use of ever-better techniques to conduct genome editing and synthesis of DNA, the number of research and commercial projects involving genetic modifications or synthetic DNA will most probably increase dramatically. This challenges the case-by-case evaluation in the EU (see e.g. Scientific Committees 2015a). The current bureaucratic and time-consuming approval process may simply not function in the new situation. There currently (11/2016) also remains legal uncertainty about whether genome editing techniques fall under the gene technology regulation in the EU in the first place. The European Commission is expected to take a stance on this in the near future, but some national competent authorities (e.g. in Finland and Sweden) have already had to make decisions on this in the case of particular scientific research projects.
Fourth, generally speaking, (lab) accidents can happen and also do sometimes happen (see e.g. Kaiser 2015; Weiss et al. 2015; Cressey 2007). Legitimate research on pathogenic organisms imposes risks of inadvertent harm to the research personnel, the general public living in the vicinity of the labs, and the environment, including animal health. This is so despite comprehensive regulations and practices embodied in biological agents’ risk groups (I–IV) and lab safety standards (biosafety levels 1–4), including risk assessment, cleaning and waste treatment practices, compulsory notifications, and reports of accidents and dangerous situations.Footnote 5 Furthermore, members of the biohacking community may not always be familiar with the due biosafety procedures (see e.g. Ahteensuu and Blockus 2016).
SynBioSecurity
In regard to biosecurity, the regulatory framework of gene technology in the EU does not seem to guarantee a sufficient level of safety, at least not on its own and in its current form. I will next reconstruct and evaluate the SynBioSecurity argument. It is my intention to state the general argument in a form as convincing as possible in order both to avoid refuting a strawman and to reveal limits to the argument. This may be thought of as applying a principle of charity in interpretation (Table 1).
Assessment
PREMISE1. Premise1 is concerned with the possibility of bioterrorists, which could mean lone wolves, groups of people or state actors, constructing or otherwise getting hold of synthetic pathogens. The premise has been questioned in the literature (mainly regarding groups of people and lone wolves), but only partially, as I will argue below. It is true that most of the techniques of SynBio require substantial research resources (i.e. equipment and know-how) and tacit knowledge (Jefferson et al. 2014). Constructing synthetic pathogens outside institutionalised research laboratories is very difficult. Michele Garfinkle and Lori Knowles explain,
[s]pecialists in viral microbiology doubt whether it is as easy to synthesize a deadly virus as one might believe (Collett 2007). In order to synthesize an existing virus, its exact genetic sequence must be known, and to be functional, the sequence must be entirely correct. Some of the viral strains in laboratories are attenuated through spontaneous mutations, and may no longer be transmissible or pathogenic even if they were at the time they were sequenced (Baric 2007). Moreover, even if a correct sequence for a virus exists, it still requires significant expertise to construct a virus from synthesized DNA and then to express the virus so that it functions as a bioweapon (NSABB 2006). (Garfinkle and Knowles 2014, 536–537).
Although Garfinkle and Knowles’ paper is relatively new, their references are older. Genome editing technologies have developed at an impressive pace in recent years. CRISPR-Cas9 has, arguably, already revolutionised the field and was selected as the breakthrough of the year 2015 by the journal Science. The first use of CRISPR-Cas9 was reported only a few years earlier. In addition to the techniques becoming easier to use and ever more precise, statistics indicate that DNA sequencing, DNA synthesis and genome editing have become cheaper, the first of these even at an exponential rate. Their costs may still drop, although at a slower pace, in the near future (Oye 2012, esp. 3; see also Cameron et al. 2014).
Designing and constructing complex lethal pathogens is possible on the basis of current technological know-how. Within basic research conducted by the academic community, such studies have been carried out. Cello et al. (2002) report that they produced polio virus de novo in the laboratory. Tumpey et al. (2005), in their turn, reconstructed the 1918 influenza virus (also known as the Spanish flu), which killed an estimated 20–100 million people in 1918–20. To get an impression of the pace of development, Sissonen et al. (2012b) note that in 2002 it took a research group two years to construct the polio virus, but a few years later it took only two weeks to construct a slightly smaller bacteriophage (see also Kelle 2009a). There are other pathogens with substantially less complex genomes than the polio virus, which is some 7500 nucleotides long.
A relatively wide discussion arose over whether it was acceptable to publish the studies on the polio virus and Spanish flu, as they include specific information about the synthesis of these pathogens [for research ethical discussion, see e.g. Douglas and Savulescu 2010 and a reply to them by Pierce (2012)]. Many academic journals nowadays, in fact, pre-review submitted research manuscripts that are security-sensitive, but there are some difficulties with these review practices. Garfinkle and Knowles (2014, 537) mention the following:
it can be difficult to identify a priori which research findings entail dual-use risks (…) scientific freedom and access to information are crucial to technological innovation and (…) restricting publication would slow the development of medical countermeasures against biological threats.Footnote 6
Related to this, Oye (2012, 4) points to the fact that “sequenced genomes are available in the public domain on the internet through GenBank (USA), EMBL (Britain), and DDBJ (Japan), which share and exchange sequence information on a daily basis”.
In regard to new features, such as higher virulence, there have been unintended instances within the research community–again as part of academic research, not bioterrorism. An Australian research group (Jackson et al. 2001) accidentally increased the virulence of a mousepox virus, a close relative of smallpox, by adding the interleukin-4 (IL-4) gene to the virus. Imai et al. (2012), in their turn, report a study in which they modified the highly pathogenic H5 HA influenza virus to be transmissible between mammals (see also Herfst et al. 2012; for critical discussion, see Jefferson et al. 2014, 9–10).
On this basis, the possibility that Premise1 presents cannot be rejected.Footnote 7 It has typically been specified to refer to the near future (for example Kelle 2009a, esp. S23; see also Mukunda et al. 2009), but given the continuing development in recent years, it seems probable that the future in question has already arrived. It is worth noting that the examples mentioned above actually belong to the sphere of gene techniques, not SynBio. Moreover, the discussion has thus far centred on human health, leaving intentional harm to animal health, food crops and the environment unaddressed (although some scholars have admittedly mentioned them in passing).
PREMISE2. Are the risks higher than before or different from those of gene technology? Quantification of the biosecurity risks of SynBio seems difficult. The possible new features complicate assessing the magnitude of the possible damage. Eliciting probabilities is complicated by the fact that there is almost no frequency-based evidence available apart from the anthrax attacks in the aftermath of 9/11 (and two other confirmed uses of biological agents against humans in terrorist attacks) (Jefferson et al. 2014; see also Mukunda et al. 2009, 2–3). Analogical reasoning based on other technologies and regulatory contexts may not be reliable enough. The risks can differ from those of traditional biological weapons. Furthermore, the governance and especially the surveillance of the biosecurity risks of SynBio can be highly challenging. In the US, the Federal Bureau of Investigation (FBI) is active on this in regard to Do-It-Yourself biology (see e.g. Ahteensuu and Blockus 2016).
Bügl et al. (2007) argue that DNA synthesis challenges the safety framework of gene technology in the following two ways:
First, synthesis allows the physical decoupling of the design of engineered genetic material from the actual construction and resulting use of the material; DNA can be readily designed in one location, constructed in a second location and delivered to a third. Second, synthesis might provide an effective alternative route for those who would seek to obtain specific pathogens for the purpose of causing harm. Today such pathogens include the following: first, those for which the natural reservoirs remain unknown or that are otherwise difficult or dangerous to obtain from nature (e.g. Ebola virus); second, those that are physically under lock and key in a very small number of facilities (e.g. smallpox virus); and third, those that no longer exist in nature (e.g. 1918 influenza virus). (Bügl et al. 2007, 628; Italics added).
The physical decoupling presents a fundamental difference neither to traditional genetic engineering nor to genome editing. Research groups using all of these techniques participate in international collaboration, and materials and constructs move back and forth. Moreover, at the moment it still seems easier to misuse already existing pathogens–by stealing them or obtaining them from an outbreak in nature–than to construct them by means of SynBio.
Kelle (2009a) presents an argument from history. According to him, the fact that several breakthroughs in the biological sciences have been employed for military purposes in itself provides a sufficient reason to take SynBioSecurity seriously. As examples of breakthroughs that have found their way into the biological weapons programmes of different countries, Kelle mentions bacteriology, aerobiology, virology and genetic engineering (see also Parens et al. 2009, esp. 20–2). While agreeing that history in the life sciences and also more generally (see EEA 2001) provides a weighty reason for taking early precautions in the face of weak signals or indications of danger that have yet to be proven scientifically, it is far from obvious what this amounts to in practice, especially in regard to SynBioSecurity.
What brought biosecurity into the spotlight in 2006 was a Guardian journalist who managed to order fragments of the smallpox genome online, which were then delivered to his residential address (Randerson 2006). Indeed, there currently are a number of commercial companies that use DNA synthesisers and fulfil orders for constructed genetic material ranging from oligonucleotides to full genomes (see e.g. Garfinkle and Knowles 2014). One result of the media attention to the Guardian article and the ensuing debate is that this should no longer be possible. The biosecurity risks of SynBio are substantially lowered by the fact that both the research community and industry practise self-governance. In particular, the companies that provide DNA synthesis conduct background checks of the orderers and the ordered sequences by comparing them to sequence libraries of pathogenic substances, in order to prevent bioterrorism-related orders from being processed. If the orderer or the sequence does not pass the check, then the order will not be completed. Software is available to compare orders against lists of agents of concern. Guidelines have been established by governmental authorities and by consortia of the companies themselves.Footnote 8
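The sequence-screening step described above can be sketched in simplified form as follows. This is only an illustrative toy, not an actual screening tool: real screening compares orders against curated pathogen databases using sophisticated alignment methods, and the sequences, k-mer length and threshold below are entirely hypothetical.

```python
# Toy sketch of screening a DNA synthesis order against "sequences of
# concern". Real biosecurity screening uses curated pathogen databases
# and alignment software; everything here is illustrative only.

def kmers(seq, k=8):
    """Return the set of all length-k substrings of a DNA sequence."""
    seq = seq.upper()
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def screen_order(order_seq, sequences_of_concern, k=8, threshold=0.5):
    """Flag an order if it shares a high fraction of its k-mers with
    any sequence of concern. Returns (flagged, best_match_fraction)."""
    order_kmers = kmers(order_seq, k)
    if not order_kmers:
        return False, 0.0
    best = 0.0
    for concern in sequences_of_concern:
        overlap = len(order_kmers & kmers(concern, k)) / len(order_kmers)
        best = max(best, overlap)
    return best >= threshold, best

# Hypothetical database entry and two orders: one matching a sequence
# of concern, one benign.
concern_db = ["ATGGCGTACGTTAGCCGATAGGCTAACGTTAGC"]
flagged, frac = screen_order("ATGGCGTACGTTAGCCGATAGG", concern_db)
benign, _ = screen_order("TTTTTTTTTTTTTTTTTTTTTT", concern_db)
```

In this sketch the first order is flagged (all of its k-mers occur in the hypothetical sequence of concern) while the second passes; an actual screening pipeline would additionally verify the orderer's identity, as the text above notes.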
This, needless to say, only applies to responsible companies. In any case, the emergence of these companies means entirely new possibilities, at least for scientists, as emphasised by Garfinkle and Knowles (2014, 534): “[l]aboratory work that would take 6 months of full-time effort to combine pieces of DNA can now be essentially dispensed with by placing an order for the precise sequence required”. Moreover, the orderers need not be the end-users, as pointed out by Bügl et al. (2007). Some scholars have proposed more extensive and tighter measures of self-governance and other practices, typically a hybrid approach combining self-governance with governmental (or independent) oversight practices (see e.g. Bügl et al. 2007; Kelle 2009a, b). As explicated by Garfinkle and Knowles (2014, 537), “[o]ther proposals for governance have suggested that DNA synthesizers, especially oligonucleotide synthesizers, might be registered, or users could be required to have licences before they are allowed to buy the chemicals required to make DNA”.
Although researchers increasingly use these DNA synthesis companies, shortish segments of DNA can also be designed and constructed by researchers themselves with “desktop” oligonucleotide synthesisers (ibid., 534). Related to this, it is, in fact, relatively easy to establish a basic home lab in one’s garage or kitchen. Guidance for setting it up can be found on the Internet, and the standard lab equipment is available for purchase.
Biohackers have also developed creative workarounds (…) to replace standard laboratory equipment which is too expensive for personal use. These include, for example, a “self-made” microscope, a centrifuge, and a 37 degree Celsius incubator (…) The workarounds are often tens, even several hundred, times cheaper than the corresponding standard equipment and yet fulfill their purpose satisfactorily. (Ahteensuu and Blockus 2016, 20).
This in itself does not have anything to do with bioterrorism although biohacking has predominantly been framed as a biosafety and biosecurity issue–and unfortunately merely so (see ibid.; see also Jefferson et al. 2014).
CONCLUSIONRISK. The above considerations grant the modest conclusion that biosecurity risks are higher when compared to the previous situation in the field. To simplify, higher risk may follow from two things: from a rise in the probability of the event or from an increase in the severity of the event. Both seem to be the case here (even if the probability is difficult to estimate accurately).
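The decomposition of risk into probability and severity can be made concrete with a toy expected-harm calculation. All numbers below are hypothetical and serve only to show that the risk level rises when either component rises, and multiplicatively when both do, as argued above.

```python
# Toy illustration of risk as probability x severity (expected harm).
# All figures are hypothetical; the point is only the structure of the
# claim: a rise in either component raises the risk level.

def expected_harm(probability, severity):
    """Expected harm as the product of event probability and severity."""
    return probability * severity

baseline = expected_harm(0.001, 1000)   # hypothetical earlier situation
p_up     = expected_harm(0.002, 1000)   # spread of know-how/availability
s_up     = expected_harm(0.001, 2000)   # more severe engineered pathogen
both_up  = expected_harm(0.002, 2000)   # both developments together
```

Doubling either component doubles the expected harm relative to the baseline, and doubling both quadruples it, which is why the conclusion holds even when the probability itself cannot be estimated accurately.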
The risk-level rise emerges as the unintended side effect of three clusters of developments: (1) a spread of the required know-how, (2) improved availability of the techniques, instruments and biological parts, and (3) new technical possibilities (cf. Oye 2012). The spread of the required know-how results from the ever more common and wider use of the genome editing techniques among scientists and product developers as well as in educational events such as the iGEM (International Genetically Engineered Machine Foundation) competitions. In addition, there is the rapid growth of the DIYbio movement and the related community labs and hackerspaces. Having more and more people with these skills and knowledge is of course a good thing for society and for individuals, but at the same time it simply makes it more probable than before that they will include persons with intentions to seriously harm others.
The improved availability of techniques, (physical and computational) tools and biological parts results from the falling price of DNA sequencing and synthesis as well as the ease of genome editing techniques; the emergence of companies selling DNA and oligonucleotide synthesis; accessible publications that characterise the genomes of deadly and (possibly) pandemic pathogens; the establishment of genetic part libraries (such as the Registry of Standard Biological Parts); and the guidance for setting up a home laboratory and standard lab equipment available for purchase on the Internet, together with the available, cheaper workarounds that replace expensive lab equipment.
Third, the new technical possibilities include “resurrecting” disappeared pathogens, such as the Spanish Flu, and producing new kinds of pathogens with higher virulence and resistance to the known drugs. There may also be genuinely novel features that are unprecedented and unexpected. Mukunda et al. (2009, 20) call them wild card applications with consequences that are difficult, even impossible, to characterise or analyse beforehand.
While the biosecurity risks are higher than before, it is important to keep in mind that this is a statement about the relative, not absolute, risk level. Furthermore, several technical difficulties and logistical barriers substantially lower the risk. In their analysis of the SynBioSecurity risks, Catherine Jefferson, Filippa Lentzos and Claire Marris conclude that
any bioterrorism attack will most likely be one using a pathogen strain with less than optimal characteristics disseminated through crude delivery methods under imperfect conditions, and the potential casualties of such an attack are likely to be much lower than the mass casualty scenarios frequently portrayed (Jefferson et al. 2014, 12).
PREMISE3. Premise3 has sometimes been presented in a form that states that the research and commercial applications of SynBio should be prohibited because the risks are of a new kind, partly unknown or higher than before. For example, the Action Group on Erosion, Technology and Concentration (ETC Group) and Friends of the Earth, together with 109 other organisations, have called for a global moratorium on the environmental release of synthetic organisms and their commercial use (Pennisi 2012). Categorical prohibitions are, however, harder to defend successfully than Premise3 as it is presented here. Yet the EU employed such a prohibition in the case of gene technology. The so-called precautionary principle was used as a justifying reason for the de facto EU Council moratorium on the commercial approval of genetically engineered crops. Between late 1999 and 2004 no authorisations were given. (See e.g. Ahteensuu 2008, 13, see also the Original Publication IV). Premise3 is often an unstated background assumption (or inference), but it, or some modification of it, is needed if one wants the argument to be logically binding.
Furthermore, it is not reasonable to lump together different branches of SynBio and the techniques used in them. Alexander Kelle points out that
different subfields of synthetic biology have different kinds of security implications, which are already relevant or will become so at different points in time. Clearly the potential security implications of synthetic genomics–with its capacity to generate rapidly large DNA molecules–are of more immediate concern than those of some future minimal cell construct that could act as a chassis for nefarious applications even further down the line. (Kelle 2009a, S23).
In a report by the Presidential Commission for the Study of Bioethical Issues (2010, esp. 8, 123–127), entitled New Directions: The Ethics of Synthetic Biology and Emerging Technologies, it is suggested that responsible risk governance should be based on prudent vigilance instead of outright bans and the precautionary principle. The report, in fact, concluded that no new regulations were necessary at the time. What should be noted here, however, is that the US seems to be better prepared in regard to SynBioSecurity, and that the regulations also differ between the US and the EU in regard to gene technology. In the former, the precautionary principle is not applied, at least not explicitly and in the same way as in the EU. Directive 2001/18/EC, which is concerned with the deliberate release and placing of genetically modified organisms on the market, states in its General Obligations that
[m]ember States shall, in accordance with the precautionary principle, ensure that all appropriate measures are taken to avoid adverse effects on human health and the environment which might arise from the deliberate release or the placing on the market of GMOs (ibid., Article 4; see also CEC 2000).
Besides explicitly mentioning the precautionary principle several times, the directive arguably builds up a precautionary regulatory framework:
In particular, the precautionary nature of GMO risk governance is reflected by the fact that in environmental risk assessment (e.r.a.), not only direct and immediate but also indirect and delayed effects are considered (see Directive 2001/18/EC, Annex II[A]); by shifting the burden of proof onto potential risk imposers; by the commitment that environmental and human health issues take priority over economic benefits (or concerns); and by the requirement of case by case analysis. (Ahteensuu 2008, 12–13).
The scale of the introduction of GMOs into the environment is increased gradually, step by step (ibid.).
The precautionary principle has also been incorporated into the Treaty on European Union since 1992 as one of the basic principles upon which all EU environmental policy should be based (Article 130r[2] of the Treaty Establishing the European Community).
Lastly, the precautionary principle is referred to in the key objectives of the Cartagena Protocol on Biosafety to the Convention on Biological Diversity (CPB 2000), which regulates the transfer, handling and use of living modified organisms. The EU, but not the US, has ratified the protocol. Granting this, it arguably remains an open question as to what kinds of concrete measures follow from accepting the precautionary principle in the context of SynBioSecurity.
CONCLUSIONSGEN&EU. The link between the premises and the general conclusion appears sufficiently strong. That part of the argument, now presented in an informative form, can be put in a way that is valid (at least with minor, non-consequential wording modifications). This means that if one accepts the premises, then one has to accept the conclusion on pain of logical inconsistency. What, then, do the conclusions mean and what follows from them? Here my focus is mainly on the new strain of the argument.
First, even if the gene technology regulatory framework is not, in its current form, sufficient to guarantee SynBioSecurity alone, the research and commercial use of SynBio are regulated in many other ways, for example, by the United Nations' Biological Weapons Convention (BWC 1972) and by national laws such as the Finnish Act on Export Controls of Dual-Use Products (Footnote 9). It may thus be that additional regulation is not needed, or that changes are required in regulatory contexts other than that of gene technology. This depends on the agreements the EU and its member countries have ratified, as well as on specific national legislation and practices. For instance, in Finland, Sissonen et al. (2012a) reviewed the current biosecurity legislation and presented the following conclusion:
in regard to biosecurity, improvement is needed in many places, even in legislation, in order for Finland to be able to fulfil its international duties and to take charge of preventing the intentional use of biological agents. (Footnote 10)
Synthetic or edited pathogens can differ substantially from traditional biological weapons. On this basis it has been suggested that the agreements and specific legislative acts regarding the use, handling and transfer of biological agents may be insufficient to guarantee an adequate level of SynBioSecurity (e.g. Kelle 2009b; see also Sissonen et al. 2012b).
Second, higher or better biosecurity can result from various means and actions. It may turn out that (1) self-governance (or self-policing), which is already practised, is sufficient; certainly it is not the case that scientists and industry fail to take biosafety and biosecurity seriously. Other kinds of means are (2) international collaboration and agreements harmonising governance and increasing its transparency, (3) changes to the EU directives, regulations and national laws, and (4) biosecurity training and attempts to increase SynBioSecurity awareness in other informal ways. Kelle (2009a, b) reports that awareness of the international discussions and biosecurity guidelines among European synthetic biologists is relatively low. However, his data are from 2007 and improvements may have taken place since, although he himself reckons that they have only been incremental.
It does not automatically follow from the new version of the SynBioSecurity argument that regulations and oversight should be tightened, only that there seem to be weighty reasons to evaluate whether the chosen level of safety is achieved with the current measures. When one specifies the applications and techniques usable by bioterrorists, the issue, in part, reduces to genetic engineering and especially genome editing. It may therefore be disputable whether the risks of SynBio are higher or even different. This, however, raises the further issue of whether the biosecurity of gene technology and genome editing is at an appropriate level in the face of the recent developments in these fields.
Third, the distinction between biosafety and biosecurity comes to the fore. Even if SynBioSafety in its current form were sufficient, it would not automatically follow that SynBioSecurity is at an acceptable or even tolerable level. This is not mere conceptual fine-tuning but a relevant distinction, because biosafety and biosecurity measures are only partially overlapping and complementary. In other words, it is not possible to deal with biosecurity indirectly by having good biosafety practices in place. Kelle provides an example of this in regard to so-called safety mechanisms:
One such example is the idea of engineering biosafety mechanisms into synthetic organisms to make them depend on nutrients that are unavailable in nature. Yet, the principal problem with such a safety system is that someone with malicious intent could possibly short-circuit the fail-safe mechanism. (Kelle 2009a, S23–24).
The same could happen by spontaneous natural mutation.
More generally, the biosecurity risks reveal limits to the self-governance of the SynBio research community and industry and point to the need for an external supervisory authority, since self-governance at best reaches only the agents working within the community. Partially owing to this, many support a hybrid approach to SynBio governance. Considering the general argument from SynBioSecurity, many have reached similar conclusions, according to which biosecurity related to SynBio needs to be developed, although on the basis of slightly differing premises and inferences. This holds at the international (for instance IRGC 2010, esp. 40–41; Garfinkle and Knowles 2014; Bügl et al. 2007; Kelle 2009a, esp. S27), EU (EASAC 2011) and national (in regard to Finland, see Sissonen et al. 2012a) levels.
Much has already been achieved, although the emphasis has typically been on biosafety (Footnote 11) rather than on biosecurity. As mentioned, there has been an evaluation process regarding the regulation of SynBio in the EU, with the emphasis especially on reviewing whether the gene technology regulation and the current risk assessment and management practices are applicable to SynBio. The three opinion statements by the Scientific Committees (2014; 2015a; 2015b) did not address biosecurity. It seems that SynBioSecurity has been the object of discussion and reports to a greater extent in the US (e.g. NSABB 2006; 2010; for a review, see Oye 2012) than in the EU.
In the light of the recent terrorist attacks in cities in the heart of the EU, it is no longer tenable to hold that Europeans would not be targets, nor that there would not be people with intentions to harm others using the best means at their disposal. In other words, it is known that there are terrorists out there, and there is evidence of their intention to maximise the damage inflicted (cf. Jefferson et al. 2014, esp. 10–12). Mukunda et al. (2009, 3) note that “[t]here (…) exists a broad consensus that progress in biotechnology is likely to increase the danger from biological weapons, even as there exists a heated debate on the current level of threat they present”.
While the US seems to be better prepared for SynBioSecurity risks and is in this respect currently more precautionary than the EU, there may be challenges there as well. Oye (2012) mentions the following four longer-term concerns:
First, technological advances may render obsolete the current approach to screening DNA sequences by looking for elements of pathogens listed as Select Agents or in Australia Group Guidelines. (…) The nub of the problem is that DNA sequences that are derived from unlisted organisms or created de novo may pose risks but such sequences would not necessarily be detected as parts of listed organisms. Second, technological and economic changes may render the current approach to screening customers obsolete. With the rise of biofabs and intermediaries, the buyers of synthesized DNA will not necessarily be the ultimate users (…) Third, economic and political forces are likely to accelerate the international diffusion of synthesis technologies. (…) Iran and Pakistan appear to be constructing synthesis facilities within their borders. At the domestic level, the screening consortia tend to deny DIYB [biohacker] operators access to synthesized DNA. (…) Fourth, some high end customers in the US and Europe with established track records (…) do not outsource for synthesis services (…) these firms may inadvertently [be] weakening the effectiveness of consortial arrangements that rest on relatively concentrated industrial structure. (Ibid., 11–12).
On similar lines as the first point, Garfinkle and Knowles (2014, 538) state that currently the Select Agent list published by the US National Select Agent Registry Program “is reviewed every two years, and has been criticized for focusing on physical agents rather than the DNA sequences that may be more appropriate”.
This said, responsible governance of SynBio cannot reasonably be decided on the basis of risks alone. On one side of the picture are the safety of researchers, employers, consumers, animals and the environment; on the other, the freedom to pursue science and business, the non-instrumental value of knowledge, and the possible, probable and actualised benefits of the applications. The last-mentioned may include techniques and products to mitigate the biosecurity risks (see Mukunda et al. 2009).
Is There Nothing New Under the Sun?
Is there something new in SynBio, ethically speaking? This question has generated some debate and remains open. (Similar discussions have recently been common in other fields of the life sciences and emerging technologies, such as gene technology, nanotechnology and neuroscience.) According to Boldt and Müller (2008, 387), “the move from engineering organisms in which mere fractions of genomes have been manipulated to the point where significant portions have been designed by humans poses several new ethical issues (…) [W]e propose that synthetic biology raises other ethical questions, questions specific to the field”. Kaebnick and Murray (2013, 2, 11), in contrast, state that “[t]he work [in SynBio] raises a welter of ethical concerns, none of which are unprecedented, but which arise in synthetic biology in sharp and sometimes perplexing forms (…) [and] therefore do not constitute a new ethical inquiry”.
For those who propose that new kinds of questions do arise, the new ethical issues or aspects typically relate to constructing (or “creating”) life, as opposed to the earlier modification of life to fulfil certain human needs and wants. While much of this discussion deals with intrinsic concerns, there may be novel extrinsic considerations with policy implications as well. In particular, I have argued above that biosecurity considerations related to SynBio are partially new. The three developments that raise biosecurity risks warrant a review of the regulatory and oversight practices in the EU, specifically of the adequacy of the gene technology regulation in this respect. Besides SynBio, the conclusions also apply to genome editing and genetic engineering. The SynBioSecurity argument typically presented in the literature is an argument in favour of more stringent risk management. My suggestion here is that the lack of attention to biosecurity issues in gene technology regulation in the EU may be ethically problematic, or even irresponsible, in the light of the three recent developments.
Biosecurity in itself is not an unprecedented extrinsic issue in ethics, but in the case of SynBio it would benefit from further discussion and collaboration between different disciplines and regulatory fields. SynBioSecurity highlights a pressing question about the acceptable levels of risk-exposure of the general population. In particular, in this context there are possible but difficult-to-quantify harms, and possible actions of rogue individuals or groups that are practically impossible to supervise. How should the new risks be assessed and managed in a responsible manner? Earlier discussion has admittedly addressed catastrophic risks with extremely low probabilities and worst-case scenarios (e.g. Sunstein 2007; Posner 2004; Jonas 1984). What is different here, however, is that the probability is not necessarily minuscule; it is simply unknown.
One common response to such threats is to apply the precautionary principle, but this only raises the further question of what kind of precautionary measures would be justified in the face of SynBioSecurity risks. Generally speaking, precautionary measures are often thought to take the form of outright bans or phaseouts, moratoria, premarket testing, labelling, and requests for extra scientific information before proceeding. Another kind of precautionary response might be to establish new precautionary risk assessment methodologies. The focus is then not only on how to deal with identified threats, but also on the methods for anticipating and assessing threats in the first place (see Ahteensuu 2008).
One possible counter-argument to my position, and a common reaction against the precautionary principle as well, is that it is far from obvious that we should allocate regulatory resources and take pre-emptive actions in the case of merely possible risks (i.e. outcomes) whose probability remains unknown. There are other risks that are better known. Would it not then be better to allocate scarce resources (of oversight, for example) to these better-known risks, thereby ensuring effectiveness and risk reduction? However, given that the malicious use of synthetic or edited pathogens could possibly result in a pandemic (cf. the recent terrorist attacks in Europe), not knowing the probability of SynBioterrorism is even more discomforting than knowing it to be low or extremely low but possible.
Sometimes the possibility of catastrophe is highlighted and its minuscule probability downplayed. Such irresponsible rhetoric, which exploits our emotional responses and our cognitive limitations in reasoning about small probabilities, is exactly the opposite of what I suggest here. It is the not knowing whether the probability is minuscule, and perhaps negligible, that causes the concern and requires extra attention to risk management. It is of the utmost importance that the discussion on SynBioSecurity and the related risk communication proceed without inducing false or unnecessary panic.
Notes
Oligonucleotides are organic molecules consisting of a sequence of nucleotides (each composed of a nitrogenous base, ribose or deoxyribose, and at least one phosphate group), which are the basic building blocks of DNA and RNA.
The distinction is not as clear-cut as it has sometimes been presented. In a sense (less strict than simply considering consequences or not), intrinsic and extrinsic concerns can be intertwined. Sometimes what first appears to be an intrinsic concern turns out to be an extrinsic one under closer scrutiny. For example, a proponent of a religious version of the playing-God argument may, when pushed, appeal to a belief that when certain fundamental boundaries are crossed, nature will strike back, i.e. certain consequences commonly regarded as undesirable will follow from the imbalance inflicted. Other times a concern may embody both what might be called extrinsic and intrinsic features. A risk argument which says that a certain form of SynBio gives rise to intolerable risks to human health and thus should be prohibited may be based on an idea of a natural level of risk related to the background conditions an agent or a population faces in her/his/their daily lives. Lastly, the questions related to the patenting of techniques and synthetic DNA or, in the future, higher organisms fall under intrinsic concerns, but in the debate appeals are often made to their (possible, predicted and/or known) consequences. For example, a slippery slope-type of argument says that accepting patents on synthetic life forms, genes or genomes may change the way we view life, i.e. undermine the special moral status of (natural) living systems and the value that we ascribe to them. (For another kind of criticism of the distinction, see Bovenkerk 2012, esp. 22–3.) Noteworthy is also that although it is common to speak about intrinsic and extrinsic concerns, they might better be termed intrinsic and extrinsic arguments for or against SynBio. This is because consequence-based reasons and other reasons can be invoked to show that something should be prohibited, i.e. is morally problematic, etc., but also that something is morally desirable or even obligatory.
In short, extrinsic and intrinsic arguments cut both ways.
The Organisation for Economic Cooperation and Development (OECD) defines biosecurity as “[m]easures to protect against the malicious use of pathogens, parts of them, or their toxins in direct or indirect acts against humans, livestock or crops”.
In the United States, the Centers for Disease Control and Prevention (CDC) reported, in unpublished material, 395 potential release events at national laboratories working with select agents between 2003 and 2009 (https://www.nap.edu/read/13265/chapter/2#4, 5). In the United Kingdom, reports obtained from the Health and Safety Executive (HSE) reveal similar findings (Sample 2014). However, per lab worker or working hour, such risk situations are rare. Accidents causing significant harm, such as contracting a disease and especially death, are extremely rare.
Dual-use research refers to any research that has legitimate uses, but also brings about the possibility of use for malicious purposes. See e.g. Cirigliano et al.’s (2016) recent review paper, entitled “Biological Dual-Use Research and Synthetic Biology of Yeast”. It is admitted that even the present manuscript might contribute to the biosecurity risks, as it summarises and discusses recent developments in SynBio, some of the dual-use research papers, and more generally the prerequisites for SynBioTerrorism.
Synthetic pathogens are very difficult to construct, though not as difficult as, for example, making a nuclear bomb.
Three important organisations in the field are the International Gene Synthesis Consortium (IGSC), the International Association of Synthetic Biology (IASB), and the International Consortium for Polynucleotide Synthesis (ICPS) (see e.g. Oye 2012).
The Finnish Act on Export Controls of Dual-Use Products (1996/562). (In Finnish: Laki kaksikäyttötuotteiden vientivalvonnasta.)
See also Sissonen et al. 2012b. In regard to research and policy, Finland has, for example, the Research Centre on Biological Threats (in Finnish, Biologisten uhkien osaamiskeskus, BUOS) as part of the National Institute for Health and Welfare.
See e.g. Parens et al. 2009; EGE 2009; IRGC 2010; OECD 2014; the founding of the Ad Hoc Technical Expert Group (AHTEG) on Synthetic Biology; the online discussion group at https://bch.cbd.int/synbio/open-ended/discussion.shtml, esp. Topics 4 & 5; the SYNBIOSAFE project and the related Priority Paper, see www.synbiosafe.eu; the Green Paper on Bio-Preparedness by the European Commission in 2007; and the Inventory of EU Instruments Relevant for Addressing Chemical, Biological, Radiological and Nuclear Risks (“CBRN Inventory”) in 2008.
References
Ahteensuu, M. (2008). In dubio pro natura? A philosophical analysis of the precautionary principle in environmental and health risk governance. Reports from the Department of Philosophy, University of Turku, Painosalama. An E-version on https://oa.doria.fi/handle/10024/38158.
Ahteensuu, M., & Blockus, H. (2016). Biohacking and citizen engagement with science and technology. In M. Ahteensuu (Ed.), E pluribus unum: Scripta in honorem Eerik Lagerspetz sexagesimum annum complentis (pp. 16–34). Reports from the Department of Philosophy, University of Turku. Turku: Painosalama Oy. An E-version of the book downloadable on https://www.doria.fi/handle/10024/120589.
Boldt, J., & Müller, O. (2008). Newtons of the leaves of grass. Nature Biotechnology, 26, 387–389.
Bovenkerk, B. (2012). The biotechnology debate: Democracy in the face of intractable disagreement. New York: Springer.
Bügl, H., Danner, J. P., Molinari, R. J., Mulligan, J. T., Park, H.-O., Reichert, B., Roth, D. A., Wagner, R., Budowle, B., Scripp, R. M., Smith, J. A. L., Steele, S. J., Church, G., & Endy, D. (2007). DNA synthesis and biological security. Nature Biotechnology, 25, 627–629.
Cameron, D. E., Bashor, C. J., & Collins, J. J. (2014). A brief history of synthetic biology. Nature Reviews Microbiology, 12, 381–390.
CEC = Commission of the European Communities. (2000). Communication from the commission on the precautionary principle (Brussels 2nd February 2000 COM[2000]1).
Cello, J., Paul, A. V., & Wimmer, E. (2002). Chemical synthesis of poliovirus cDNA: Generation of infectious virus in the absence of natural template. Science, 297, 1016–1018.
Cho, M. K., Magnus, D., Caplan, A. L., McGee, D., & the Ethics of Genomics Group. (1999). Ethical considerations in synthesizing a minimal genome. Science, 286, 2087–2090.
Church, G. M., Elowitz, M. B., Smolke, C. D., Voigt, C. A., & Weiss, R. (2014). Realizing the potential of synthetic biology. Nature Reviews Molecular Cell Biology, 15, 289–294.
Cirigliano, A., Cenciarelli, O., Malizia, A., Bellecci, C., Gaudio, P., Lioj, M., & Rinaldi, T. (2016). Biological dual-use research and synthetic biology of yeast. Science and Engineering Ethics. doi:10.1007/s11948-016-9774-1.
CPB = Secretariat of the Convention on Biological Diversity. (2000). Cartagena protocol on biosafety to the convention on biological diversity: Text and annexes. Montreal.
Cressey, D. (2007). Not so secure after all. Nature, 448, 732–733.
Directive 2001/18/EC of the European Parliament and of the Council of 12 March 2001 on the Deliberate Release into the Environment of Genetically Modified Organisms and Repealing Council Directive 90/220/EEC.
Directive 2009/41/EC of the European Parliament and of the Council of 6 May 2009 on the Contained Use of Genetically Modified Micro-Organisms.
Douglas, T., & Savulescu, J. (2010). Synthetic biology and the ethics of knowledge. Journal of Medical Ethics, 36, 687–693.
EASAC = European Academies Science Advisory Council. (2011) Synthetic biology: An introduction. On www.easac.eu.
EEA = European Environment Agency. (2001). Late lessons from early warnings: The precautionary principle 1896–2000. On http://reports.eea.eu.int/environmental_issue_report_2001_22/en/Issue_Report_No_22.pdf.
EGE = European Group on Ethics in Science and New Technologies to the European Commission. (2009). Ethics of synthetic biology. Opinion No 25. Brussels.
Garfinkel, M. S., Endy, D., Epstein, G. L., & Friedman, R. M. (2007). Synthetic genomics: Options for governance. J. Craig Venter Institute.
Garfinkle, M., & Knowles, L. (2014). Synthetic biology, biosecurity, and biosafety. In R. L. Sandler (Ed.), Ethics and emerging technologies (pp. 533–547). London: Palgrave MacMillan.
Heavey, P. (2013). Synthetic biology ethics: A deontological assessment. Bioethics, 27, 442–452.
Herfst, S., Schrauwen, E. J. A., Linster, M., Chutinimitkul, S., de Wit, E., Munster, V. J., Sorrell, E. M., Bestebroer, T. M., Burke, D. F., Smith, D. J., Rimmelzwaan, G. F., Osterhaus, A. D. M. E., & Fouchier, R. A. M. (2012). Airborne transmission of influenza A/H5N1 virus between ferrets. Science, 336, 1534–1541.
Hsu, P. D., Lander, E. S., & Zhang, F. (2014). Development and applications of CRISPR-Cas9 for genome engineering. Cell, 157, 1262–1278.
Imai, M., Watanabe, T., Hatta, M., Das, S. C., Ozawa, M., Shinya, K., Zhong, G., Hanson, A., Katsura, H., Watanabe, S., Li, C., Kawakami, E., Yamada, S., Kiso, M., Suzuki, Y., Maher, E. A., Neumann, G., & Kawaoka, Y. (2012). Experimental adaptation of an influenza H5 HA confers respiratory droplet transmission to a reassortant H5 HA/H1N1 virus in ferrets. Nature, 486, 420–428.
International Risk Governance Council (IRGC). (2010). Guidelines for the appropriate risk governance of synthetic biology. Geneva. https://www.irgc.org/IMG/pdf/irgc_SB_final_07jan_web.pdf.
Jackson, R. J., Ramsay, A. J., Christensen, C. D., Beaton, S., Hall, D. F., & Ramshaw, I. A. (2001). Expression of mouse interleukin-4 by a recombinant ectromelia virus suppresses cytolytic lymphocyte responses and overcomes genetic resistance to mousepox. Journal of Virology, 75(3), 1205–1210.
Jefferson, C., Lentzos, F., & Marris, C. (2014). Synthetic biology and biosecurity: Challenging the “myths”. Frontiers in Public Health, 2(115), 1–15.
Jonas, H. (1984). The imperative of responsibility. In search of an ethics for the technological age. Chicago: University of Chicago Press.
Kaebnick, G. E., & Murray, T. H. (Eds.). (2013). Synthetic biology and morality: Artificial life and the bounds of nature. Cambridge: MIT Press.
Kaiser, J. (2015). U.S. high-containment biosafety labs to get closer scrutiny. Science (29th Oct.). http://www.sciencemag.org/news/2015/10/us-high-containment-biosafety-labs-get-closer-scrutiny.
Kelle, A. (2009a). Synthetic biology and biosecurity. EMBO Reports, 10, S23–S26.
Kelle, A. (2009b). Security issues related to synthetic biology: Between threat perceptions and governance options. In M. Schmidt, A. Kelle, A. Ganguli-Mitra, & H. de Vriend (Eds.), Synthetic biology: The technoscience and its societal consequences (pp. 101–119). New York: Springer.
Kwok, R. (2010). Five hard truths for synthetic biology. Nature, 463, 288–290.
Lusser, M., Parisi, C., Plan, D., & Rodríguez-Cerezo, E. (2012). Deployment of new biotechnologies in plant breeding. Nature Biotechnology, 30(3), 231–239.
Mukunda, G., Oye, K. A., & Mohr, S. C. (2009). What rough beast? Synthetic biology, uncertainty, and the future of biosecurity. Politics and the Life Sciences, 28(2), 2–26.
NSABB = National Science Advisory Board for Biosecurity. (2006). Addressing biosecurity concerns related to the synthesis of select agents. http://osp.od.nih.gov/sites/default/files/resources/Final_NSABB_Report_on_Synthetic_Genomics.pdf.
NSABB = National Science Advisory Board for Biosecurity. (2010). Addressing biosecurity concerns related to synthetic biology. http://osp.od.nih.gov/sites/default/files/resources/NSABB%20SynBio%20DRAFT%20Report-FINAL%20%282%29_6-7-10.pdf.
OECD. (2014). Emerging policy issues in synthetic biology. Paris: OECD Publishing.
Oye, K. A. (2012). Proactive and adaptive governance of emerging risks: The case of DNA synthesis and synthetic biology. A paper prepared for the International Risk Governance Council (IRGC).
Parens, E., Johnston, J. & Moses, J. (2009). Ethical issues in synthetic biology: An overview of the debates. Report by Woodrow Wilson International Center for Scholars and the Hastings Center.
Pennisi, E. (2012). 111 organizations call for synthetic biology moratorium. Science (13th March). http://www.sciencemag.org/news/2012/03/111-organizations-call-synthetic-biology-moratorium.
Pierce, R. L. (2012). Whose ethics of knowledge? Taking the next step in evaluating knowledge in synthetic biology: A response to Douglas and Savulescu. Journal of Medical Ethics, 38, 636–638.
Posner, R. A. (2004). Catastrophe: Risk and response. Oxford: Oxford University Press.
Presidential Commission for the Study of Bioethical Issues. (2010). New directions: The ethics of synthetic biology and emerging technologies. Washington, DC.
Randerson, J. (2006). Lax laws could allow assembly of deadly virus DNA. The Guardian (14th June). https://www.theguardian.com/world/2006/jun/14/terrorism.topstories3.
Sample, I. (2014). 100 safety breaches at UK labs handling potentially deadly diseases. The Guardian (4th Dec.). https://www.theguardian.com/science/2014/dec/04/-sp-100-safety-breaches-uk-labs-potentially-deadly-diseases.
Schmidt, M. (2009). Do I understand what I can create? Biosafety issues in synthetic biology. In M. Schmidt, A. Kelle, A. Ganguli-Mitra, & H. de Vriend (Eds.), Synthetic biology: The technoscience and its societal consequences (pp. 81–100). New York: Springer.
Scientific Committees/European Commission. (2014). Opinion on synthetic biology I: Definition. http://ec.europa.eu/health/scientific_committees/emerging/docs/scenihr_o_044.pdf.
Scientific Committees/European Commission. (2015a). Opinion on synthetic biology II: Risk assessment methodologies and safety aspects. https://ec.europa.eu/health/sites/health/files/scientific_committees/emerging/docs/scenihr_o_048.pdf.
Scientific Committees/European Commission. (2015b). Final opinion on synthetic biology III: Risks to the environment and biodiversity related to synthetic biology and research priorities in the field of synthetic biology. http://ec.europa.eu/health/scientific_committees/emerging/docs/scenihr_o_050.pdf.
Sissonen, S., Kinnunen, P. M., Vakkuri, A., Poutiainen, S., Raijas, T., Salminen, M., & Nikkari, S. (2012a). Biouhilta turvassa?—Säädökset suojaavat työntekijää ja yhteiskuntaa. Duodecim, 128, 2217–2223. (In English: Safe from biothreats? Legislation protects you and society).
Sissonen, S., Raijas, T., Haikala, O., Hietala, H., Virri, M., & Nikkari, S. (2012b). Biologisten aseiden kieltosopimuksen uudet haasteet. Duodecim, 128, 283–289. (In English: New challenges to the biological weapons convention).
Smith, K. (2013). Synthetic biology: A utilitarian perspective. Bioethics, 27, 453–463.
Sunstein, C. R. (2007). Worst-case scenarios. Cambridge: Harvard University Press.
Treaty on European Union. (1992). In Official Journal C 191, 35 (29th July). http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ:C:1992:191:FULL&from=EN.
Tumpey, T. M., Basler, C. F., Aguilar, P. V., Zeng, H., Solórzano, A., Swayne, D. E., Cox, N. J., Katz, J. M., Taubenberger, J. K., Palese, P., & García-Sastre, A. (2005). Characterization of the reconstructed 1918 Spanish influenza pandemic virus. Science, 310, 77–80.
Weiss, S., Yitzhaki, S., & Shapira, S. C. (2015). Lessons to be learned from recent biosafety incidents in the United States. IMAJ, 17, 269–273.
Acknowledgements
Parts of this paper are based on an earlier publication in Finnish: Ahteensuu, M. (2015). Synteettisen biologian etiikka: bioturvaamisnäkökohtia. Dosis, 31(4/2015), 228–240. I presented the new strain of the SynBioSecurity argument at an international conference “Bioethics: Preparing for the Unknown” (Kalamazoo, Michigan, 17th–18th March 2016) and at the Högre seminarier i Filosofi (Division of Philosophy at the Royal Institute of Technology [KTH], Stockholm, 13th April 2016). The comments of the participants and subsequent discussions with William Bülow, Björn Lundgren and Per Wikman-Svahn were highly useful. I would also like to thank two anonymous reviewers and Co-Editor-in-Chief Ray Spier for perspicacious and helpful suggestions as well as Susanne Uusitalo for checking the language.
Cite this article
Ahteensuu, M. Synthetic Biology, Genome Editing, and the Risk of Bioterrorism. Sci Eng Ethics 23, 1541–1561 (2017). https://doi.org/10.1007/s11948-016-9868-9