
1 Introduction

Complexity appears to be a characteristic and inherent feature of all living beings. The high degree of functionality and the intricate organization of biological systems have even been regarded as proof of the existence of an ingenious creator, named God (Paley 1802). Since Darwin, however, we know that life on earth is not the creation of an intelligent designer but rather the product of chance mutation and selection. Even more: evolution as a trial-and-error process resembles tinkering more than rational design, as François Jacob once insightfully remarked (Jacob 1977). Nevertheless, evolution has brought about all the astonishing phenomena of life that have both fascinated biologists and inspired engineers to technological inventions. In a modern view, however, biological complexity reaches even further. It not only refers to the inner organization of organisms but also encompasses their manifold interactions with other living beings and their common environment. This ecological complexity depends upon species diversity resulting from evolutionary adaptation and specialization. The complex structure of ecosystems was already recognized by Darwin:

It is interesting to contemplate an entangled bank, clothed with many plants of many kinds, with birds singing on the bushes, with various insects flitting about, and with worms crawling through the damp earth, and to reflect that these elaborately constructed forms, so different from each other, and dependent on each other in so complex a manner, have all been produced by laws acting around us (Darwin 1859).

Synthetic biology is an engineering technology based on living systems; it aims at the design and construction of novel biological parts, devices and systems for useful purposes, or at the redesign of existing, natural biological systems for the same purposes (definition of synthetic biology at http://www.syntheticbiology.org; Knight 2005). The idea of engineering living substances is not completely new in biology (Campos 2009), and ‘genetic engineering’ emerged as a scientific enterprise immediately with the advent of recombinant DNA technology in the 1970s (Jackson et al. 1972). However, when contemporary engineers revisited the field 30 years later, gene technology appeared to them as “still an expensive, unreliable and ad hoc research process” (Endy 2005). As a reaction to this perception, a manifesto (“Foundations for Engineering Biology”) was published to promote the transformation of biology into an engineering discipline (Endy 2005). This became possible because, in the meantime, reading and writing of DNA had become available on a large scale and at low cost (Pettersson et al. 2009; Tian et al. 2004), a development largely driven by the human genome sequencing project, which had pushed the development of new methods. This remarkable scientific and methodological progress not only provided hundreds of genome sequences but also paved the way to synthesizing complete genomes from scratch. The technological breakthrough in DNA technology finally attracted scientists from outside biology, especially scientists trained in the traditional fields of mechanical, chemical or electrical engineering, who were drawn into the new science of synthetic biology. It was these engineers who proposed to introduce into biology the principles of standardization, modularization and automation that had made the great successes of classical engineering possible in the 20th century (Endy 2005).

2 Getting Rid of Complexity

However, classical engineers deal with energy or inanimate matter, while biological engineers have to deal with living systems that are characterized by their astounding complexity. This immediately posed a problem for these engineers-turned-synthetic-biologists: while conventional biologists appear to be especially attracted by the complexity of living systems, engineers try to avoid unnecessary complexity as far as possible (Breithaupt 2006). These different points of view are best captured in a statement by Tom Knight, one of the driving figures of early synthetic biology and founder of the BioBricks registry:

Here is the difference between a biologist and an engineer: A biologist goes into the lab, studies a system and finds that it is far more complex than anyone suspected. He’s delighted; he can spend a lot of time exploring that complexity and writing papers about it. An engineer goes into the lab and makes the same finding. His response is: ‘How can I get rid of this?’ (Brown 2004).

Thus, the immense complexity of living systems appears to them more as a technical obstacle than as a scientific challenge. Engineering-oriented synthetic biologists want to streamline their synthetic creations and to get rid of the detritus of evolution. But can we actually eliminate the ‘messiness’ of biology? And what makes the biological substrate different from other substrates that we engineer? (O’Malley et al. 2008).

At least for some synthetic biologists the difference between biological substrates and those that are normally engineered is not so large. Some of them regard Nature itself as a technology:

Biology is the oldest technology. Throughout the history of life on Earth, organisms have made use of each other in sophisticated ways. Early on in this history, the ancestors of both plants and animals co-opted free-living organisms that became the subcellular components now called chloroplasts and mitochondria. These bits of technology provide energy to their host cells and thereby underpin the majority of life on this planet. (Carlson 2010)

Thus, natural systems built by biological evolution can also be seen as technology-based in an emphatic sense. This view is further corroborated by the analogies between the modular and layered structure of technical systems and the comparable design of living cells (Andrianantoandro et al. 2006). The different layers of parts, devices and modules of a computer (e.g., resistors, capacitors and transistors at the physical level; integrated circuits, logic gates and processors at higher levels) are compared with biological molecules, which are connected by biochemical reactions to form biological devices and modules. Thus, if biological cells are in themselves already organized as parts, devices and modules, then it appears rather natural to improve living systems further by implementing explicit technical standards.

3 Different Strategies to Reduce Complexity

Synthetic biology is often classified into different fields or branches according to certain criteria. Most popular is the distinction between top-down and bottom-up approaches. Top-down means the redesign of existing cells by downsizing and minimization, while bottom-up refers to all attempts to construct synthetic cells “from scratch.” Here, I divide synthetic biology according to its alternative strategies for dealing with biological complexity: (1) standardization and modularization, or (2) orthogonalization via biochemical or genetic alterations. Both strategies aim at reducing complexity but operate at different levels. Modularization implies the rigorous redesign of complex metabolic pathways and signalling networks to generate a highly integrated system whose behaviour is computable and thus largely predictable. This reduces the complexity of the inner organization of living beings, but does not affect interactions with natural organisms in the ecosystem. As an alternative, it has been proposed to isolate synthetic cells from interactions with other living systems by implementing a genetic firewall. This can be achieved by different means: the most extreme would be the construction of cells in which the genetic information is stored not in DNA but in alternative molecules commonly termed XNA (xeno nucleic acids). This would prevent any exchange of genetic information with natural biological systems, be it by mating or by horizontal gene transfer. Reduction of complexity occurs here at an ecological level, but does not necessarily require reducing the functional complexity of these cells. The two approaches are complementary rather than mutually exclusive, and intermediate solutions that reduce complexity at both the functional and the ecological level can be contemplated. For example, the construction of refactored cells based on a non-universal codon table combines modularity and genetic isolation. Both the controllability of synthetic cells and their genetic interactions in the natural environment have to be considered when assessing the potential risk of synthetic organisms. Therefore, the approaches to reducing complexity in synthetic biology differ not only in their general strategy but also in their underlying concepts concerning safety and security. I will briefly touch on this aspect at the end of this contribution.

4 Standardization and Modularization

Even purely technical systems sometimes display unwanted behaviour if their degree of complexity exceeds a certain level. This unpredictability usually results from the interplay of the large number of interacting parts and components. In biological systems, phenomena like stochastic noise and chance mutations further enhance this unpredictability (Maheshri and O’Shea 2007; Raj and van Oudenaarden 2008; Eldar and Elowitz 2010). Therefore, reduction of unnecessary complexity is a good means of gaining better control over large-scale systems. The most prevalent approach in synthetic biology is the implementation of classical engineering standards. Living systems are to be transformed into controllable technical devices by refactoring their genomes on the basis of a minimal chassis cell. These streamlined cells can then be put to useful purposes such as medical applications, biofuel production or the detoxification of environmental pollutants.

One of the first examples of this streamlining was the refactoring of bacteriophage T7 (Chan et al. 2005). In software technology, refactoring means restructuring existing computer code to improve readability and reduce complexity. In the case of bacteriophage T7, all overlapping genes were disentangled and ordered in a linear fashion. The viability and virulence of the refactored phage demonstrate that the redesigned phage maintained the key features of the original and was still able to complete its life cycle. At the same time, this indicates that the bacteriophage genome contains no hidden features or genetic elements that had been overlooked. In addition, the refactored genome is much simpler to model and to manipulate. Refactoring of existing biological systems therefore also helps to elucidate the inherent design principles of natural living systems. This knowledge can then be used to design and construct novel cells that have never existed before, with properties not yet realized in nature. The situation resembles the enormous progress made in organic chemistry in the mid-nineteenth century, where the interplay between analysis and synthesis and the understanding of fundamental principles of chemical structure and reactivity allowed the synthesis of artificial organic molecules that did not exist in nature, such as polymers and pharmaceuticals (Yeh and Lim 2007).
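To make the software analogy concrete, the following toy sketch illustrates what “disentangling overlapping genes” amounts to as an operation on sequence data. The sequence and gene coordinates are invented for illustration and do not correspond to the real T7 genome map.

```python
# Illustrative sketch only: genome "refactoring" in the software sense,
# using an invented sequence and invented gene coordinates (not T7 data).
# Overlapping genes are separated into a linear, non-overlapping layout,
# each gene receiving its own copy of the formerly shared region.

genome = "ATGGCTAAATGACCGTTGGCATAA"  # toy sequence

# Hypothetical annotation: (name, start, end); gene1 and gene2 overlap.
genes = [("gene1", 0, 15), ("gene2", 8, 24)]

def refactor(sequence, annotation):
    """Lay out each gene one after another, in genome order, so that
    every coding region can be modelled and edited independently."""
    ordered = sorted(annotation, key=lambda gene: gene[1])
    return "".join(sequence[start:end] for _, start, end in ordered)

print(refactor(genome, genes))  # gene1 followed by gene2, overlap duplicated
```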

To refactor existing living systems or to design novel cells, synthetic biologists draw on the classic repertoire of engineering principles for reducing complexity. The most important aspect is the use of standardized modules whose functional properties are known and can be described quantitatively (Canton et al. 2008). Modularization has to be achieved at all levels, i.e. at the level of parts, devices and systems. Only then can these parts and devices be combined in all possible ways to construct higher-order systems with useful properties. To integrate such standardized parts and devices into more complex systems, engineers working at different levels use an information hierarchy that facilitates communication (Endy 2005). This is possible because synthetic biologists can use these standardized parts and devices without full knowledge of their interior design. Since all modules are described in their functional properties in quantitative terms and use standardized input/output systems, they can effectively be regarded as “black boxes” (Endy et al. 2005). The information hierarchy is perhaps best illustrated by engineers collaborating in the construction of computers: only standardization guarantees that engineers working at a high level of system integration, e.g. on the architecture of central processing units, can communicate with engineers designing logic gates, and vice versa.
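As a rough illustration of what “black box” composition means in practice, the following sketch models two characterized parts as quantitative transfer functions and wires them together through a standardized signal. Hill-type models are a common way to describe such parts, but all parameter values here are invented for illustration, not taken from any real part datasheet.

```python
# Illustrative sketch only: composing characterized parts as "black boxes"
# with standardized input/output signals. Parameters are hypothetical.

def repressor(k_half, n, max_output):
    """A repressible promoter as a transfer function: output falls as the
    repressor input rises (a standard Hill-type model)."""
    return lambda signal: max_output / (1.0 + (signal / k_half) ** n)

# Two independently characterized devices with invented parameters.
device_a = repressor(k_half=10.0, n=2, max_output=100.0)
device_b = repressor(k_half=40.0, n=4, max_output=100.0)

# Because input and output use the same standardized signal units, the
# devices can be wired without any knowledge of their interior design.
def chain(signal):
    return device_b(device_a(signal))

for level in (0.0, 10.0, 100.0):
    print(level, round(chain(level), 1))  # two inverters restore the signal
```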

Standardization in synthetic biology is best exemplified by the BioBricks registry (http://partsregistry.org/) and the international student competition iGEM (Smolke 2009). This steadily expanding open-source repository of DNA sequences provides standardized biological parts and devices that can be used to assemble larger functional modules (Knight 2003; Canton et al. 2008). The long-term goal of this endeavour is to provide a toolbox for designing whole cells. While refactoring of small genomes such as those of bacteriophages might be achieved in a single step, redesign of cells is normally accomplished in two steps. First, a cell with a minimal genome is constructed. Such a cell would contain only the essential biological pathways, to avoid any adverse effects that might arise through interference with other pathways and metabolic processes. This makes the behaviour of the minimal cell much more predictable. The functionality of the simplified cell can then be expanded by implementing additional genes designed for specific purposes. It thus serves as a reliable platform (chassis) for the build-up of tailor-made cells with useful properties. The idea of a chassis is central to these novel cells, designed by rational principles and not by contingent evolution. The creation of a bacterial cell controlled by a chemically synthesized genome (Gibson et al. 2010) demonstrates that it is even feasible to refactor a whole bacterial cell by designing its complete genome. This provides nearly unlimited possibilities to endow a minimal cell with all genes necessary and sufficient for stable and robust growth.

5 Orthogonalization and the “Genetic Firewall”

Another important aspect of refactoring is orthogonalization. The term is used in analogy to the design of electric circuits, where crosstalk between signalling channels has to be avoided for proper functioning. In synthetic biology, orthogonalization means the elimination of any unwanted interaction between components of biological processes that occur concomitantly in the same cell. If minimal chassis cells are endowed with new features by implanting synthetic parts and devices, it must be guaranteed that these novel functions affect neither each other nor the basal metabolism of the chassis cell. This requires careful design of all parts and components and can, for example, be achieved by using genes or proteins from unrelated species that are unlikely to interact with compounds of the host cell. Alternatively, synthetic signalling molecules can be rationally designed on the basis of existing sets of protein kinases and DNA-binding proteins (Dueber et al. 2004; Pryciak 2009; Kiel et al. 2010; Lim 2010; Slusarczyk et al. 2012). Synthetic expansion of the genetic code enhances the level of orthogonality even further: mutually orthogonal aminoacyl-tRNA synthetase/tRNA pairs have been generated in vivo to expand the genetic code, allowing the selective incorporation of unnatural amino acids into proteins in vivo (Neumann et al. 2010).
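The notion of mutual orthogonality can be made tangible with a small sketch: think of a crosstalk matrix between synthetases and tRNAs, which should be strong on cognate pairs and negligible everywhere else. All efficiencies below are invented for illustration, not measured data.

```python
# Illustrative sketch only: orthogonality as the absence of crosstalk.
# Entries are hypothetical charging efficiencies between synthetases
# (first element) and tRNAs (second element), not real measurements.

charging = {
    ("host_synthetase", "host_tRNA"): 1.00,
    ("host_synthetase", "xeno_tRNA"): 0.02,  # crosstalk: should be low
    ("xeno_synthetase", "host_tRNA"): 0.03,  # crosstalk: should be low
    ("xeno_synthetase", "xeno_tRNA"): 0.95,
}

def is_orthogonal(matrix, threshold=0.1):
    """The pairs are mutually orthogonal if every cognate interaction is
    strong and every non-cognate interaction stays below the threshold."""
    for (enzyme, trna), efficiency in matrix.items():
        cognate = enzyme.split("_")[0] == trna.split("_")[0]
        if cognate and efficiency < 1.0 - threshold:
            return False
        if not cognate and efficiency > threshold:
            return False
    return True

print(is_orthogonal(charging))  # True: crosstalk is negligible
```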

Orthogonality can also be achieved at the level of whole cells. In this context, orthogonality denotes the biochemical and/or genetic isolation of synthetic cells from natural organisms, reached by targeted alterations of basal metabolic and genetic processes. The most far-reaching alteration is the construction of cells based on a chemistry distinct from that of natural organisms. The major challenge of such a ‘xenobiological’ approach is to construct “natural” cells from unnatural substances. In this respect, xenobiologists even claim to be the proper synthetic biologists, because the engineering branch of synthetic biology merely takes interchangeable parts from natural biology and assembles them into systems that function unnaturally (Benner and Sismour 2005). As mentioned above, the major advantage of any xenobiological approach is the general isolation of these new forms of life from the natural world. Due to the changes in their information-storage molecules, these cells are “invisible” to conventional biological systems and can thus be regarded as environmentally safe (Schmidt 2010). The xenobiological concept of biosafety by chemistry has also been propagated under the slogan “The farther, the safer”: synthetic species whose chemical constitution deviates as far as possible from that of natural species carry the least risk of dissemination into and contamination of wild habitats, including the human body (Marlière et al. 2011; Herdewijn and Marlière 2009).

Even if fully xenobiological cells still seem far away, other options for achieving orthogonality at the level of the whole organism have already been realized. To name but a few: incorporation of unusual or even toxic bases into the DNA has been shown to generate “chemically modified organisms” (Marlière et al. 2011); elimination of one of the three translation termination codons has already been accomplished in E. coli (Isaacs et al. 2011; Lajoie et al. 2013b); and the limits of genetic recoding in essential genes have been probed (Lajoie et al. 2013a). The feasibility of creating cells controlled by a chemically synthesized genome (Gibson et al. 2010) even makes it possible to reassign more than a single codon. At least theoretically, an organism using a genetic code completely different from the universal one would be totally isolated from any exchange with other living beings.
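The logic of such recoding can be sketched in a few lines: every occurrence of one stop codon is replaced by a synonymous one, freeing the original codon for reassignment. The sequences below are toy examples; recoding a real genome, as in the E. coli work cited above, additionally has to respect overlapping genes and regulatory elements.

```python
# Illustrative sketch only: genome-wide codon reassignment in the spirit
# of the amber-codon removal in E. coli (Isaacs et al. 2011). The ORFs
# are invented toy sequences, not real genes.

toy_orfs = ["ATGGCTTAG", "ATGAAATAG", "ATGTGA"]

def recode(orfs, old_stop="TAG", new_stop="TAA"):
    """Replace one stop codon by a synonymous one, codon by codon, so the
    old codon becomes unused and free for reassignment."""
    recoded = []
    for orf in orfs:
        codons = [orf[i:i + 3] for i in range(0, len(orf), 3)]
        recoded.append("".join(new_stop if c == old_stop else c for c in codons))
    return recoded

print(recode(toy_orfs))
# ['ATGGCTTAA', 'ATGAAATAA', 'ATGTGA'] -- TAG no longer terminates any
# gene and could, e.g., be reassigned to an unnatural amino acid.
```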

George Church, one of the leading figures in synthetic biology, raised in his book “Regenesis” the idea of creating mirror-image bacterial cells or even mirror-image humans (Church and Regis 2012). While the latter is clearly out of the question, the former might be an attractive option for creating fully viable bacterial cells that are isolated from the natural environment. According to the laws of physics and chemistry, such cells would behave exactly like the wild-type form, with the sole exception that all their biochemical molecules would be stereoisomers of their natural counterparts. One immediate advantage of such cells would be their complete resistance to attack by all existing bacteriophages. This demonstrates that orthogonalization can serve as a biosafety tool and allows the construction of synthetic cells that retain their full inherent complexity, without any need to worry about their genetic interaction with the natural world. The concept of a genetic firewall does not reduce the inherent complexity of a biological cell but only minimizes its interaction with the environment.

6 Safety Aspects

As mentioned above, the different strategies to reduce complexity in synthetic biology come along with alternative concepts concerning the safety and security of these synthetic constructs. For engineers, the computability and predictability of refactored cells guarantee controllability. Chemists and biologists, however, are aware of the enormous complexity of living systems and might place more trust in genetic and biochemical firewalls. Both concepts have their pros and cons and might apply differently to specific applications. For example, the safety of cells that are cultivated in closed containments (fermenters) will be assessed differently from that of cells intended for use in the open environment.

6.1 Safety by Computability and Predictability

A major claim of engineering-style synthetic biology is that its methods will guarantee high predictability and reproducibility. This claim rests on the use of modularized and standardized parts that have been quantitatively characterized and thus provide a high degree of computability. The major tenet of this concept is therefore that reduction of complexity by rational design gives us control over these cells. They can thus be compared with a technical “system” whose behaviour is determined by its technical parameters. Quantitative description of the modules and knowledge of their interactions within a network allow us to predict the future states of such a system with high precision and reliability. This concept has its roots in the world view of engineers: complex systems like computers are built from simple components like graphics cards, processing units, integrated circuits, transistors etc. Each of these components is very well characterized and functions deterministically. This engineering principle allows the precise construction of highly integrated systems like airplanes, which we board routinely, fully confident of their fail-safe design.

Within such a technical approach, however, a biological risk may arise not only from whatever inherent unpredictability an organism retains in spite of all engineering; recent experiences with large-scale projects involving highly complex technologies, such as nuclear power plants or large electric power transmission grids, point to a second source of risk. We experience in our daily lives that even small-scale technology (like personal computers) often crashes. In that case, the computer just needs to be rebooted. Failure of a power grid, however, may result in nation-wide blackouts, and failure of a nuclear power plant may even render large areas uninhabitable. In all these cases it is the sheer size of these highly integrated modular systems that inherently bears a risk of unpredictable behaviour.

While genetically modified organisms in which only one or a few genes have been manipulated or introduced may be regarded as safe, the high complexity of organisms carrying diverse genes of different origin, or even designed genes with no natural counterparts, may carry a risk similar to that of highly complex technical systems. This does not necessarily increase the actual risk in terms of potential damage or danger, but it results in a residual unpredictability that may inevitably cling to artificial cells. In contrast to many technical systems, for which risk assessments can be made more or less precisely (even for worst-case scenarios), this appears difficult for synthetic biology. The potential damage (if any) is hard to estimate, and at the same time the probability of occurrence is nearly indeterminable. These systems are therefore afflicted with uncertainty rather than with risk. Besides these deliberations on safety, one might also have to consider security aspects. All technically “useful” devices can be misused by malevolent parties as weapons or for terrorist attacks. Therefore, dual-use aspects of purportedly harmless material have to be considered under these assumptions. But this goes beyond the scope of this contribution.

6.2 Safety by Genetic Isolation: “The Farther, the Safer”

One of the strong arguments for following the path of genetic isolation is the safety aspect captured in the motto “The farther, the safer” (Marlière et al. 2011; Herdewijn and Marlière 2009). The idea behind this approach is to keep all the unpredictability inherent to life, but to control it by efficiently preventing any interaction between artificial cells and natural cells. This can be achieved by different means and to different degrees. Such artificial living beings are thus separated from nature by a genetic fence. Although this approach might be theoretically tight, it leaves many observers with the same feeling one has watching wild animals in a zoo behind a glass window or a moat. One might think: “What happens, if…?” The mere announcement of the possible creation of mirror-image cells has already sparked such a reaction: “Mirror-image cells could transform science—or kill us all” (http://www.wired.com/magazine/2010/11/ff_mirrorlife/). Although we cannot predict by which means a genetic firewall might fail, the public is left with a strong feeling of uncertainty. In this case the fear is further heightened by the unfamiliarity of such artificial creatures. Thus, the alien character of xenobiological organisms might be a severe disadvantage in any biosafety and biosecurity debate, even though “the farther, the safer” is claimed to be a principle that makes synthetic biology less dangerous. The gain of having organisms that are unable to communicate or admix with natural beings might be far outweighed by the public’s fear of the unknown.

7 Which Risk Remains?

As we have seen, it will be difficult to assess the risk of synthetic biology in general. Synthetic biology, as an engineering technology based on living systems, claims that reduction of complexity, in one way or the other, is the best way to create living cells that do not harm humankind or the environment. However, it is well known that even in classical engineering the construction of ever more complex systems is accompanied by an increase in inherent instability and uncertainty. Even for mathematics it has been proven that every sufficiently powerful consistent axiomatic system contains statements that can be neither proved nor disproved. Thus we are left with a level of uncertainty even in a world of seemingly complete predictability. It may therefore be less important for synthetic biology to assure the public of the general safety of its approaches than to implement additional control mechanisms and information duties that may strengthen public faith in this emerging technology.