
3.1 Context

Thousands of chemicals need to be evaluated for regulatory purposes. For example, large endeavours such as the European Union’s Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) program, the U.S. EPA ToxCast program, and the Chemicals Management Plan (CMP) in Canada were implemented in recent years to address legislative obligations and take action on chemicals believed to be harmful. However, these regulatory programs face major hurdles. Foremost, the number of chemical substances for which toxicity data are required is tremendous and backlogged (e.g., 85,000 on the U.S. Toxic Substances Control Act inventory; 23,000 under Canada’s Domestic Substances List; 107,000 chemicals manufactured in or imported into the EU in quantities exceeding 1000 tons). This number continues to grow, and is substantially higher when considering the complex environmental samples (e.g., effluents) that need testing.

Historically, testing chemicals has relied on in vivo studies that use whole animals. In many respects, in vivo toxicity testing responds to the concept of “one problem, one test” (Hartung 2009), which implies that a single animal study is conducted to relate the effects of a single chemical with a single adverse outcome. A major consequence of this is that only a few classes of contaminants have been subjected to intensive testing. There remain thousands of chemicals (including mixtures) for which few or no test data are available (Judson et al. 2009). In addition, these types of studies yield findings that are largely descriptive, and the work is time consuming and prohibitively costly. For example, the U.S. EPA estimates that traditional testing of a single chemical may take 4 years and cost $1–20M USD (Martin et al. 2012). A realistic-case scenario for the EU REACH program calculates the need for 54 million vertebrate animals and $13.6B USD to achieve registration goals (Rovida and Hartung 2009). These realities represent major barriers to fulfilling legal obligations to manage chemicals.

The aforementioned limitations have been recognized by the U.S. National Research Council (NRC) in their document entitled “Toxicity Testing in the 21st Century: A Vision and a Strategy” (NRC 2007). The main outcome of this NRC document was the recommendation of a new, predictive strategy as the cornerstone of 21st century toxicity testing. This predictive strategy is based on understanding and applying in vitro toxicity assays that predict cellular-level effects, which can then be extrapolated to effects on individuals. It de-emphasizes the need to base assessments on animal tests, thus promoting the 3-Rs principle for humane animal research that was developed over 50 years ago (Russell et al. 1959). This new strategy harnesses recent advancements in the fields of cellular and molecular biology, toxicology, and computational biology, among others. For example, advances in measurement technologies and fundamental toxicological understanding at the molecular level (i.e., transcriptomics, proteomics, metabolomics) have increased the amount and types of information available and potentially useful to risk assessors (Ankley et al. 2010). These are now contributing towards the development of New Approach Methodologies (NAMs), as discussed in a recent workshop by the European Chemicals Agency (ECHA 2016).

A major recommendation of the NRC report was the expansion and utilization of in vitro tools in chemical risk assessment. In particular, the report articulated a need to establish in vitro tests that can prioritize, screen and evaluate a large number of chemicals (i.e., thousands) in a relatively short period of time (days to weeks). Regarding in vitro tests that span a multitude of molecular, biochemical and physiological systems, the expectation is that advanced computational and bioinformatics platforms could integrate the complex data streams and predict whole-organism impacts. Such a plan lies at the heart of predictive toxicology. This is the basis of an ambitious program launched by the U.S. Environmental Protection Agency in 2007 called Toxicity Forecaster (ToxCast™) (Judson et al. 2010). As detailed elsewhere (Dix et al. 2007), ToxCast is comprised of several in vitro, automated chemical screening technologies that provide a cost-effective and rapid approach to screen for changes in biological activity in response to chemical exposure. The program has nearly 1000 high-throughput and automated assays in its repertoire that cover approximately 300 signalling pathways. The program has screened thousands of chemicals including 300 well-studied chemicals that have undergone extensive animal testing (Phase 1, Proof of Concept; Judson et al. 2010; Martin et al. 2011; Sipes et al. 2011; Kavlock et al. 2012; Padilla et al. 2012), >2000 chemicals from a broad range of sources including consumer products, green chemicals, and food additives (Phase 2; Rotroff et al. 2013; Sipes et al. 2013), and ~800 chemicals that are known or suspected endocrine disruptors (E1K library; Karmaus et al. 2016). In a recent paper, ToxCast scientists screened 10,000 chemicals (15 concentrations of each chemical in 3 independent experiments) through 30 different cell-based assays (Huang et al. 2016), and components of the testing platform are reported to be capable of screening 10,000 chemicals within a week (Attene-Ramos et al. 2013). Performing the same work in animals would have taken years and millions of dollars. Clearly the cost/performance ratio makes these attractive as tools to screen, prioritize and evaluate a large number of chemicals, and thus meet regulatory obligations as well as help satisfy societal concern.

The development of NAMs, particularly new in vitro tools for testing chemicals such as those referred to above, has focused near-exclusively on human health applications. Unfortunately, they are of limited use in the ecological sciences, in which many more species (and their complex interactions) are under scrutiny. Very few in vitro toxicity testing tools exist for the most standard ecotoxicological test species, and there is almost nothing for native species of ecological relevance. This is problematic since the extrapolation of results across species (i.e., from standard test species to native species of ecological relevance) introduces tremendous uncertainty, as does extrapolation from controlled laboratory tests to real-world environments (Villeneuve and Garcia-Reyero 2011). For example, native bird species can be more sensitive or respond differently to chemicals than the standard lab model (Head et al. 2008). These types of differences complicate decision-making and often necessitate additional testing.

There is a clear need to accelerate the development and application of novel in vitro toxicity testing tools for the purposes of ecological risk assessment, and this has been recognized by leading scholars in the field (Villeneuve and Garcia-Reyero 2011). As such, the purpose of this chapter is to describe cell-free assays and propose them as a species-agnostic, in vitro toxicity-testing tool of potential relevance to ecological risk assessment. The chapter describes cell-free tests and how they are conducted, and also provides examples from the literature. In doing so, the chapter aims to show that cell-free tests are an attractive tool that can be used in predictive ecotoxicology, especially considering the limited availability of test organisms (particularly species that are at-risk, difficult to maintain in captivity, etc.), the lack of proven cell-based tools (e.g., cell cultures and cell lines), societal concerns over animal testing, the sheer number of ecological species to study, and vast inter-species differences.

3.2 Description of Cell-Free Assays

Cell-free assays are simplified in vitro platforms that can help evaluate the effects of a test chemical on a biochemical process. A number of other in vitro approaches are also employed in toxicology, such as primary cell cultures and immortalized cell lines. These have the advantages of better retaining in vivo tissue-specific characteristics and, in the case of cell lines, longevity, thus in some cases facilitating the study of functional pathways (Bhogal et al. 2005) (Fig. 3.1). However, over time they tend to lose in vivo properties, and cell lines are available for only a select number of species suited to laboratory studies. In comparison, cell-free platforms, typically run on tissue homogenates, cell lysates or purified molecules, may represent an over-simplified approach; with careful design, however, these assays can provide complementary and useful mechanistic information on the nature of biochemical interactions (e.g., whether the chemical acts as an agonist or antagonist of a target receptor).

Fig. 3.1 Schematic presentation of the main differences among animal-based, cell-based and cell-free studies (Adapted from Englebienne (2005))

Here we briefly describe the steps involved in running a common cell-free assay, focusing on radioligand binding to a neurochemical receptor (Fig. 3.2). While assays may be performed on other organ systems, we focus on the nervous system and draw upon examples based on previous work by our group (Basu et al. 2009; Rutkiewicz et al. 2011; Arini et al. 2016). Briefly, for receptor binding assays, cellular membranes are isolated by homogenizing cerebral tissues in a 1:10 solution of buffer and then centrifuging the homogenate to isolate pellets, which are then washed, re-suspended, and frozen until use. When needed, the cellular membrane preparations are thawed and diluted to an optimal concentration, and then added to microplates that contain a glass filter bottom. The membranes are incubated with radioligands specific for the target of interest. The incubation conditions vary depending upon the particular assay (e.g., length of incubation, temperature, buffers and assay cofactors). Following the incubation period, vacuum is applied to the well, thus separating the bound radioactive ligand (i.e., the receptor-ligand complex is trapped on the filter) from the unbound ligand, which passes through the filter. The radioactivity retained by the filter provides an index of binding. Specific binding to receptors is defined as the difference in radioligand bound in the presence and absence of excess amounts of an unlabelled displacer. These assays can then be run in the presence of a test chemical to determine if that substance impairs ligand-receptor interactions. A range of biochemical parameters can be investigated, such as ligand affinity and saturation kinetics, and the inhibitory (or potentiating) effects of a test chemical on such parameters can be quantified.
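To make the quantification concrete, the following is a minimal sketch (not the authors’ published analysis pipeline) of how specific binding and an IC50 for a test chemical could be estimated from filter-retained radioactivity counts; the counts, concentrations, and variable names are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import curve_fit

# Filter-retained counts (counts per minute, CPM) for radioligand alone ("total")
# and with an excess of unlabelled displacer ("nonspecific"); values are made up.
total_cpm = np.array([5200.0, 5100.0, 5300.0])
nonspecific_cpm = np.array([800.0, 820.0, 790.0])

# Specific binding = total binding - nonspecific binding (the definition used above).
specific_cpm = total_cpm.mean() - nonspecific_cpm.mean()
print(f"Specific binding: {specific_cpm:.0f} CPM")

# Hypothetical concentration-response data: specific binding (% of control)
# measured in the presence of increasing concentrations (M) of a test chemical.
conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5, 1e-4])
binding_pct = np.array([98.0, 95.0, 80.0, 48.0, 20.0, 8.0])

def inhibition_curve(c, top, bottom, ic50, hill):
    """Four-parameter logistic (Hill) model for competitive inhibition."""
    return bottom + (top - bottom) / (1.0 + (c / ic50) ** hill)

popt, _ = curve_fit(inhibition_curve, conc, binding_pct,
                    p0=[100.0, 0.0, 1e-6, 1.0], maxfev=10000)
print(f"Estimated IC50: {popt[2]:.2e} M")
```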

Fig. 3.2 Schematic representation of cell-free receptor binding assays, in presence or absence of a test chemical

A great advantage of cell-free assays is that they are amenable for use with any species from which tissue can be obtained. This is especially useful for ecological species that are difficult to maintain under laboratory conditions or for which limited data exist. As an example, one gram of brain tissue can yield enough cell-free extract to populate ~5000 wells in standard microplates (~50 plates), which can then be used to study hundreds of test chemicals. Cell-free assays can be performed on field-collected specimens, with many assays being relatively unaffected by post-mortem delays and storage conditions. For example, several components of the cholinergic, dopaminergic, GABAergic and glutamate pathways were found to be stable for several weeks under various storage and temperature conditions (Stamler et al. 2005) and not affected by post-mortem delays of up to 36–72 h (Piggott et al. 1992; Rutkiewicz and Basu 2012).
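As a rough, back-of-the-envelope illustration of this throughput (the plate format, replicate and concentration counts below are assumptions added for the arithmetic, not figures from the original studies):

```python
# Illustrative throughput estimate from ~1 g of brain tissue.
wells_per_gram = 5000            # ~wells of membrane preparation per gram of tissue (from text)
wells_per_plate = 96             # standard microplate format (assumption)
points_per_curve = 8             # assumed concentrations per test chemical
replicates = 3                   # assumed technical replicates per concentration

plates = wells_per_gram // wells_per_plate                     # ~52 plates
chemicals = wells_per_gram // (points_per_curve * replicates)  # ~208 chemicals
print(f"~{plates} plates, enough for ~{chemicals} chemicals in full concentration-response")
```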

3.3 Applications of Cell-Free Assays

Cell-free assays have been used in a number of biomedical applications, and here we provide select examples. Cell-free assays have been used to study signal transduction via G-protein coupled receptors (GPCRs), the commercial interest of which lies in areas such as drug targeting, high-throughput screening systems and biosensors (Leifert et al. 2005). A unique approach where synthetic biology intersects with toxicology has been the development of cell-free protein synthesis (CFPS) platforms (Schmidt and Pei 2011). In these systems, proteins of interest are synthesized under controlled conditions in which they can be actively monitored and rapidly sampled (Schmidt and Pei 2011). First developed with E. coli extracts, known as S30 extracts, a current example is Cytomim, an E. coli cell-free platform that can be used to produce protein therapeutics, toxins and other biochemicals that are difficult to make in vivo because of their toxicity or complexity (DeVries and Zubay 1967; Schmidt and Pei 2011). A final example is the use of purified enzymatic systems from fungi and bacteria, together with nuclear magnetic resonance, to determine the catabolism and biodegradation of fluorinated aromatic compounds and provide information on their fate in the environment (Murphy 2007). Together, these examples showcase the breadth and versatility of cell-free platforms. Given the chapter’s objective, we restrict the following sections to the application of cell-free tests to the toxicological testing of chemicals, particularly for ecological risk assessment. For more information on synthetic biology approaches see Chap. 19.

Arguably the most concerted effort to use cell-free assays has been through the U.S. EPA’s ToxCast program that was briefly introduced earlier. The cell-free methods in ToxCast have been performed using Novascreen from Caliper Biosciences (Judson et al. 2010; Knudsen et al. 2011). Chemicals were evaluated in cell-free assays covering approximately 300 signalling pathways: 77 G-protein coupled receptor (GPCR) binding assays; 32 CYP-450-related enzyme activity assays; enzymatic assays for 72 kinases, 22 phosphatases, 15 proteases, 6 histone deacetylases (HDACs), 3 cholinesterases, and 14 other enzyme activities; 18 nuclear receptor binding assays; 20 ion channel and ligand-gated ion channel activities; and 9 transporter proteins, 2 mitochondrial pore proteins, and 2 other receptor types (Kavlock et al. 2012). First, a single concentration of each test chemical was run through the assays. Second, a concentration-response assay was conducted for all active and some selected inactive calls. Data from these assays are available online via the ToxCast Database. Toxicity signatures from ToxCast are defined and evaluated by how well these in vitro signals predict adverse outcomes in toxicity pathways relevant to human health. It is hoped that molecular initiating events, as realized via in vitro results, may be predictive of apical outcomes relevant to the whole organism. Some ToxCast studies have paid specific attention to making such in vivo and in vitro comparisons. For example, Knudsen et al. (2011) ran 292 high-throughput cell-free assays to evaluate 320 environmental chemicals. In vitro data from acetylcholinesterase assays were compared to in vivo data available in the literature for rats and humans. A qualitative association between in vitro and in vivo activity was evident for 16 of 17 (94%) chemicals studied, and the authors concluded that the in vitro results generally predicted the in vivo situation to a reliable extent. Silva et al. (2015) compared GABA(A) binding, dopamine binding and AChE activity after in vivo and in vitro exposure to two pesticides (endosulfan and methidathion). This study showed good concordance between in vitro and in vivo results for dopamine pathways with endosulfan exposure. However, in other cases in vitro results were less representative of in vivo effects. The authors showed that some in vitro assays from ToxCast resulted in false negatives for several critical endpoints. For instance, there is a strong body of evidence in the literature relating endosulfan exposure to estrogenic and anti-androgenic effects in vivo, including receptor binding, whereas endosulfan was reported as being active in only a minimal number of ToxCast assays (Silva et al. 2015). The authors suggested that the discrepancy between in vivo and in vitro responses was likely due to a lack of metabolic activation and limitations in assay design. ToxCast was designed as a collaborative effort; hence, discrepancies could also have resulted from the different analytical approaches or assay types used by the collaborating teams to interpret the data, which could affect whether a chemical is defined as having a positive or negative effect.
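The two-stage screening logic described above (a single-concentration screen followed by concentration-response testing of active calls) can be sketched as follows; the threshold, data, and function names are illustrative assumptions and do not reproduce the actual ToxCast analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def is_active(response_pct, threshold=30.0):
    """Stage 1: flag a chemical as 'active' if its single-concentration response
    (e.g., % inhibition of binding or enzyme activity) exceeds an assumed threshold."""
    return abs(response_pct) >= threshold

def fit_ac50(conc, response_pct):
    """Stage 2: fit a Hill model to follow-up concentration-response data and
    return the concentration producing half-maximal activity (AC50)."""
    def hill(c, top, ac50, slope):
        return top / (1.0 + (ac50 / c) ** slope)
    popt, _ = curve_fit(hill, conc, response_pct, p0=[100.0, 1e-6, 1.0], maxfev=10000)
    return popt[1]

# Hypothetical screen of one chemical against one cell-free assay
single_conc_response = 45.0  # % inhibition at the screening concentration
if is_active(single_conc_response):
    conc = np.array([1e-8, 1e-7, 1e-6, 1e-5, 1e-4])
    resp = np.array([5.0, 22.0, 51.0, 78.0, 92.0])
    print(f"Active call; AC50 ~ {fit_ac50(conc, resp):.2e} M")
else:
    print("Inactive call; no follow-up concentration-response testing")
```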

Cell-free assays have been extended to studying wild, native species not conducive to lab-based experimentation, and the outcomes of some studies are briefly reviewed here. The inhibition potential of inorganic and methyl mercury (HgCl2 and MeHgCl) on muscarinic cholinergic (mACh) receptor binding was characterized in two brain regions (cerebral cortex and cerebellum) using tissue samples from both ecological (mink, river otter) and biomedical (humans, rats, mice) species, thus resulting in rich concentration-response data across organisms (Basu et al. 2005). The work showed that, across all species, inorganic mercury was a more potent inhibitor of muscarinic receptor binding than organic mercury, and that the cerebellum was more sensitive than the cerebral cortex. Species sensitivity could be ranked, from most to least sensitive, as: river otter > rat > mink > mouse > human. The mean IC50 values (concentration that inhibits receptor binding by 50%) of the most and least sensitive species differed by 5–8 fold. A follow-up study was performed on cortical tissues from ringed seals to show that mercurials, but not several organochlorines (e.g., PCBs, toxaphene, DDT, dieldrin), inhibited muscarinic cholinergic receptor binding (Basu et al. 2006). Another follow-up study documented that the M1 muscarinic receptor subtype was more sensitive to mercury-associated inhibition than the M2 subtype (Basu et al. 2008). Taken together, these studies demonstrate that cell-free assays are potentially useful in studying chemical-ligand interactions in native species that are otherwise difficult to study in the lab, such as marine mammals. The work demonstrates that cell-free assays may help resolve differences across species and chemicals.

Cell-free in vitro systems may also be useful in screening real-world samples, including complex mixtures. In a study concerning pulp and paper mill effluents, goldfish brains were homogenized and the cell-free preparations were exposed to primary and secondary effluent extracts (Basu et al. 2009). The results showed that the extracts contained neuroactive substances that could alter the specific binding to several receptors and the activity of enzymes involved in reproductive signalling. For instance, some extracts increased ligand binding to dopamine-2 (D2) and GABA(A) receptors, whereas others competed with the N-methyl-D-aspartic acid (NMDA) and muscarinic cholinergic (mACh) receptors and decreased their binding by 26–75%. Activities of monoamine oxidase (MAO) and acetylcholinesterase (AChE) were the most impacted, with enzyme inhibition reaching 50%. The authors concluded that these cell-free assays provide a novel in vitro tool to highlight plausible mechanisms by which pulp and paper mill effluents may impair fish reproduction by interacting with neurotransmitter systems. In addition, these in vitro data were used to model potential effects at the level of the whole organism (Chap. 16). A similar approach was taken with wastewater effluents from an Area of Concern (AOC) in the Great Lakes region of North America (Arini et al. 2016). In this case, two parallel approaches (in vivo and in vitro) were used to assess how exposure to wastewater treatment plant (WWTP) effluents or to extracts targeting different classes of chemicals (steroid hormones, nonylphenols, bisphenol A) could impact neurochemistry in fathead minnow (Pimephales promelas; FHM). The ability of the wastewater (in vivo) or extracts (in vitro) to interact with enzymes (monoamine oxidase (MAO) and glutamine synthetase (GS)) and receptors (dopamine (D2) and N-methyl-D-aspartate (NMDA)) involved in dopamine- and glutamate-dependent neurotransmission was examined in brain homogenates. In vivo exposure of FHM led to significant decreases in NMDA receptor binding in females and increases in MAO activity in males (2.8–3.2-fold). In vivo and in vitro results for FHM were consistent in some cases, but not in all. The main correlation was found for MAO activity, which increased after both in vivo and in vitro exposure to the steroid hormone-targeted extracts from the WWTP.

3.4 Concluding Remarks

Cell-free assays provide a simple in vitro tool to characterize the interaction between test chemicals and biochemical targets, and ultimately these tools can be used to prioritize, screen and evaluate a large number of chemicals (i.e., thousands) in a relatively short period of time (days to weeks). Such has been shown via the U.S. EPA’s ToxCast program, in which cell-free assays are an important component. Studies more oriented towards ecological risk assessment are beginning to show that cell-free assays can be used to study a range of fish and wildlife, and also screen single chemicals and complex mixtures of environmental samples.

There are several potential advantages of cell-free assays. Cell-free assays can be developed using cell components from potentially any vertebrate, and thus are species agnostic and may be of interest for organisms that are at-risk or difficult to maintain in captivity. The data from cell-free assays can be used to inform risk assessment and to provide additional evidence for read-across to toxicologically similar chemicals. This can ultimately result in large databases that strengthen decision-making and environmental management.

The assays are amenable to a high degree of automation and scalable to high-throughput screening. These types of assays can be run in a relatively rapid manner and at a fraction of the cost associated with animal bioassays. Certain cell-free assays can attain a high level of reproducibility, specificity, and sensitivity. When assays are strung together in a systems/pathway-based manner, the results may yield plentiful quantitative concentration-response data that can be used to develop predictive models. This information may help develop hypotheses (e.g., candidate toxicants, sensitive pathways) to be further tested via animal models, and may also enable inter-species differences to be uncovered.

Cell-free assays characterize simple interactions between a molecular target and a contaminant, and such an interaction may be considered a molecular initiating event, which represents the first step in an adverse outcome pathway (Landesmann et al. 2013; Ankley et al. 2010). For example, the toxic actions of domoic acid are mediated via its agonism of kainate receptors (Watanabe-Sailor et al. 2011), and so this first key molecular initiating event could be developed into a cell-free assay for the purposes of predictive ecotoxicology.

Despite the aforementioned advantages, as with any technology or method there exist limitations. Foremost among them is that the assays represent a simplistic biological system. They lack the requisite cellular machinery found in traditional in vitro methods such as cell lines and cell cultures, yet one may argue that they represent more meaningful models than can be achieved in silico. They also lack the metabolic capacity of cells, though future endeavours could aim to increase their realism via co-incubations with biological cofactors (e.g., S9 fractions). Moving forward, validation studies that enable comparisons between data from cell-free assays and physiological responses of the whole organism are required to establish these in vitro testing tools as reliable models.