1 Introduction

Water that is free from contamination is essential to human health and is defined as a basic human right by the World Health Organization (WHO 2017). However, the provision of safe drinking water continues to be a major public health concern throughout the world. Approximately 159 million people still collect drinking water directly from untreated surface water sources that may be contaminated by sewage, pathogens, and industrial effluent (WHO and UNICEF 2017). Diarrheal diseases associated with poor water sanitation remain a leading cause of death in the developing world (WHO 2009). Fortunately, the majority of the world does not face these concerns because of drinking water disinfection, an essential practice to eliminate pathogenic microorganisms and prevent waterborne disease. The use of chlorine to disinfect municipal drinking waters has largely eliminated the threat of cholera, typhoid, and other waterborne disease outbreaks (Centers for Disease Control and Prevention 2019).

Disinfection practices eliminate the threat of waterborne pathogens, but constant monitoring and diligence are necessary to ensure that water treatment is effectively eliminating this risk. While the risk of waterborne pathogens is typically associated with developing countries, one of the best-documented cases illustrating the causes and consequences of contaminated drinking water occurred in Walkerton, Ontario, Canada in 2000. By the end of the outbreak, the rural community northwest of Toronto would bury seven of its members, while thousands suffered from gastrointestinal illness. Researchers from the water disinfection community were shocked: How could an incident like this occur in a wealthy, technologically advanced nation?

The issue arose from the three wells used by the community to obtain its drinking water, which were regularly treated with chlorine disinfection. At the beginning of May 2000, Walkerton experienced extreme rainfall that led to flooding around the wells, contaminating the town's primary source of drinking water. With proper chlorine disinfection and residual chlorine in the distribution system, this contamination would likely not have caused any adverse health effects to the people of Walkerton. However, improper measurement of residual chlorine levels and an uninstalled chlorine dosing meter on one of the wells prevented a corrective response, leading to significant microbial contamination throughout the entire distribution system of the town. On May 17, 2000, microbiological test results of water samples showed the presence of total coliforms and Escherichia coli in the water of the town's distribution system. The test results were ignored, and the contamination was not reported. Within days, the first person in the community had died, and by the end of the outbreak, several more fatalities were recorded. An estimated 2300 people suffered from gastrointestinal illness, in many cases severe. The subsequent inquiry identified many contributing factors that led to the outbreak, including inadequate chlorination, organizational negligence, and negligence by operators and managers. This event showed that a community like Walkerton, which is in a developed country and has a well-established treatment facility for drinking water, is not immune to outbreaks when the water treatment process is not properly controlled, managed, or routinely monitored (Hrudey et al. 2014). The lesson learned is that drinking water safety is not a developing world issue but a global challenge.

By reducing the acute risk posed by microbial pathogens, the chlorination of drinking water remains one of the greatest public health achievements. However, the Walkerton case study shows that the threat of waterborne disease is constant, even in developed nations, and water treatment processes need to be constantly monitored and evaluated to ensure global drinking water quality. In addition to the acute risk of pathogens, water treatment must also deal with disinfection by-products (DBPs), which present a chronic health risk. To ensure safe drinking water, a delicate balance must be struck between microbial (pathogen) and chemical (DBP) risks, as illustrated in Fig. 1. This chapter will discuss some of the challenges affecting water treatment using case studies from around the world. Special attention will be paid to the greatest chemical concern of water treatment, DBPs, and their influencing factors, including source water composition, water treatment processes, and water distribution systems.

Fig. 1 Trade-off between acute microbial risk and chronic chemical risk. Used with permission; further usage requires ACS permission. https://pubs.acs.org/doi/10.1021/acs.est.7b05440 (Li and Mitch 2018)

2 A Historical Perspective: The Discovery and Regulation of Disinfection By-Products

Although the goal of disinfectant use is to kill harmful pathogens, one of the inevitable consequences is the production of DBPs. These compounds are formed as by-products of reactions between disinfectants and natural organic matter (NOM) present in the source water. DBPs were first identified in 1974 by analytical chemists, who found trihalomethanes (THMs) in finished drinking water (Rook 1974; Bellar and Lichtenberg 1974), a discovery aided by improvements in analytical instrumentation and sample preparation methods (Kristiana et al. 2012). Up until this time, it was assumed that drinking water treatment was sufficient to provide consistently safe water, as disinfection had led to the near eradication of major waterborne disease epidemics. In this context, Rook’s discovery of THMs was truly revolutionary and showed how little was known concerning the chemistry of drinking water treatment.

The discovery of THMs was independently verified by U.S. Environmental Protection Agency (EPA) scientists, who incorrectly believed that the precursor responsible for their formation was ethanol, a compound with likely minimal occurrence in natural waters. The widespread use of chloroform (a type of THM) in consumer products at the time further complicated matters, and ultimately the landmark findings were ignored by the agency. However, growing evidence suggested that NOM was the primary precursor for THMs and that THMs were widespread in chlorinated drinking water (Symons 1975). A rodent cancer bioassay study from the National Cancer Institute also found that chloroform exposure induced tumors in mice and rats (National Cancer Institute 1976). As was typical of practices at the time, the bioassay was designed to reveal any carcinogenic effects and used very high treatment doses (Hrudey 2009). The results therefore provided strong evidence that chloroform induced tumors in rodents and raised public fears over its continued use. Soon after, health concerns associated with chloroform and its parent class, THMs, resulted in the ban of chloroform in cosmetics and the adoption of the first drinking water guidelines addressing DBPs.

In 1978, Canada set a maximum contaminant level guideline of 350 µg/L for total THM4 (chloroform, bromodichloromethane, dibromochloromethane, and bromoform), becoming the first country to set a THM guideline (Hrudey 2009). The United States (U.S.) followed suit in 1979 with the Total Trihalomethane Rule that limited THM4 in U.S. drinking water to <100 µg/L (Federal Register 1998). This regulation significantly reduced the fraction of U.S. utilities with THM4 >100 µg/L from about 30% to 3% by 1988 (McGuire 1988). Since then, guidelines and maximum allowable concentration regulations for many DBPs have been regularly updated by several organizations. The current DBP regulations adopted by the U.S., Canada, the European Union (EU), China, and the WHO are found in Table 1.

Table 1 Current maximum contaminant levels (mg/L) for DBPs
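As a simple illustration of how such limits are applied in practice, the sketch below sums hypothetical measured concentrations of the four THM4 species and compares the total against the historical 100 µg/L limit discussed above. The concentrations and the check_compliance helper are illustrative assumptions, not values from Table 1 or from any cited regulatory method.

```python
# Illustrative only: hypothetical THM4 concentrations (ug/L) for one sample.
THM4_LIMIT_UG_L = 100.0  # the 1979 U.S. Total Trihalomethane Rule limit cited above

sample = {
    "chloroform": 45.2,
    "bromodichloromethane": 12.8,
    "dibromochloromethane": 6.4,
    "bromoform": 1.1,
}

def check_compliance(concentrations_ug_l, limit_ug_l=THM4_LIMIT_UG_L):
    """Sum the four THM species and compare the total against the THM4 limit."""
    total = sum(concentrations_ug_l.values())
    return total, total <= limit_ug_l

total_thm4, compliant = check_compliance(sample)
print(f"Total THM4: {total_thm4:.1f} ug/L; compliant: {compliant}")
```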

2.1 Epidemiology and Unknown DBPs

After the initial finding on the carcinogenicity of chloroform, several epidemiological studies were initiated to determine whether consumption of disinfected drinking water increases excess cancer risk due to the now known presence of chemical by-products. Although insufficient evidence has been found to determine the carcinogenicity of individual DBPs, a consistent association between urinary bladder cancer and consumption of chlorinated water has been observed. A summary of these epidemiological studies, and more information on the risk assessment of DBPs, can be found in publications prepared by Dr. Steve Hrudey (Hrudey 2009; Hrudey et al. 2015). Although this association is consistently observed, its strength is typically weak. However, even a small increase in risk, coupled with the vast exposure to chlorinated drinking water, could become a significant public health issue. In addition to the bladder cancer risk associated with treated drinking water, epidemiological studies have observed associations between consumption of chemically treated water and adverse reproductive outcomes (Hrudey 2009). One of the initial concerns associated with disinfected water was spontaneous abortion, but this association has since been dispelled (Savitz et al. 2006). While several reproductive health outcomes have been found to have no, low, or inconsistent association with disinfected drinking water, positive associations were found for impaired fetal growth, as measured by low birth weight and small body length or head circumference (Tardiff et al. 2006).

Since the mid-1970s, over 600 DBPs have been identified in treated drinking water (Richardson 2011), spurred by advancements in analytical instrumentation, as well as changes to water treatment inputs and water treatment technology. Gas chromatography (GC) has been in frequent use for the identification and quantification of volatile and semi-volatile DBPs since the identification of THMs. Although electron impact is the most common ionization source for GC, the use of softer ionization techniques such as chemical ionization may allow for the discovery of novel DBPs. For detection, electron capture detection is standard in many U.S. EPA methods for DBPs. However, mass spectrometry (MS) detection is often preferred due to its increased selectivity, allowing for both targeted and non-targeted analysis. Additionally, the use of triple quadrupole-MS with multiple-reaction monitoring (MRM) provides increased sensitivity and selectivity for targeted quantification. For example, this technique was used for the quantification of 14 nitrosamines at nanogram per liter levels in tap waters (Qian et al. 2015). Ionization sources for MS detection are varied and are selected on the basis of the analytes of interest. Electrospray ionization (ESI) is a commonly applied ionization source; however, the less common atmospheric pressure chemical ionization (APCI) is useful for analysis of less polar DBPs. Because DBPs occur within the complex mixture of finished water, advances in separation techniques have greatly improved the ability to identify novel DBPs within a complex matrix. The application of high-performance liquid chromatography (HPLC) has increased the characterization of the unknown fraction of total organic halide (TOX), as the technique is suitable for analysis of polar, high-molecular-weight, and thermally labile DBPs. Currently, the most promising technique for the identification of unknown DBPs is high-resolution mass spectrometry (HRMS), such as quadrupole time-of-flight mass spectrometry (QTOF-MS), due to its ability to determine the mass of analytes with very high accuracy. For more information on these analytical methods, a recent article by Yang and colleagues provides a comprehensive review of current DBP analysis and instrument trends (Yang et al. 2019).

While improvements in analytical chemistry have allowed for the identification of many new DBPs and DBP classes, the subset that has been quantified constitutes only about 30% of the TOX in chlorinated waters, highlighting the importance of continuing to identify new DBPs of toxicological importance (Krasner et al. 2006). A significant focus is still placed on the two most abundant classes of DBPs, THMs and haloacetic acids (HAAs), largely because regulations focus on these compounds. However, mounting evidence suggests that these two classes of DBPs are not sufficiently potent to account for the adverse human health effects associated with DBPs (Plewa et al. 2017; Bull et al. 2011). Understanding the chemistry of DBP formation, and how it may change depending on the source water input, disinfection practice, and other water treatment system choices, is the key to minimizing exposure to DBPs in finished drinking water. With this information, water utilities can optimize water treatment to improve drinking water quality from source to tap.

3 DBP Challenges: Changing Source Water

3.1 Case Study: Cape Town, South Africa

After years of concern and warnings about water scarcity, the worst fears of many South Africans came true when severe droughts gripped the nation in 2015 and 2016, decimating its water supply (Ziervogel 2019). To counteract water loss, restrictions on personal water usage were tightened in 2017 to a meager 87 L per person per day. Even with these additional measures in place, Cape Town was expected to reach “Day Zero” on April 12, 2018, the day on which all taps would be turned off. As a result, more severe water restrictions were announced that limited personal water usage to 50 L per person per day, and large-scale irrigation in agricultural areas was significantly reduced (Ziervogel 2019). Fortunately, the restriction measures, in combination with regional source water supplementation, use of small-scale desalination plants, and increased rains, prevented the arrival of “Day Zero” on April 12, pushing its projected arrival further into the future. In preparation, Cape Town is developing a revised water strategy, adopting new approaches to water management and governance that will increase the city's preparedness for the impacts of climate change and climate variability. With hotter, drier climates becoming more commonplace throughout many areas of the globe, some populations in sensitive areas, as witnessed in Cape Town, may be forced into a state of crisis management at the expense of long-term development planning.

3.2 Source Water Composition

Globally, the quantity and quality of freshwater sources are in decline (Mekonnen and Hoekstra 2016), leading water utilities to switch or supplement their source water with alternative water sources such as desalinated brackish water or treated wastewater. Additionally, human impact on water sources has led to the presence of emerging contaminants that have been difficult for water utilities to mitigate, as many of these contaminants are toxic or may transform during water treatment. Effluent organic matter (EfOM), algal organic matter (AOM), NOM, and anthropogenic organic material all contribute to the formation of DBPs, with the most productive precursors being the hydrophobic, acidic, and aromatic fractions of NOM. Hence, changing source water composition presents an important challenge for water treatment (Fig. 2), as the precursor pool for DBP formation can be significantly altered.

Fig. 2 Complexity of source water composition for water treatment inputs

3.2.1 Alternative Source Water

In many areas, the demand for freshwater surpasses its availability due to rapid population growth and the increased occurrence of drought. An estimated four billion people, or 66% of the world population, live in areas with severe water scarcity for at least one month of the year (Mekonnen and Hoekstra 2016). These areas span the globe, from Asia to the Americas, making water scarcity a truly global issue. It is expected that meeting the demand for freshwater, both for drinking and for water-intensive products, will be one of the most difficult challenges facing humanity this century (Mekonnen and Hoekstra 2016). Hence, the utilization of alternative water sources has become a logical approach to supplement current water supplies (Gude et al. 2010).

3.2.1.1 Seawater

While still costly and energy-intensive, saltwater desalination technologies have undergone dramatic improvements since their initial development some 50 years ago, and their use has risen exponentially over that time. From 2007 to 2012, worldwide total installed desalination capacity rose from 47.6 to 74.8 million m3/d, a trend that is expected to continue (Bennett 2013). Utilities in California (U.S.), Australia, Singapore, China, and Saudi Arabia have all implemented desalination as part of their plans to overcome water scarcity (Gude 2016). However, the increasing demand for desalination technology has been accompanied by growing concern regarding the environmental impact of desalination plants, as they require large quantities of energy to operate. This energy is often obtained from fossil fuels, leading to increased greenhouse gas production (Gude 2016). Additionally, the waste produced by desalination may pose an environmental hazard due to increased salinity, temperature, and contaminants such as chlorine, copper, and anti-scalants (Roberts et al. 2010). The salinization of water resources is not limited to the use of desalination plants, as natural processes (e.g., seawater intrusion into aquifers) and anthropogenic forcing (e.g., agricultural runoff and wastewater contamination) can also affect salinity. Thus, salinization is a global issue that water treatment processes must adjust to, regardless of the origin of their source water (Vengosh et al. 2014).

Like surface water sources, seawater sources also require chemical disinfection, but treatment processes require adaptation to prevent undesirable effects during water treatment due to compositional differences. Because seawater contains less total organic carbon (TOC) than typical freshwater sources, DBP formation is expected to be lower (Kim et al. 2015). However, seawater typically contains elevated levels of both bromide and iodide, 50,000–80,000 and 21–60 µg/L, respectively (Kim et al. 2015). Total organic bromide (TOBr) and total organic iodide (TOI) have both been found to correlate with increased cytotoxicity and genotoxicity of disinfected water (Kim et al. 2015). In vitro studies have shown that halogenated DBPs follow a general toxicity order of I- > Br- > Cl- (Plewa et al. 2008, 2014; Li et al. 2016); that is, iodinated DBPs (I-DBPs) are more toxic than brominated DBPs (Br-DBPs), which in turn are much more toxic than chlorinated DBPs (Richardson et al. 2003, 2008; Hua et al. 2006; Chen and Westerhoff 2010; Ged and Boyer 2014; Ged et al. 2015). Elevated bromide concentrations can increase the formation of many classes of Br-DBPs, such as Br-THMs and Br-HAAs, as HOBr is a more efficient halogenating agent than HOCl (Ged and Boyer 2014; Westerhoff et al. 2004; Parker et al. 2014). Thus, caution is needed when using source water with elevated bromide and iodide, such as seawater, as the formation of more toxic Br- and I-DBPs is a potential outcome. With the increasing use of seawater desalination and the ongoing salinization of freshwater resources, further research is required to address how to control the formation of these Br- and I-DBPs in finished drinking water.

3.2.1.2 Potable Water Reuse

Potable water reuse is another solution to concerns over both water scarcity and water quality deterioration. There are two main types of potable water reuse: (1) indirect reuse, in which an environmental buffer such as a reservoir or aquifer holds the water before treatment, and (2) direct reuse, in which treated wastewater is introduced directly into the drinking water supply. While the majority of potable reuse plants are indirect, a growing number of direct potable reuse facilities can be found (Richardson and Kimura 2017). Major water reuse facilities utilizing reclaimed water are in operation in 43 countries around the world (National Research Council 2012). China is also turning to water reuse, as rapid development and historical misuse of water resources have left the country facing severe water stress and water contamination issues in many regions. Increasing demand from industry and urban populations has also placed increased pressure on the current water supply (Zhao et al. 2017). In a country with increasing amounts of wastewater and many areas lacking proper wastewater treatment (Sun et al. 2016), potable water reuse has been identified as a potential solution for these water management issues.

One successful example of water reuse is the island nation of Singapore (Lee et al. 2016). Owing to limited options for a freshwater supply, the country was heavily dependent on neighboring countries to meet its water demand. Starting in 2003, the country's water utilities turned to water reuse to produce reclaimed drinking water, which they called NEWater. Having utilized water reuse to produce non-potable water for industrial uses since 1966, the country now produces reclaimed water that conforms to drinking water guidelines, meeting up to 30% of its total water demand. NEWater effectively closes Singapore's water loop, increasing resiliency and freeing up limited land area by reducing the need for significant water storage.

Like seawater, treated wastewater used as source water for drinking water treatment has implications for DBP formation, as wastewater contains a set of precursors that is fundamentally different from NOM. The characteristic organic matter derived from wastewater-impacted water is commonly defined as EfOM. EfOM has been shown to be a precursor source for carbonaceous DBPs such as THMs or HAAs (Krasner et al. 2009), although its lower aromaticity (less reactive compounds) in comparison to NOM should result in decreased THM production (Li and Mitch 2018). However, EfOM does contain more organic nitrogen (Westerhoff et al. 2002), which can promote the formation of nitrogen-containing DBPs (N-DBPs). N-DBPs have been shown to be more cytotoxic and genotoxic than their carbonaceous analogs in in vitro mammalian cell assays (Plewa et al. 2008; Bond et al. 2011; Krasner 2009; Muellner et al. 2007). Thus, removal of DBP precursors from potable reuse water is a unique challenge for water utilities. While the removal of THM precursors is similar among treatment types, nitrification processes are required in potable water reuse to remove precursors for N-DBPs, including haloacetonitriles (HANs), N-nitrosodimethylamine (NDMA), and trihaloacetaldehydes (Krasner et al. 2009; Shah and Mitch 2012). Many potable water reuse plants also use reverse osmosis (RO), which is effective for removing many compounds, especially charged compounds and those larger than about 200 Da.

3.2.2 Human Impacts on Source Water

The use of alternative water sources such as saltwater or wastewater illustrates the challenges associated with intentional changes in source water. However, it is important to recognize that challenges can arise in source water that has already been successfully used for drinking water treatment when unintentional human impacts influence source water composition.

3.2.2.1 Nutrient Loading and Algal Blooms

Increased nutrient loading into source waters due to anthropogenic activities such as agricultural runoff and wastewater discharges has led to an increased occurrence of algal blooms, which in many cases have become annual events, such as at Lake Erie in North America (Michalak et al. 2013). Lake Taihu in China also experiences annual phytoplankton blooms due to excessive nutrient loading caused by human inputs (Chen et al. 2003; Duan et al. 2009). During one particularly massive bloom of cyanobacteria in 2007, attempts to clear the bloom unintentionally funneled it directly into the drinking water treatment plant of the city of Wuxi. This caused a public health emergency, leaving nearly two million people without clean drinking water (Qin et al. 2010; Guo 2007). Thus, algal bloom formation is a growing concern due to its association with mortality across a range of biota, economic impacts through ecological and human health costs, and the need for additional water treatment measures (Hoagland et al. 2002; Hoeger et al. 2005; Landsberg 2002). The human health concerns regarding algal blooms are typically due to the presence of microcystins produced by some species of Microcystis (Fu et al. 2015). In Lake Taihu and Lake Erie, Microcystis species have been shown to dominate algal blooms (Michalak et al. 2013; Deng et al. 2014), and significant concentrations of toxic microcystins, including MC-LR, have been found (Song et al. 2007; Sakai et al. 2013). The most successful strategy for preventing algal blooms in source waters is to limit the presence of excessive nutrients, such as phosphorus, through regulatory policies preventing eutrophication (Ibelings et al. 2016). However, in situations where this is not possible, or in response to an already present algal bloom, there are techniques that can be applied on site to control or mitigate bloom growth. These techniques include compartmentalization, removal of biomass through chemical or physical measures, and flushing or mixing (Stroom and Kardinaal 2016). Removal of blooms prior to arrival at the water intake is particularly important, since algal cells cause physical issues with settling, clogged filters, and membrane fouling (Fang et al. 2010a).

Algal-impacted source water provides an interesting example of the potential interactions between microorganisms and DBPs, as AOM present during water treatment has been shown to impact DBP formation. The formation of both THMs and HAAs has been reported from the chlorination of algal cells (Plummer and Edzwald 2001). Like wastewater-impacted water, AOM exhibits higher organic nitrogen content (Fang et al. 2010a). AOM also contains more hydrophilic and less aromatic carbon and greater structural diversity than NOM (Her et al. 2004). These differences in composition mean that chlorination of AOM produces more N-DBPs and haloaldehydes (HALs), and fewer carbonaceous DBPs, than chlorination of NOM (Fang et al. 2010a). In contrast, during chloramination, most DBPs were formed in smaller quantities from AOM than from NOM (Fang et al. 2010b). AOM adds an additional level of complexity in that the composition of algal proteins, carbohydrates, and lipids changes depending on the growth stage of the algae, meaning that DBP formation also depends on the algal growth stage (Fang et al. 2010a; Brown et al. 1993). As mentioned earlier, some species of algae also produce algal toxins, such as toxic variants of microcystins, which have also been shown to be precursors to DBPs. Chlorination of the algal toxin MC-LR was shown to produce many different classes of DBPs, including THMs, HALs, and HANs (Chu et al. 2017).

3.2.2.2 Emerging Contaminants

Owing to their low concentration in the environment, emerging contaminants (ECs) are often referred to as micropollutants or microconstituents. However, continuous improvement of analytical techniques has allowed for the detection of an increasing number of these contaminants in environmental waters (Richardson and Kimura 2017; Richardson and Ternes 2018). Disinfectants readily react with many ECs during treatment processes to form transformation products, classified as pollutant DBPs. Recent studies have found the formation of pollutant DBPs from pharmaceuticals (Negreira and Regueiro 2015; Carpinteiro et al. 2017), brominated flame retardants (Gao et al. 2016; Nika et al. 2017), surfactants (Gong et al. 2016), recreational drugs (Saleh et al. 2019; Mackie et al. 2017), and ultraviolet light (UV) filters (Trebse et al. 2016). In many cases, these transformation products have been shown to be more toxic or biologically active than the parent contaminant, emphasizing the importance of the removal of these precursors prior to treatment (Richardson and Ternes 2018).

ECs are a difficult issue for water treatment because of their large variation in chemical, biological, and physical properties that affect their ability to be removed during the treatment process. A number of studies have investigated the removal of micropollutants using oxidation strategies (Lee and von Gunten 2010). In general, ozonation can remove many pharmaceuticals (Ternes et al. 2003), whereas chlorine dioxide is not able to remove some of the most persistent pharmaceuticals such as ibuprofen (Huber et al. 2005). Advanced oxidation processes (AOPs) using combinations of UV, hydrogen peroxide (H2O2), and ozone have been shown to remove many micropollutants such as pharmaceuticals and pesticides, with a high removal efficiency (Kim et al. 2009). The use of membrane-based techniques such as nanofiltration and RO is highly effective in the removal of many micropollutants, including X-ray contrast media, pharmaceuticals, and per- and polyfluoroalkyl substances (PFAS) (Snyder et al. 2007; Radjenović et al. 2008; Drewes et al. 2005; Kimura et al. 2003; Tang et al. 2006). Typically, a multi-barrier approach instead of a single engineering process is necessary to remove ECs because of their diversity. When considering what water treatment processes should be used, the cost of construction and maintenance is also an important factor. Richardson and Kimura summarize the cost of many water treatments in reference to EC removal (Richardson and Kimura 2017).

4 DBP Challenges: Changing Water Treatment

4.1 Case Study: N-Nitrosamines

In response to the initial regulations on THMs and HAAs, water utilities sought new technologies that would allow them to meet DBP guidelines while eliminating microbial risk. Because THMs and HAAs primarily form through reactions between chlorine and humic substances, chloramination limits the production of these DBPs. As a result, many utilities switched to chloramine as the primary disinfectant. In addition to low THM and HAA formation, chloramine was also found to be useful for maintaining a disinfectant residual in the distribution system (Seidel et al. 2005). However, while chloramine reduced the risk of regulated DBPs, this switch also had unintended consequences for many water utilities attempting to balance microbial and chemical risks. For example, although chloramine is a potent antibacterial agent, its use has been associated with increased levels of mycobacteria (Rhoads et al. 2017). More importantly, the switch to chloramine also leads to a significant trade-off in the chemical risk of the DBPs produced.

In 1989, the N-nitrosamine N-nitrosodimethylamine (NDMA) was first identified as a DBP after being detected in treated drinking water in Ohsweken, Ontario, Canada (Taguchi et al. 1994). While NDMA was originally thought to have come from an anthropogenic source, further experiments confirmed that its presence was a direct result of the chloramine used in the water disinfection process, with formation requiring nitrogen derived from precursors in the source water, referred to as dissolved organic nitrogen (DON), or from the disinfectant itself (chloramine) (Bond et al. 2011). As such, chloramination has been shown to promote the formation of nitrosamines (Zhao et al. 2008; Schreiber and Mitch 2006).

Since the discovery of NDMA, a total of seven N-nitrosamines have been identified as DBPs and have been detected in several other locations, including California (U.S.), Alberta (Canada), Japan, China, and the U.K. The discovery of NDMA as a DBP was important, as the toxicity of N-nitrosamines had been well documented at the time due to their presence in foods, beverages, and consumer products (Scanlan and Issenberg 1975; Rostkowska et al. 1998). In vivo animal studies have shown that nitrosamines are potent carcinogens, inducing cancer in every major tissue in laboratory animals, and are suspected human carcinogens (International Agency for Research on Cancer 1978; Magee et al. 1967). The parent nitrosamines are not directly genotoxic; rather, they are bioactivated within the body to reactive intermediates that can methylate macromolecules such as DNA (Liteplo et al. 2002). N-DBPs, including N-nitrosamines such as NDMA, are widely considered to be more genotoxic and cytotoxic than currently regulated DBPs. By switching from chlorination to chloramination, water utilities inadvertently caused the formation of new, highly toxic DBPs in finished water.

The history of N-nitrosamines provides an excellent example of how changes in drinking water practice can have unintended effects on the formation of DBPs. In order to meet the limits on THMs and HAAs, such as those set by the Stage 1 and 2 Disinfectants and Disinfection By-products Rules, utilities are switching from sole reliance on chlorine or chloramine disinfection to combinations of primary disinfectants (ozone, UV, or chlorine) with chloramines as secondary disinfectants (Seidel et al. 2005; Dotson 2019). In order to deal with changes to source water, new physical treatment processes are also being developed and have seen widespread use.

4.2 Alternative Treatment Methods

Alternative treatment methods are being employed to reduce the chemical risks associated with the traditional chemical treatments of chlorination and chloramination, as well as with the traditional physical treatments of coagulation, sedimentation, and filtration. Since THMs and HAAs are associated with the use of chlorine disinfection and are the focus of many DBP regulations, water utilities are experimenting with alternative treatment methods, both chemical and physical.

4.2.1 Chemical Treatment Methods

While each disinfectant has benefits and drawbacks associated with its use (Table 2), all disinfection schemes form DBPs. Owing to the hazards associated with known DBP formation from traditionally used disinfectants, ozone and UV disinfection are becoming increasingly popular.

Table 2 Advantages and disadvantages of selected disinfection processes (Ireland Environmental Protection Agency 2011; WHO 2017; Washington State Department of Health 2019)

4.2.1.1 Ozone

A strong oxidant, ozone is capable of oxidizing most organic and inorganic chemicals and can be used as a primary disinfectant (von Gunten 2003a). It is produced on site at water treatment plants by passing oxygen or dry air through a high-voltage electric field (WHO 2017). Although ozone is a powerful disinfectant, care must be taken to monitor bromate in finished water, as ozone promotes bromate formation through oxidation of naturally occurring bromide (von Gunten 2003b). The formation of bromate depends on several factors: pH, concentration of bromide, concentration of ozone, and contact time (WHO 2017). Hence, bromate formation can be minimized by operating at lower pH (e.g., pH 6.5), using lower ozone doses with shorter contact times, and adding ammonia, which blocks bromate formation pathways by reacting with HOBr (von Gunten 2003; Pinkernell and von Gunten 2001). An important limitation of ozone is that it does not provide a residual disinfectant within the distribution system due to its short half-life (WHO 2017). Thus, a secondary disinfectant such as chlorine or chloramine must be added to maintain a disinfectant residual. Ozone use can also result in the formation of halonitromethanes (McCurry et al. 2016) and haloacetaldehydes (Shah et al. 2012) when coupled with chlorination or chloramination for secondary disinfection.

4.2.1.2 UV Disinfection

The use of UV disinfection as an alternative to chemical disinfection dates back to the early 1900s. UV light damages the nucleic acids of cells or viruses, preventing their multiplication (Hijnen et al. 2006). Thus, UV treatment is broadly effective against pathogens and is particularly effective against Cryptosporidium, a pathogen highly resistant to chlorination (Hijnen et al. 2006). Unlike free chlorine or chloramine, UV disinfection does not produce halogenated DBPs during primary disinfection (Dotson and Rodriguez 2012). Although some DBPs such as aldehydes and carboxylic acids have been identified as products of UV disinfection, the concentration and identity of these non-halogenated DBPs have been of little concern (Liu et al. 2002; Dotson and Rodriguez 2012). It is important to note that turbidity and dissolved substances can inhibit UV disinfection (WHO 2017; Dotson and Rodriguez 2012), and that, as with ozone, a secondary disinfectant must be added to maintain a disinfectant residual in the distribution system when UV is used as the primary disinfectant. Although UV disinfection is not associated with the formation of notable DBPs, the fragmented UV photolysis products of NOM can act as DBP precursors when photolysis is followed by chlorination or chloramination (Liu et al. 2006; Shah et al. 2011).

To avoid elevated formation of DBPs associated with individual disinfectants, the use of carefully optimized combinations of disinfectants may be able to provide water that is free from pathogens and has minimal chemical risk. For example, the use of a pre-oxidant, such as chlorine or ozone, with post-chloramination may be able to reduce the formation of NDMA (Shah and Mitch 2012). However, this still comes with a trade-off, as DBPs associated with the pre-oxidant will be produced. Because every water source is unique and regulations vary between regions, each water utility must consider their own situation when choosing which disinfectant(s) to use.

4.2.2 Physical Treatment Methods

Traditional physical treatment methods used during water treatment can also increase the formation of DBPs (Ding et al. 2019). Coagulants, biological filtration, and adsorbents can all introduce DBP precursors. Coagulants are used to promote aggregation or precipitate formation from the organic matter present in source water. However, many of these compounds and their monomers contain amide and amine groups that can react with disinfectants to form N-DBPs (Bolto and Gregory 2007; Krasner et al. 2013). Biofiltration has been shown to reduce the amount of DBP precursors (Liu et al. 2017), but more recent work has shown that this process can also increase the formation of N-DBPs under certain conditions due to the release of precursors from the biofilter, including biomass and cationic polymers (Chu et al. 2011, 2015). Activated carbon is commonly used to adsorb contaminants, including DBP precursors, but this process poorly removes DON and bromide ions, leading to increased formation of N-DBPs and Br-DBPs (Chiu et al. 2012; Symons et al. 1993a; Krasner et al. 2016; Zhang et al. 2017). Fullerene is another potential adsorbent for drinking water treatment that has been identified as a possible DBP precursor (Wang et al. 2012a; Alpatova et al. 2013). In response to the challenges affecting source water, such as water reuse and desalination, many utilities have opted for advanced physical treatments, including RO and granular activated carbon (GAC). Although costly, these methods have demonstrated improvements in water quality.

4.2.2.1 Reverse Osmosis

RO works by using pressure to force water through a semi-permeable membrane which rejects dissolved matter based on size, charge, and physico-chemical interactions (Radjenović et al. 2008; Bellona and Drewes 2005). The indirect potable water reuse system in the Orange County (California) Water District was one of the first systems to use RO. Since its inception in 1977, the advanced treatment plant has treated wastewater for the purpose of injecting it into aquifers to counteract seawater intrusion. Originally, the water flow was split between RO and GAC due to the high cost of RO. When the system was expanded in 2008, an integrated membrane system with RO and microfiltration was installed to treat the entire water flow (Marron et al. 2019). An AOP using UV and H2O2 was later installed when NDMA and 1,4-dioxane were found in the treated water. Today, this updated system is called the Groundwater Replenishment System and currently treats 379,000 m3/d (Marron et al. 2019). This project is one of the longest running advanced treatment plants and has significant community support, leading to its use as a model for future multiple treatment systems around the world (Harris-Lovett et al. 2015).

While effective, there are some concerns regarding the use of RO that must be addressed prior to its implementation by a water utility. Because of its efficient removal of small molecules and charged particles, RO water may lack nutritionally important ions. Ion-free water can also enhance the rate of unwanted mineral dissolution, such as the dissolution of iron oxide layers on the inside of pipes. However, the addition of essential ions back into finished water may overcome some of these issues (Sedlak 2019). Another issue associated with RO and other membrane-based techniques that can impact the chemical risk of treated water is the ability of small, non-charged, and hydrophilic contaminants such as DBPs to pass through the membranes (Linge et al. 2013). If produced by disinfection upstream of RO treatment through pre-oxidation, many neutral, low-molecular-weight DBPs, such as di-HANs, are poorly rejected by RO (Linge et al. 2013; Agus and Sedlak 2010). Although not as effective as RO, nanofiltration has been proposed as an alternative because it requires less pressure, and thus less energy. It also produces a smaller volume of brackish concentrate because some monovalent salts are able to pass through the membrane (Bellona et al. 2012). Ozonation combined with biological activated carbon is another alternative treatment, as it requires less energy and capital cost than RO (Marron et al. 2019).

4.2.2.2 Granular Activated Carbon

Another advanced physical technique for water treatment systems is GAC. GAC has been used for many years to reduce NOM precursors of THMs and HAAs prior to disinfection (Chiu et al. 2012; Krasner et al. 2016; Knappe 2006; Summers et al. 2010). While effective for controlling the formation of regulated DBPs, the use of GAC can increase the DON:dissolved organic carbon (DOC) and Br:DOC ratios, as the treatment preferentially removes DOC over DON and does not remove bromide (Chiu et al. 2012; Symons et al. 1993b; Summers et al. 1993). Therefore, the use of GAC may result in increased formation of N-DBPs and Br-DBPs. Nonetheless, while these compounds are typically more cytotoxic and genotoxic than THMs and HAAs, a study has shown that the overall cytotoxicity and genotoxicity of GAC-treated waters were reduced by 32–83% in comparison to water treated without GAC (Cuthbertson et al. 2019).
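To see why preferential DOC removal shifts these ratios, the short sketch below uses purely hypothetical influent concentrations and removal fractions (illustrative assumptions, not measured values from the cited studies) to show how DON:DOC and Br:DOC both rise across a GAC contactor even though the absolute DON and bromide concentrations are essentially unchanged.

```python
# Illustrative only: hypothetical concentrations before and after GAC treatment.
influent = {"DOC_mg_L": 4.0, "DON_mg_L": 0.30, "Br_ug_L": 100.0}

# Assumed removal fractions: GAC removes DOC well, DON poorly, bromide not at all.
removal = {"DOC_mg_L": 0.60, "DON_mg_L": 0.10, "Br_ug_L": 0.0}

effluent = {k: v * (1 - removal[k]) for k, v in influent.items()}

def ratios(water):
    """Return DON:DOC and Br:DOC ratios for a given water composition."""
    return {
        "DON:DOC": water["DON_mg_L"] / water["DOC_mg_L"],
        "Br:DOC (ug/mg)": water["Br_ug_L"] / water["DOC_mg_L"],
    }

print("Influent ratios:", ratios(influent))
print("Effluent ratios:", ratios(effluent))
# Both ratios increase after GAC, consistent with the precursor enrichment described above.
```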

The increasing use of AOPs is an important consideration for utilities that plan to use them in connection with advanced physical treatment methods such as RO or GAC. AOPs are often employed to reduce the concentration of NDMA, but many of the same compounds that are poorly removed by RO are also not removed to an appreciable extent by these treatments. For example, chloroform has low reactivity with hydroxyl radicals and does not undergo direct photolysis, meaning that removing it would require a UV dose far higher than is cost-effective (Marron et al. 2019). Also, since AOPs typically produce transformation products rather than complete mineralization, some products may be more toxic than the parent compounds. While these are important considerations that need to be studied, the toxicity-weighted concentration of DBPs is typically lower in recycled water produced using RO-UV/H2O2 treatment than in conventionally treated drinking water (Zeng et al. 2016; Szczuka et al. 2017, 2019). However, incorporation of GAC may be a more practical alternative for utilities. Studies have shown that DBP precursors formed during pre-oxidation within the treatment plant are effectively removed by GAC (Bond et al. 2012; Kimura et al. 2013), sometimes more easily than the original precursors (Jiang et al. 2017).
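A toxicity-weighted concentration of the kind referenced above is commonly computed in the DBP literature by scaling each measured concentration by an in vitro potency metric (for example, the inverse of a cytotoxicity LC50) and summing the results. The sketch below illustrates this calculation with hypothetical concentrations and potency values; they are placeholders, not data from the cited studies.

```python
# Illustrative only: hypothetical DBP concentrations (ug/L) and cytotoxicity
# LC50 values (ug/L); a lower LC50 means higher potency. All values are placeholders.
measurements = {
    "chloroform":           {"conc_ug_L": 30.0, "lc50_ug_L": 1_000_000.0},
    "dichloroacetonitrile": {"conc_ug_L": 2.0,  "lc50_ug_L": 6_000.0},
    "NDMA":                 {"conc_ug_L": 0.01, "lc50_ug_L": 100.0},
}

def toxicity_weighted_sum(dbps):
    """Sum concentration/LC50 over all DBPs: a unitless, potency-weighted total."""
    return sum(d["conc_ug_L"] / d["lc50_ug_L"] for d in dbps.values())

print(f"Toxicity-weighted concentration: {toxicity_weighted_sum(measurements):.5f}")
# Note how a trace-level but potent DBP can contribute more to the weighted total
# than an abundant, weakly cytotoxic one.
```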

5 DBP Challenges: Changing Distribution Systems

5.1 Case Study: The Netherlands

In the Netherlands, water used to produce drinking water is carefully selected (Smeets et al. 2009). The country has vast sandy aquifers that provide microbiologically safe ground water. Following World War II, natural replenishment of some of these aquifers could not support urbanization, and therefore some areas of the country were forced to supplement their supply with surface water sources. This surface water was initially treated with standard physical processes (coagulation, sedimentation, and filtration), along with chlorine disinfection. However, Rook's discovery of THMs led to the immediate abandonment of chlorine disinfection wherever possible.

Improved physical treatment and the optimization of chlorination conditions reduced the amount of chlorine required. Eventually, other treatment methods, including ozonation with GAC or UV, were optimized to replace chlorination. However, a chlorine residual was still applied in certain situations to prevent bacterial growth in the country's water distribution system. With the ultimate goal of achieving a chemical-free treatment and distribution system, water utilities in the Netherlands shifted their focus from water treatment to the production of biostable water. A disinfectant residual had been used to suppress regrowth of bacteria; however, it was predicted that if treated water was clean enough, utilities could effectively “starve” any bacteria present in the distribution system, preventing regrowth. The goal of chemical-free water distribution was reached in 2008, after the residual disinfectant dose had been gradually reduced until no chlorine needed to be applied at all.

This unique approach to water distribution in the Netherlands relies on several key aspects. First and foremost, the chosen source water must be of high quality, and strict protection of source water is necessary to achieve this. In order of preference, the source water should be (1) microbiologically safe ground water, (2) surface water with a soil passage, or (3) surface water treated using multiple barriers. Next, physical treatment processes are preferred, with oxidation by ozone or peroxide used only if it cannot be avoided. The distribution system should not allow the entry of contamination and should be routinely monitored for failures. Finally, the water should be biologically stable, and only biostable materials should be used, to prevent microbial growth within the distribution system. Smeets and colleagues provide an in-depth explanation of the production and legal requirements of the water treatment process in the Netherlands (Smeets et al. 2009). While this approach significantly reduces the risks associated with DBPs, it also requires significant capital to build and maintain the physical treatment processes and distribution system. Nevertheless, the water distribution system used in the Netherlands is an excellent example of the careful optimization of conditions, both before and after treatment, to ensure safe, chemical-free drinking water. While chemical-free treatment and distribution may be a lofty goal for most utilities, overcoming challenges associated with residual disinfectant and aging infrastructure is a shared global challenge.

5.2 Chemical, Biological, and Physical Challenges in the Distribution System

The water distribution system is a complex network of pipes connecting the water treatment plant and the tap in your home. This network of pipes can act as an entry point for opportunistic pathogens into treated water. For this reason, it is important to monitor and optimize not only the treatment of water within the water treatment plant but also its quality throughout the distribution system to ensure the finished water meets required guidelines and standards. Thus, the use of a residual disinfectant is important to maintain water quality from treatment to tap. Although this ensures the microbial safety of finished water, it also makes monitoring the chemical safety of the water during distribution far more challenging. DBP concentrations in water distribution systems exhibit both temporal and spatial variations, often associated with changes in water quality, residual disinfectant, or the physical structure of the distribution system (Fig. 3) (Wang et al. 2014; Zhao et al. 2006). This variation means that measurements taken at the water treatment plant may not be representative of the water that is being used and consumed by end users, depending on their location in the distribution system. Nitrosamine concentrations were found to increase with distance from the water treatment plant, while halobenzoquinone (HBQ) DBPs were transformed to hydroxyl-HBQs throughout the distribution system (Wang et al. 2014; Zhao et al. 2006). THMs were also found to be highest at sampling sites farthest from the treatment plant, whereas HAAs showed the opposite trend and were lowest at those sites. Thus, users served by the same water treatment plant may be exposed to different DBPs depending on their distance from the plant, complicating the issue of ensuring drinking water with minimal chemical risk.

Fig. 3 Challenges in the distribution system
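One common, simplified way to reason about why water quality differs with distance from the plant is to treat the free chlorine residual as decaying first-order with water age. The sketch below uses an assumed, illustrative initial dose and decay constant (not site-specific values, and not a model taken from the cited studies) to estimate the residual remaining at different residence times.

```python
import math

# Illustrative only: assumed initial free chlorine dose and first-order decay
# constant; real systems require site-specific bulk and pipe-wall decay data.
C0_MG_L = 1.0      # residual leaving the plant (mg/L as Cl2)
K_PER_HOUR = 0.05  # assumed first-order decay constant (1/h)

def residual(hours, c0=C0_MG_L, k=K_PER_HOUR):
    """First-order decay: C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k * hours)

for t in (0, 12, 24, 48, 72):
    print(f"water age {t:3d} h -> residual {residual(t):.2f} mg/L")
# Longer residence times leave less residual, which is one reason utilities
# monitor, and sometimes boost, disinfectant within the distribution system.
```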

Another important consideration for the distribution system is the growth of biofilm, particularly in aging infrastructure. Biofilms are a problem not only in drinking water distribution systems but also in purified water supply systems used in laboratory and medical facilities. The term biofilm refers to the accumulation of microorganisms on a surface and consists of many aggregated microbial cells within a matrix of biomolecules such as nucleic acids, polysaccharides, or lipids, known collectively as extracellular polymeric substances (EPS). As expected, biofilms consume the residual disinfectant in the distribution system. To maintain the biostability of the finished water, utilities typically increase the disinfectant dose. However, increasing the residual disinfectant dose also increases the formation of DBPs in the distribution system. Much of the biomass of biofilms is made up of EPS, which has a composition similar to that of AOM and EfOM DBP precursors (Bond et al. 2009; Wei et al. 2011; Sutherland 1985). High reactivity between chlorine and biofilm has been reported; however, non-specific reactions limit chlorine's penetration into the biofilm (Chen and Stewart 1996). DBP formation from biofilm is still not well understood but is an important consideration (Wei et al. 2011; Wang et al. 2012b, 2013).

Corrosion and the leaching of contaminants from drinking water distribution systems are affected by several factors, including the age of the distribution system, the type of materials used, and the quality and standing time of the water (WHO 2017; Liu et al. 2017). The materials used for distribution systems fall into two families: metallic and non-metallic. Corrosion of metallic pipes, such as galvanized steel, iron, or copper, can increase heavy metal (e.g., lead, cadmium, copper) concentrations in drinking water (WHO 2017; Liu et al. 2017). Among these heavy metals, lead is of particular concern because of its toxicity (Kim and Herrera 2010). Though lead water pipes have been banned in new construction, water utilities with aging infrastructure continue to rely on them, often because of the costs associated with their replacement (Rabin 2008). Lead can leach from the pipe itself, as well as from lead solders and galvanized iron plumbing (Liu et al. 2017). To reduce lead corrosion, the pH within the distribution system can be increased to 8.0–8.5 to directly prevent leaching, or treatment with orthophosphate or other phosphates can inhibit lead release through the formation of insoluble lead phosphate compounds, which can also form a protective coating on the pipe to prevent further leaching (WHO 2017; Trueman et al. 2018). Non-metallic pipes are usually made of polyvinylchloride (PVC), chlorinated polyvinylchloride (CPVC), or polyethylene (PE). Unlike metallic pipes, non-metallic pipes are corrosion-resistant (Liu et al. 2017). However, PVC water pipes may still leach vinyl chloride and dialkyltins, which are used as stabilizers in the plastic (WHO 2017).

6 The Complexity of Ensuring Safe Drinking Water

Consistent production of safe drinking water is necessary for public health, and as seen from the previous sections of this chapter, it is not a simple task. Balancing chemical and microbial risks to ensure safe drinking water requires extensive research, optimization, and maintenance. The effects of failing to do so are highlighted in our final case study of Flint, Michigan. The Flint water crisis has received significant media attention that continues as the events are analyzed. This series of events has had a profound effect on the well-being of Flint citizens and highlights the complexity that water utilities face.

6.1 Case Study: Flint, Michigan, USA

In the 1960s, the city of Flint, Michigan switched to the Detroit Water and Sewerage Department as the city's main provider of treated water (Masten and Davies 2017). The switch was made in part to ensure sufficient quantities of drinking water, as the source water for Flint's Water Service Center was supplied by the Flint River, which was found to be difficult to treat. The Detroit water utility, on the other hand, collected water from the more stable and easily treatable source of Lake Huron. In the ensuing years, Flint's Water Service Center was maintained as a backup producer of drinking water, but its use was limited to only a few times each year. In 2013, as a cost-saving measure, the city of Flint decided to join the newly formed Karegnondi Water Authority, which was constructing its own pipeline from Lake Huron. While construction was being completed, city officials decided to restore full-time operation of the city's Water Service Center, again utilizing source water from the Flint River. Under a tight timeline to complete the transition, inadequate preparation was put into the analysis of the variables and risks associated with such significant changes in infrastructure and source water. As a result, several physical, chemical, and biological water quality issues were soon discovered. These included increased THM concentrations, corrosion of iron pipes leading to breaks, red water, lead leaching from pipes, and elevated bacteria levels (Del Toral 2015; Croft et al. 2015). The corrosion issues were due to the city's decision not to use a corrosion inhibitor, as well as the use of ferric chloride as a coagulant, which exacerbated the corrosivity of the water toward the lead pipes (Del Toral 2015; Croft et al. 2015; Pieper et al. 2017). In the end, a public health emergency was declared, as the proportion of children with elevated blood lead levels increased by 6.6 percentage points in some areas of the city (Hanna-Attisha et al. 2016). These events and the intense negative publicity surrounding the issue forced Flint officials to switch back to the Detroit Water and Sewerage Department in October 2015. However, lead levels remained elevated, and extra corrosion inhibitor was added in December 2015 (Allen et al. 2017). Even with these changes, many residents were reluctant to use their tap water. Skin rashes were purported to result from elevated DBP exposure, although this was ultimately shown not to be the case, as the water was found to have DBP levels not significantly different from those of surrounding cities (Allen et al. 2017).

Although lead exposure was the driving force behind the intense media scrutiny of the city of Flint, chemical contamination was not the only threat to public health. In the aftermath of the crisis, information was released on several cases of Legionnaires' disease caused by Legionella bacteria. A total of 91 cases were diagnosed and 12 deaths were reported during the period when source water was obtained from the Flint River (Michigan Department of Health and Human Services 2015a, b; Zahran et al. 2018). While the origin of this outbreak has not been identified, one of the main suspected drivers of this microbial growth was significant fluctuation in levels of free chlorine residual (Rosen et al. 2017). Residual chlorine is necessary to maintain the safety of the water throughout the distribution system. Many factors are associated with the loss of free chlorine, and it is unknown which of these led to the growth of Legionella. In addition, the presence of iron and high concentrations of assimilable organic carbon are also suspected to have contributed to the growth and propagation of Legionella in the distribution system by lowering residual chlorine levels.

The Flint case study highlights the importance of understanding the variables within a water system. The source water from the Flint River was considered difficult to treat due to its high bacterial and organic carbon content, as well as significant seasonal variation in these chemical and biological parameters. Few studies, and no pilot study, were performed on Flint River water before the switch to confirm that the utility could provide safe drinking water under the proposed changes. Although the Flint River is no longer used to provide drinking water, nearly irreparable damage has been done to the community and to the water system, shattering public trust in water management. Currently, all lead pipes in the Flint distribution system are being replaced at great cost. However, as observed in Madison, Wisconsin, simply replacing the pipes will not eliminate the lead problem immediately, leaving a legacy of poor water quality (Cantor 2006).

7 Perspectives

As water is essential to human life, every effort should be made to make drinking water as safe as possible. The primary goal of water treatment is to eliminate the microbial risk that poses a direct threat to human health. Nonetheless, the reduction of microbial risk involves a trade-off: DBPs pose a chemical risk that forces water utilities to take measures to limit their formation. The complexity and variety of DBPs and their precursors mean that further compromises arise when reducing the formation of certain DBPs. Regulations focus on THMs and HAAs, as these compounds are typically used as a representative measure of the total concentration of chlorinated DBPs. However, it remains unknown which DBPs are the toxicity drivers responsible for the human health effects associated with consumption of treated water (Federal Register 1998; Li and Mitch 2018). The assumption that THMs and HAAs correlate with the toxicity drivers in disinfected water seems reasonable; yet the focus on THMs and HAAs may overlook the true toxicity drivers and push water treatment optimization toward more toxic compounds, as observed in the case study of N-nitrosamines in Sect. 4.1. Therefore, collaborative efforts between DBP researchers in diverse fields such as chemistry and toxicology have been established to determine which DBPs are the toxicity drivers. This work has developed a set of quantitative cytotoxicity and genotoxicity data using Chinese hamster ovary (CHO) cells for over 100 regulated and unregulated DBPs (Wagner and Plewa 2017). Most importantly, this work has demonstrated that some unregulated DBPs, particularly N-DBPs as well as Br- and I-DBPs, are orders of magnitude more cytotoxic and genotoxic than many currently regulated DBPs (Wagner and Plewa 2017; Pals et al. 2013; Plewa et al. 2010). Regulations on DBPs have driven the optimization of water treatment, but have led to a narrow focus on which DBPs to reduce. By identifying the toxicity drivers, regulations based on risk evidence could be developed to protect human health and would steer the optimization of water treatment toward minimizing the production of these toxic DBPs.

While animal studies are the gold standard in toxicology, only 24 DBPs have been evaluated for carcinogenicity by in vivo assays, with 22 inducing tumor formation (Richardson et al. 2007). These tests are quite time-consuming and expensive, limiting the number of compounds that can be tested. Therefore, alternative methods are often used to prioritize which of the large number of DBPs (more than 600 identified) should undergo in vivo testing. As mentioned earlier, quantitative in vitro cytotoxicity and genotoxicity assays based on CHO cells have generated a database of over 100 DBPs, which allows for the ranking of their toxicity (Wagner and Plewa 2017). To date, this is the largest in vitro toxicity data set of DBPs. However, CHO cells lack certain metabolic features that may be important for the activation of DBPs to mutagens (Li and Mitch 2018). For example, the carcinogenicity of NDMA requires activation by metabolic enzymes (Beranek et al. 1983; Souliotis et al. 2002). Identification of the toxicity mode(s) of action, likely to be associated with specific end points (e.g., bladder cancer), can also help prioritize DBPs for confirmation with in vivo assays (Li and Mitch 2018).

In addition to measuring the toxicological potency of newly identified DBPs, it is important to consider their relative concentrations when determining DBP toxicity drivers. Therefore, DBP researchers are beginning to compare measured DBP concentrations weighted by metrics of toxic potency, such as CHO cytotoxicity. This toxicity-weighted basis may provide better information for assessing the DBP-associated safety of water than simply summing the total amount of DBPs in water (mass basis) without considering the toxicological potency of each compound (Fig. 4). By using toxicity-weighted measurements, researchers have shown that unregulated halogenated DBP classes, particularly HANs, may contribute more to DBP-associated toxicity than regulated DBPs in conventional European drinking waters (Plewa et al. 2017), chlorinated or chloraminated high-salinity groundwaters (Szczuka et al. 2017), and chloraminated potable reuse effluents (Zeng et al. 2016). When these toxicity-weighted measurements are used, toxicity results from single-compound assays are generally assumed to be additive (Zeng et al. 2016; Szczuka et al. 2017). However, previous research has shown that this assumption is not always valid (Boorman et al. 1999; Narotsky et al. 2015). More research is needed to determine the potential for DBPs in mixtures to exhibit synergistic or antagonistic interactions.

Fig. 4 Evaluation of two water samples (Water 1 and Water 2) using the conventional mass basis approach (left) and the emerging toxicity-weighted basis approach (right), illustrating how the incorporation of toxicological potency can strongly influence the assessment of water safety. (Li and Mitch 2018; used with permission, further usage requires ACS permission, https://pubs.acs.org/doi/10.1021/acs.est.7b05440)
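To make the distinction illustrated in Fig. 4 concrete, the following minimal sketch (in Python) computes both a mass-basis sum and a toxicity-weighted sum for a small set of DBPs, dividing each concentration by a CHO cytotoxicity LC50 and assuming additivity, as described above. All concentrations and LC50 values are hypothetical placeholders chosen only to show the arithmetic; they are not data from the cited studies.

# Hypothetical concentrations (ug/L) and CHO cytotoxicity LC50 values (ug/L);
# a lower LC50 means higher potency. Placeholder numbers, not data from the cited studies.
dbps = {
    "chloroform (regulated THM)":          (40.0, 1_000_000.0),
    "dichloroacetic acid (regulated HAA)": (25.0, 500_000.0),
    "dichloroacetonitrile (HAN)":          (2.0, 6_000.0),
    "iodoacetic acid (I-DBP)":             (0.5, 1_500.0),
}

# Mass basis: simple sum of measured concentrations.
mass_total = sum(conc for conc, _ in dbps.values())

# Toxicity-weighted basis: concentration divided by LC50, summed assuming additivity.
weights = {name: conc / lc50 for name, (conc, lc50) in dbps.items()}
weighted_total = sum(weights.values())

print(f"Mass-basis total: {mass_total:.1f} ug/L")
for name, w in weights.items():
    print(f"{name:38s} {100 * w / weighted_total:5.1f}% of toxicity-weighted total")

Because the assumed potencies span several orders of magnitude, compounds that make up only a small fraction of the total mass dominate the toxicity-weighted sum, which is the behavior reported for HANs and other unregulated classes in the studies cited above.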

In response to this new information, water treatment practices need to shift toward reducing toxicity drivers rather than simply the regulated DBPs. As discussed previously, efforts to reduce the formation of regulated THMs and HAAs at some utilities had implications for the formation of unregulated DBPs that potentially pose a greater risk to human health. Therefore, in addition to the trade-off between chemical and microbial risks, water utilities must also make difficult decisions that trade off different chemical risks against one another. These decisions apply not only to the details of the treatment system and the choice of disinfectant but also to the selection of source water and to infrastructure spending. Since each water treatment system is unique, the fundamental rule that governs these decisions is "know your system". As the preceding case studies show, water treatment involves many variables, and it is essential to understand the role of each and to continuously monitor for changes. DBPs are only one of the hazards to consider in water quality management. Water utilities should seek to minimize consumers' exposure to DBPs; however, this should never come at the expense of the effective disinfection that keeps the microbial risk of untreated water to a minimum.

In certain situations, it may be more practical to relax DBP guidelines in order to ensure that water free of microbial contamination is delivered to consumers. Such decisions may be short-term, in response to emergencies and disasters, or an initial step toward a long-term solution in areas lacking infrastructure and/or quality water sources. Climate change is expected to affect the number and intensity of extreme water-related weather events (Levy et al. 2017). Heavy rainfall events can transport pathogens through the environment and increase run-off into water sources, thereby increasing potential human exposure to waterborne pathogens. Additionally, water treatment plants may be overwhelmed, leading to contamination of drinking water pipes, overflow of sewage, or untreated waste being discharged into waterways. Changes in global climate can also reduce the availability of clean drinking water sources: higher peak temperatures can increase the frequency of drought, while sea level rise can lead to salinization of groundwater sources. All of these changes to water treatment inputs require optimization and improvements to ensure safe drinking water. More extreme weather events will also increase the frequency of natural disasters and emergency scenarios in which the availability of clean drinking water is limited. People may be displaced to areas with contaminated drinking water, and where sanitation is inadequate, unprotected water sources around temporary settlements are highly likely to become contaminated. Therefore, drinking water quality standards addressing factors such as odor, color, and DBPs should be flexible enough to prevent excessive restrictions on water use, which could lead to more critical health effects.

Water treatment is necessary to provide safe drinking water that is free from pathogenic microorganisms. This chapter began with the story of Walkerton, highlighting the constant threat of waterborne disease and the need for continual monitoring and improvement of water treatment processes. In addition to the threat of pathogens, water utilities must also manage the threat of DBPs, an important chemical risk. To ensure safe drinking water, a delicate balance must be maintained between the two. Every aspect of the water treatment process, from source to tap, affects this trade-off between microbial and chemical risk. Changes to our environment driven by human activity have led to the use of alternative water sources that can contain differing sets of DBP precursors. Owing to concerns over the adverse chronic health effects of chlorinated water, countries have regulated the maximum contaminant levels of DBPs. This in turn has driven the optimization of water treatment to reduce the concentrations of these compounds, but it has also led to a trade-off between different chemical risks. Water utilities must further consider the distribution system as a potential source of contamination and a reactor for DBP transformation. All of these aspects come together to form a complex system of interacting variables. DBP researchers are working to identify the toxicity drivers so that these systems can be optimized to minimize both microbial and chemical risks.