
1 Introduction

The 2007 Earth Science Decadal Survey (NRC 2007), in its vision for the future of Earth science and applications, specifically called for “a vision that includes advances in fundamental understanding of the Earth system and increased application of this understanding to serve the nation and the people of the world”. The survey goes on to emphasize the role of remote sensing for improving air quality and health conditions: “Further improvements in the application of remote sensing technologies will allow better understanding of disease risk and prediction of disease outbreaks, more rapid detection of environmental changes that affect human health, identification of spatial variability in environmental health risk, targeted interventions to reduce vulnerability to health risks, and enhanced knowledge of human health-environment interactions.” Finally, the Survey issues the challenge facing the Earth science community: “Addressing these societal challenges requires that we confront key scientific questions related to … transcontinental air pollution … impacts of climate change on human health, and the occurrence of extreme events, such as severe storms, heat waves, …”

Recognizing the great potential of remote sensing assets for advancing knowledge and applications in air quality and health, in 2001 NASA’s Applied Sciences Program defined health and air quality (HAQ) as one of its applications areas to support the use of Earth observations, particularly regarding infectious and vector-borne diseases and environmental health issues. The area also addresses issues of toxic and pathogenic exposure and health-related hazards and their effects for risk characterization and mitigation. HAQ promotes uses of Earth observing data and models regarding implementation of air quality standards, policy, and regulations for economic and human welfare. The area also addresses effects of climate change on public health and air quality to support managers and policy makers in planning and preparations.

Specifically within air quality the key question being addressed by HAQ is:

How will continuing economic development affect the production of air pollutants, and how will these pollutants be transported across oceans and continents?

The HAQ area developed the Air Quality Applied Sciences Team (AQAST) to serve the needs of US air quality management (pollution monitoring and forecasting, quantifying emissions, understanding and monitoring pollutant transport, and understanding climate-air quality interactions) through the use of Earth science satellite observations, models, and the latest scientific knowledge. AQAST members complete long-term applications projects but also support short-term, quick-response projects through 'Tiger Teams'.

The range of Earth observing systems used for health and air quality applications includes the Terra/Aqua Moderate Resolution Imaging Spectroradiometer (MODIS: land surface); Landsat (land surface); the Tropical Rainfall Measuring Mission/Global Precipitation Measurement (TRMM/GPM: precipitation); Soil Moisture Active Passive and Soil Moisture and Ocean Salinity (SMAP/SMOS: soil moisture); and Aura (ozone, air quality, and climate). While the Centers for Disease Control and Prevention (CDC) is the key stakeholder agency of the HAQ program, interactions also occur with the National Oceanic and Atmospheric Administration (NOAA), the Environmental Protection Agency (EPA), the Department of Defense (DoD), the US Global Change Research Program, nonprofit and for-profit sectors, and the GEO Health and Environment Community of Practice.

The major challenges related to the use of remote sensing data for HAQ applications include:

  • Determining types and vertical distribution of aerosols and particulates;

  • Lack of data due to clouds when using optical data;

  • Determining human exposure to pollutants at fine spatial scales;

  • Availability of real-time air quality and other environmental data for use in warning systems.

The focus of the health program is on understanding the relationships between climate change and human health and on improving the predictability of epidemics. Specific areas of concern are how the following will change with climate:

  • Heat-related illnesses and deaths, including cardiovascular disease

  • Health impacts of extreme weather events (injuries, mental health issues)

  • Air pollution-related health effects (asthma and cardiovascular disease)

  • Water- and food-borne diseases and illnesses

  • Vector- and rodent-borne diseases and illnesses (malaria, dengue, chikungunya, West Nile, etc.)

1.1 Future Prospects

After several decades of research and development to create the capabilities to control diseases and monitor air quality using remote sensing technologies, the pieces are falling into place to support global implementation of such technologies. Comprehensive and integrated early warning systems are rapidly improving and are available to minimize the impact of deadly diseases and environmental hazards such as floods and extreme heat (Hossain 2015), and the barriers to implementation, namely cost and data management capabilities, are being torn down. Data and good intentions alone, however, are not sufficient. Capacity-building efforts are needed in many countries to improve technology transfer, and to better structure national information systems and decision-making processes, if these nations are to derive full benefit from this powerful technology.

In spite of these major advances, the global scientific community is only beginning to scratch the surface of potential societal applications of remote sensing data. Based on current plans of NASA and other national space agencies, Earth observation data will be abundant in the coming decades, and much of the data will be well suited for applications in health and air quality. In order to bring the benefits of these data to society, a well-designed strategy for effective use of these data to monitor, forecast, and warn the public is critical. This strategy must engage multiple groups within the Earth science community: remote sensing scientists to develop useful products, developers to create analysis and visualization tools, and social scientists, educators, and public officials to devise effective messaging that conveys critical information to the public in a way that promotes action.

There are several existing sensors and missions that are well-suited to provide niche data sets for health and air quality applications, such as:

Even more useful inputs to HAQ monitoring and forecasting applications will be provided by sensors that are still in the pre-formulation phase, including:

Other satellite and sub-orbital sensors, not yet at the pre-formulation phase, may be key elements of a future (~2030) HAQ system. For example, nanosatellites, including CubeSats (http://www.nasa.gov/mission_pages/cubesats/index.html) flying in constellations, may play an important role in a future HAQ observational system. Tropospheric and stratospheric environmental observations from small unmanned aerial vehicles may be commonplace within this time frame. Also, crowd-sourced environmental measurements and health outcomes, the 'complementary system of observations of human activities' recommended in the Decadal Survey (NRC 2007), may be extremely useful for calibration and validation of HAQ models, but the path to bring these into the mainstream is not currently clear.

2 Case Studies

2.1 Wildfires and Public Health

With as much as 40 % of terrestrial carbon in the U.S. being stored in forests (Pacala et al. 2007), sequestration of greenhouse gases (GHGs) in terrestrial biomass has become an important element in strategies aimed at mitigating global climate change. Carbon dioxide emissions from wildfire in the U.S. have been estimated to be equivalent to approximately 4–6 % of anthropogenic sources at the continental scale (Wiedinmyer and Neff 2007).

The Western U.S. has millions of acres of overstocked forestlands at risk of large, uncharacteristically severe wildfire. There are a variety of factors contributing to this risk, including human-induced changes from nearly a century of timber harvest, grazing, and particularly, fire exclusion (North et al. 2015; Miller et al. 2009). For instance, the mid-elevation conifer forests of the Sierra Nevada, California, contain vast areas of high density, relatively homogenous, second growth coniferous forests that are increasingly prone to high-severity fires (Miller et al. 2009). Historic factors contributing to these conditions include the elimination of burning by Native Americans during the mid- to late nineteenth century (Anderson 2005), removal of large trees through the early twentieth century via railroad and selective logging (Stephens 2000), a nearly 100-year policy of fire exclusion (North et al. 2015; Stephens and Ruth 2005), and extensive use of clear-cut harvesting and overstory removals on public lands through the 1980s (Hirt 1996).

State and federal agencies have been very effective at suppressing the vast majority of wildfire ignitions (Dombeck et al. 2004), but the total area burned in the U.S. has increased in recent decades (Westerling et al. 2014). The US Forest Service (USFS) estimates that, nationwide, an average of more than 73,000 wildfires burn about 7.3 million acres of private, state, and federal land and more than 2,600 structures each year (USFS 2015). Though the number of wildland fires has decreased since 1960, the total acreage burned each year is trending upwards. Large fires covering over 100,000 acres are more frequent and cover more acreage now than in the 1990s (NIFC 2015). Trends of both increased fire sizes and uncharacteristically severe burning have been demonstrated throughout the western U.S. (Miller et al. 2009), and increasing fire sizes are expected to continue under changing climates (Westerling et al. 2006). For decades, scientists and managers have understood the threat fire would pose to forests in this condition (Biswell 1959, 1989). In the western U.S., however, it was not until the early 1990s that federal land management agencies were given direction to manipulate stands with the specific objective of modifying landscape-level fire behavior.

Fire is a natural disturbance process that has an important role in determining forest species composition, structure, and stand development pathways. Fire has been a regular occurrence in these forests for millennia and continues to be today, despite the changes in spatial and temporal patterns over the last century (Biswell 1959; Kilgore and Taylor 1979; Collins et al. 2011). Most low to moderate intensity fires in western U.S. forests historically included some patches of high-severity fire (Arno et al. 2000; Fulé et al. 2003; Beaty and Taylor 2008; Perry et al. 2011). Current wildfire high-severity patch sizes and areas in some forests that once burned frequently with low-moderate intensity fire regimes are well outside historical conditions (Miller et al. 2009).

Fire also drives large-scale transformations that can impede the ability of forests to deliver ecosystem services such as carbon sequestration (Millar and Stephenson 2015). In western U.S. forests, which include some of the highest densities of carbon storage globally, fires can drive significant losses of carbon stocks from forests to the atmosphere (Gonzalez et al. 2015). High-severity fires also impair future carbon sequestration by driving shifts in ecosystem composition from forest to grasslands and shrublands (Savage and Mast 2005; Roccaforte et al. 2012).

The purpose of this western US-based case study is to describe the significance and consequences of high-severity wildfires in terms of GHG emissions and air quality impacts. This chapter describes the current state of the science in terms of management options as well as impediments to lowering GHG emissions from wildfires. We also present a carbon offset methodological framework that accounts for GHG emission reductions from fuel treatments and that could be used to reduce overall treatment costs.

2.1.1 Wildfire Emissions and Air Quality

Fire converts carbon stored in the forest floor materials and live and dead vegetation to atmospheric CO2, CO, CH4, other gases, and particulate matter (Stephens et al. 2007). These other emissions include nitrogen oxides (NOx), NH3, SO2, and fine particulates (Urbanski 2014). Along with protecting people and property and limiting atmospheric GHG emissions, mitigating local air quality impacts from smoke has become another important goal of land management (Schweizer and Cisneros 2014).

Wildfires can have a significant adverse impact on regional air quality, depending on fuel load, terrain, and atmospheric conditions. During recent major wildfires, ambient levels of particulate matter (PM2.5) have been measured at over 10 times the U.S. EPA National Ambient Air Quality Standard (NAAQS) (MDEQ 2007; CARB 2009). Wildfires have also been directly responsible for exceedances of the ground-level ozone NAAQS through generation of NOx and volatile organic compound (VOC) precursors (SMAQMD 2011).

Air pollutants from wildfires are a well-established threat to public health, particularly by increasing the risk of lung and heart disease and respiratory infections. Exposure to air pollutant emissions from wildfires has been definitively linked to an increase in local hospital admissions, particularly among children ages 1–4 and the elderly over 65, for respiratory-related conditions including eye, nose, and throat irritation, chest pain, asthma, bronchitis, pneumonia, and chronic obstructive pulmonary disease (Delfino et al. 2009).

2.1.2 Effects of Fuel Treatments

2.1.2.1 Effectiveness of Fuel Treatments

It is virtually impossible to exclude fire from most fire-prone landscapes, such as those found across the western U.S., over long periods of time (Reinhardt et al. 2008). During extreme weather conditions, suppression efforts can become overwhelmed and fires can quickly grow to cover very large areas. Suppression efforts can also be rendered ineffective under less than extreme weather conditions when fuel and forest structure result in relatively homogenous forests prone to high-intensity burning. This may be particularly true as the effects of climate change are manifested in fire severity trends (Millar et al. 2007; Miller et al. 2009). "No treatment" or "passive management" (Agee 2002; Stephens and Ruth 2005) perpetuates the potential for exacerbated fire behavior in forests (Moghaddas et al. 2010).

Modification of fuel structures and reduction of unnaturally high fuel loads in order to alter fire patterns and behavior are a primary component of planning efforts such as the National Cohesive Wildland Fire Management Strategy (Wildland Fire Leadership Council 2011) and the Sierra Nevada Forest Plan Amendment (USFS 2004). Various methods for fuel modification, collectively termed “fuel treatments,” include shredding of understory biomass (mastication), removal of sub-merchantable small diameter trees and understory biomass (e.g., thinning from below), pre-commercial and commercial timber harvest (e.g., whole tree removal), and prescribed fire to remove surface fuels (shrub, grass, and down woody debris) and trees with low branches (ladder fuels). These treatments reduce or alter fire behavior, spatial patterns, effects on ecosystems and GHG emissions mainly by reducing the potential for crown fires and therefore fire severity (Moghaddas and Craggs 2007; Stephens et al. 2009a, b, c, 2012a; Moghaddas et al. 2010). These studies document treatment effects on fire behavior across several treatment types and provide guidance on designing treatments for forest stands.

The efficacy and effects of fuel treatments have been demonstrated under real wildfire conditions (Moghaddas and Craggs 2007; Safford et al. 2009), but the majority of scientific evidence for their use comes from modeling efforts (Stephens and Moghaddas 2005; Stephens et al. 2009a; Moghaddas et al. 2010; Collins et al. 2011). Overall, there is clear consensus in the published literature that fuel treatments, specifically those that incorporate thinning from below and treat surface fuels with prescribed fire, reduce potential fire severity under a range of moderate to extreme weather conditions.

Even under the most conservative climate change projections, coniferous forests in the Sierra Nevada are likely to experience longer fire seasons (Westerling et al. 2006) that are relatively drier and more conducive to high-intensity fire (Miller et al. 2009). There is no single fuel treatment strategy to reduce this hazard; rather, a combination of strategies is needed, especially when dealing with complex landscapes and management objectives.

Though now recognized as an important tool for fire protection and ecosystem process restoration, detailed strategies for applying the various techniques in different vegetation types at a landscape scale are still under study (Collins et al. 2010, 2011). Where the management of naturally ignited wildfires for resource benefit is a viable management option, further research is needed to support the integration of fuel treatments and beneficial fire use as a fuel management strategy. At a landscape scale, this is a difficult proposition and will likely challenge many of the current paradigms within management, policy, and regulatory frameworks (Germain et al. 2001; Collins et al. 2010).

2.1.2.2 Carbon Consequences of Fuel Treatments

Concerns over global climate change seemingly place fuel treatments that reduce near-term forest carbon stocks at odds with long-term carbon sequestration objectives in terrestrial vegetation as a means of climate change mitigation. Though wildfires also combust biomass and can be a significant source of atmospheric carbon emissions in the near-term (Randerson et al. 2006; Ager et al. 2010), they may also act as mechanisms for long-term carbon sequestration in some systems.

Several recent studies have investigated whether the potential future GHG emissions avoided through fuel treatment can offset immediate losses of stored carbon and carbon emitted during operations, and even possibly result in net positive carbon storage over longer time periods (Hurteau and North 2009, 2010; North et al. 2009; Stephens et al. 2009b, 2012a; Ager et al. 2010; Reinhardt and Holsinger 2010; Campbell et al. 2011; North and Hurteau 2011; Stephens et al. 2012a, b; Saah et al. 2012, 2015).

Carbon stored in forests is reduced in the short term through fuel treatments that are designed to reduce fire severity and smoke emissions. Other short-term emissions include fossil fuel emissions from machinery used during treatments and processing of biomass, and emissions from prescribed fires. While Campbell et al. (2011) found that more carbon is lost to treatments than would be spared from loss by wildfire, particularly if fire is infrequent, others have concluded that in the long term fuel treatments may result in overall increases in stored carbon, for example through reduced fire effects, carbon sequestration in large, fire-resistant trees (Stephens et al. 2009a, b, c; Hurteau and North 2010), and durable wood products which might replace other fossil fuel intensive products (product substitution). Whether residues from fuel treatments are used to generate electricity or burned on site in piles can also greatly affect GHG benefits as well as air pollutant balances (Jones et al. 2010; Lee et al. 2010; Springsteen et al. 2011, 2015).

Some carbon pools, such as soil organic matter, are relatively unchanged by wildfire regardless of the fire's severity. In a nationwide study of fire and fire surrogate treatment effects on carbon storage, Boerner et al. (2008) found the network-wide effects of these treatments on soil carbon to be modest and transient. For above-ground carbon pools, Hurteau et al. (2008) pointed out that thinning can be thought of as increasing 'rotation length' by moving more forest carbon into longer residence-time storage. Thinning in Sierra Nevada mixed conifer leads to carbon storage in fewer, but larger, trees which are more representative of pre-settlement forest conditions. Hurteau and North (2009) concluded that the 1865 reconstruction stand structure, in which current stand density was reduced while large, fire-resistant pines were retained, may be the best stand structure for achieving high carbon storage while minimizing potential wildfire emissions in fire-prone forests.

Whether or not fuel treatments safeguard enough carbon to offset their carbon cost depends on many factors, including initial forest structure and carbon stocks, existing fuel loads, expected wildfire frequency and severity, fuel treatment type and intensity, and the fate of merchantable forest products. In general, overstory thinning plus prescribed fire removes more carbon than other common fuel treatment types, while prescribed fire only or understory thinning only removes the least.

2.1.3 Carbon Offset Protocol for Avoided Wildfire Emissions

For western US forests, the paradox around carbon sequestration is that policies are in place that encourage carbon sequestration through afforestation, reforestation, and other silvicultural practices (CAR 2012). In contrast, the treatments that reduce carbon loss to wildfire are poorly acknowledged by current climate-sensitive policy. This is especially relevant when considering that long-term wildfire carbon emissions can be three times the size of the direct carbon emissions during the fire itself (Stephens et al. 2009a, b, c).

As market-based approaches to mitigating global climate change are being considered and implemented, one important emerging strategy for changing the economics of fuel treatments is to sell carbon emission offsets. Offsets are tradable certificates or permits representing the right to emit a designated amount of carbon dioxide or other GHGs. These offsets are generated when projects or actions reduce GHG emissions beyond what is required by permits and rules, and they can be traded, leased, banked for future use, or sold to other entities that need to provide emission offsets (Sedjo and Marland 2003). In the case of fuel treatments, carbon emission offsets can theoretically be generated by projects that reduce potential emissions from wildfire, for example by reducing the probability of extreme fire behavior on a given portion of land. In 2006, the California legislature enacted Assembly Bill 32: The Global Warming Solutions Act (AB32), setting emissions goals for 2020 and directing the Air Resources Board to develop reduction measures to meet targets (State of California 2006). Forest management (including fuel treatments) is one area that has been targeted for project-based offset development. The EPA and agencies implementing AB32 require that carbon emission offsets be quantifiable, real, permanent, enforceable, verifiable, and surplus.

Development of carbon emission offsets as an effective tool for forest and fire managers, therefore, requires an integrated approach that considers wildfire probabilities and expected emissions, as well as net expected carbon sequestration or loss over time.

Fuel treatments involve tradeoffs between reducing the risk of carbon loss due to wildfires and increasing carbon emissions due to the fuel treatments themselves. Typically, fuel treatments increase carbon emissions in the short term but in specific contexts can reduce long-term carbon emissions when wildfires are considered. A key issue is the probability of fire occurring after treatment implementation. Treatments that are not impacted by wildfire do not mitigate potential emissions, and can therefore be carbon sources as opposed to sinks (Campbell et al. 2011).
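The logic of this tradeoff can be made concrete with a simple expected-emissions comparison. The sketch below is only illustrative: the fire probability, accounting horizon, and per-hectare emission numbers are hypothetical placeholders rather than values from the studies cited here, and real protocols include many additional terms (regrowth, wood products, leakage).

```python
# Illustrative (hypothetical) expected-emissions comparison for a treated vs.
# untreated stand. All numbers are placeholders, not measured data.

def expected_emissions(p_fire_per_yr, horizon_yrs, e_wildfire, e_upfront=0.0):
    """Expected GHG emissions (tCO2e/ha) over a planning horizon.

    p_fire_per_yr : annual probability that the stand burns
    horizon_yrs   : accounting horizon in years
    e_wildfire    : emissions if the stand burns (tCO2e/ha)
    e_upfront     : immediate emissions from treatment operations and
                    prescribed fire (tCO2e/ha); zero for the no-treatment case
    """
    # Probability the stand burns at least once during the horizon.
    p_burn = 1.0 - (1.0 - p_fire_per_yr) ** horizon_yrs
    return e_upfront + p_burn * e_wildfire


# Hypothetical numbers for illustration only.
baseline = expected_emissions(p_fire_per_yr=0.02, horizon_yrs=25, e_wildfire=120.0)
treated = expected_emissions(p_fire_per_yr=0.02, horizon_yrs=25, e_wildfire=45.0,
                             e_upfront=20.0)

print(f"Expected emissions, no treatment: {baseline:.1f} tCO2e/ha")
print(f"Expected emissions, treated:      {treated:.1f} tCO2e/ha")
print(f"Potential offsetable difference:  {baseline - treated:.1f} tCO2e/ha")
```

Only when the reduction in expected wildfire emissions exceeds the upfront treatment emissions does the treated case come out ahead, which is why the post-treatment fire probability is central to any offset accounting.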

The Climate Action Reserve's Forest Project Protocol Version 3.3 (CAR 2012) provides forest owners with a platform to market carbon stored in their forests. Because it acknowledges the potential of fuel reduction treatments to mitigate wildfire-related carbon losses (Hurteau and North 2010), this protocol does not classify fuel reduction treatments as an immediate emission (Hurteau and North 2009). However, a clear framework that accounts for the net emission savings and is endorsed by leading carbon market platforms, such as the California Air Resources Board, is still missing. Necessary steps toward development of such a protocol are under way (e.g., Saah et al. 2012). Methodological development will require a consensus on how to approach the following framework elements: (1) delineation and characterization of appropriate spatial scales; (2) definition of acceptable treatment and no-treatment (baseline) forest management practices; (3) identification of appropriate forest growth and yield models; (4) definition of life cycle assessment boundaries for forest biomass removals; (5) mechanisms for determining fuel treatment effectiveness and longevity; (6) quantification methods for direct wildfire emissions and for the indirect effects of fuel treatments on fire within the greater landscape; (7) parameters for determining the probability of fire ignition and the fire return interval; (8) standards for quantifying the risk of vegetation conversion after high-severity fires (e.g., forest converted permanently to brush land); and (9) acceptable practice for prorating total short- and long-term emissions.

2.2 Water-Borne Diseases

The World Health Organization (WHO) reports that over 3.5 million people die annually from various water-related diseases (WHO/UNICEF 2014). Over a billion people in the world still lack access to safe water, and over two billion lack sufficient sanitation facilities. Due to insufficient access to safe water and sanitation in many parts of the world, and limited knowledge and understanding of the microscale and macroscale processes and transmission pathways of many water-related diseases, the burden of water-borne diseases remains unacceptably high (Akanda et al. 2014).

As the world rapidly urbanizes, most of the world's emerging megacities will be situated in coastal areas of Asia and Africa (Akanda and Hossain 2012). Many of these regions will experience significant shifts in regional climate patterns and coastal sea level rise. As a result, the relationship between climate, water, and health will become more intimate, existing vulnerabilities will worsen, and natural disasters such as droughts and floods will expose millions to displacement, unsafe living conditions, and water-borne diseases. There is great potential to channel the efforts and skill sets of the satellite remote sensing community toward public health benefits: reducing the burden of water-related diseases by providing timely and efficient prediction of water-related disasters, assessing population vulnerability in regions at risk, and strengthening prevention efforts with operational early warning systems, education, and outreach.

For example, the ongoing seventh pandemic of cholera started in the 1960s and has affected over seven million people in over 50 countries. The WHO estimates that cholera affects over three million people and causes 250–300 thousand deaths every year. Despite continuous progress in research on the causative pathogen V. cholerae and the development of vaccines, there is limited understanding of the causative pathways and the role of large-scale physical processes and climate phenomena in propagating outbreaks across regions (Akanda et al. 2014). This case study analyzes how satellite remote sensing can be used to detect the relevant large-scale processes behind endemic and epidemic cholera outbreaks in South Asia, where a combination of droughts, floods, and coastal ecological conditions creates a favorable transmission environment.

The timing of seasonal cholera outbreaks can be anticipated reasonably well in endemic settings such as the Bengal delta region of South Asia or the Lakes region in Sub-Saharan Africa (Jutla et al. 2015a, b). However, the potential magnitude and location of outbreaks remain hard to anticipate, and the preemptive positioning of the appropriate level of human and material resources in anticipation of outbreaks has been especially difficult. Predictions based on larger scale processes that have sufficient system memory can add significant 'lead time' ahead of impending outbreaks for preemptive preparations. Such predictions can be valuable for endemic regions by indicating the potential timing, intensity, and location of outbreaks, and especially for epidemic regions by identifying areas at risk of potential outbreaks and assessing the vulnerability of populations to cope with such disasters (Akanda et al. 2012).

The bacterium V. cholerae can survive, multiply, and proliferate in favorable aquatic and brackish estuarine environments (Colwell and Huq 2001). Primary outbreaks of cholera have been reported in coastal areas of South Asia, Africa, and South America over the last several decades (Huq and Colwell 1996; Griffith et al. 2006). Ever since the first correlative study relating cholera incidence with increased algae in the waters of rural South Asia, various studies have postulated connections between coastal cholera outbreaks and plankton abundance in the marine environment (Cockburn and Cassanos 1960; Huq and Colwell 1996).

The endemic and epidemic outbreaks in South Asia have been linked to a number of environmental and climatic variables, such as precipitation (Longini et al. 2002; Pascual et al. 2002; Hashizume et al. 2008), coastal phytoplankton abundance (Lobitz and Colwell 2000), floods (Koelle et al. 2005), water temperature (Colwell 1996), peak river level (Schwartz et al. 2006), and sea surface temperature (Cash et al. 2008). The link with environmental factors has also been studied for other cholera-affected regions of the world, such as Southeast Asia (Emch et al. 2008), Sub-Saharan Africa (Hashizume et al. 2008), southern Africa (Bertuzzo et al. 2008), and South America (Gil et al. 2004).

Cholera incidence in the Bengal Delta region shows distinct seasonal and spatial variations, in the form of single annual and biannual peaks, in sharp contrast to the single peaks typically seen in other regions (Akanda et al. 2009). The outbreaks in this region typically propagate from the coastal areas in spring to the inland areas in fall, aided by the monsoon season in between. In sum, the transmission process exhibits two distinctly different transmission cycles, pre-monsoon and post-monsoon, influenced by coastal and terrestrial hydroclimatic processes, respectively, revealing a strong association of the space–time variability of incidence peaks with seasonal processes and extreme hydroclimatic events (Akanda et al. 2011).

Application of remote sensing to study cholera dynamics has been an emerging research area as longer and more accurate datasets have become available over the last few decades (Jutla et al. 2015b). A principal motivation for using satellite remote sensing to monitor the environmental conditions conducive to the cholera pathogen stems from the pathogen's strong association with coastal phytoplankton (Colwell 1996). Although chlorophyll variations at the daily scale exhibited random variability with very limited memory (Uz and Yoder 2004), earlier studies by Lobitz and Colwell (2000) showed promise in monitoring the coastal growth environment for the bacteria with the help of chlorophyll measurements in the coastal Bay of Bengal. In this endemic region, the seasonal freshwater scarcity from upstream regions in the flat deltaic landscape provides a vast growth environment during the dry spring months, which can be monitored by studying the space–time variability of chlorophyll in the northern Bay of Bengal, thus linking coastal processes to the prediction of cholera outbreaks in the region (Akanda et al. 2013).

Jutla et al. (2012) quantified the space–time distribution of chlorophyll in the coastal Bay of Bengal region using 10 years (2000–2009) of data from SeaWiFS (the Sea-Viewing Wide Field-of-View Sensor). A key finding of the study was that while the variability of chlorophyll at the daily scale resembled white noise, chlorophyll values showed distinct seasonality at monthly scales and with increased spatial averaging. The first cholera outbreaks near the coastal areas in spring were found to be correlated with the northward movement of the plankton-rich seawater and increased salinity of the estuarine environment, favoring increased growth and abundance of the cholera bacteria in river corridors. In a follow-up study, Jutla et al. (2013) showed that seasonal endemic cholera outbreaks in the Bengal Delta region can be predicted up to three months in advance with a prediction accuracy of over 75 % by using satellite-derived chlorophyll (as a surrogate for coastal plankton and pathogen growth) and air temperature (as a surrogate for upstream snowmelt and freshwater flow) data. A high prediction accuracy was achievable because the two seasonal peaks of cholera, in spring and fall, were predicted using two separate models representing distinct seasonal environmental processes. The inter-annual variability of pre-monsoon cholera outbreaks in spring was satisfactorily linked with coastal plankton blooms, and the post-monsoon cholera outbreaks in fall were related to the breakdown of sanitary conditions due to monsoon flooding and subsequent inundation in floodplain regions (Akanda et al. 2013; Jutla et al. 2013).
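The two-model structure described above can be illustrated with a minimal sketch: fit one simple regression for the spring (pre-monsoon) peak and another for the fall (post-monsoon) peak, each using environmental predictors lagged by three months to provide lead time. This is not the published model; the input file, column names, and the use of the same two predictors for both seasons are assumptions for illustration (the actual fall model relied on flood-related variables).

```python
# Minimal two-season prediction sketch (assumptions, not the Jutla et al. 2013 model).
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("bengal_monthly.csv", parse_dates=["month"])  # hypothetical file
df = df.set_index("month").sort_index()

# Lag the environmental predictors by three months to provide lead time.
df["chl_lag3"] = df["coastal_chlorophyll"].shift(3)   # surrogate for plankton abundance
df["tair_lag3"] = df["air_temperature"].shift(3)      # surrogate for upstream freshwater flow

spring = df[df.index.month.isin([3, 4, 5])].dropna()
fall = df[df.index.month.isin([9, 10, 11])].dropna()

spring_model = LinearRegression().fit(spring[["chl_lag3", "tair_lag3"]],
                                      spring["cholera_cases"])
fall_model = LinearRegression().fit(fall[["chl_lag3", "tair_lag3"]],
                                    fall["cholera_cases"])

print("Spring R^2:", spring_model.score(spring[["chl_lag3", "tair_lag3"]],
                                        spring["cholera_cases"]))
print("Fall   R^2:", fall_model.score(fall[["chl_lag3", "tair_lag3"]],
                                      fall["cholera_cases"]))
```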

With more than a decade of terrestrial water storage data from the Gravity Recovery and Climate Experiment (GRACE) mission, conditions have recently emerged for predicting cholera occurrence with increased lead times. The lead–lag relationships between terrestrial water storage in the Ganges–Brahmaputra–Meghna basin of South Asia and endemic cholera in Bangladesh were investigated in a study by Jutla et al. (2015a). Data on water scarcity and abundance in large river basins, a prerequisite for developing cholera forecasting systems, are otherwise difficult to obtain. The ubiquitous use of river water for irrigation, sanitation, and consumption purposes in this heavily populated basin region exposes the riverine societies to the infectious disease (Akanda et al. 2013). Open mixing of water bodies and channels with V. cholerae leads to transmission when sanitation breaks down during heavy monsoon rainfall, and inundated areas are enriched with bacteria already present in the ecosystem. According to Jutla et al. (2015a), water availability showed a strong asymmetrical association with cholera prevalence in spring (τ = −0.53; P < 0.001) and in autumn (τ = 0.45; P < 0.001) up to 6 months in advance. The study concluded that a one unit (centimeter of water) decrease in water availability in the basin increased the odds of above-normal spring cholera prevalence by 24 %, while a one unit increase in regional water due to flooding increased the odds of above-average fall cholera by 29 %.
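A lead–lag analysis of this kind can be sketched with a rank correlation between lagged basin water storage anomalies and seasonal cholera prevalence. The snippet below uses Kendall's tau, the statistic reported in the study, but the input file and column names are hypothetical.

```python
# Sketch of a lead-lag Kendall's tau analysis between basin terrestrial water
# storage (TWS) anomalies and cholera prevalence, in the spirit of Jutla et al.
# (2015a). Series names and the data file are hypothetical.
import pandas as pd
from scipy.stats import kendalltau

df = pd.read_csv("gbm_tws_cholera.csv", parse_dates=["month"]).set_index("month")

for lag in range(1, 7):  # test leads of 1-6 months
    tws_lead = df["tws_anomaly_cm"].shift(lag)
    paired = pd.concat([tws_lead, df["spring_cholera_prevalence"]], axis=1).dropna()
    tau, p = kendalltau(paired.iloc[:, 0], paired.iloc[:, 1])
    print(f"lead = {lag} months: tau = {tau:+.2f}, p = {p:.3f}")
```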

Satellite remote sensing data were also used successfully to capture the changes in hydroclimatic conditions related to the infamous cholera outbreak of 2008 in Zimbabwe. The first cases of cholera in Zimbabwe were reported to the World Health Organization in August 2008, but between then and June 2009 a massive outbreak erupted, with a total of 98,522 cholera cases and 4,282 deaths (WHO/UNICEF 2009). The case fatality ratio in this epidemic, i.e., the ratio of deaths to total cases reported as a measure of the intensity of the epidemic, was found to be much higher than those reported in other countries or areas where appropriate treatment was available. In a recent study by Jutla et al. (2015b), large-scale hydroclimatic processes estimated using TRMM-based precipitation and gridded air temperature were linked with epidemiological data to assess the risk of disease occurrence in a retrospective manner.

Although the precise location of a disease outbreak could not be determined with the existing resolution of the TRMM data, a provincial analysis approach (averaging all pixels in a particular province) showed strong correlations of cholera occurrence with precipitation and air temperature (Jutla et al. 2015b). The study postulates that if anomalous temperature in a vulnerable region is followed by heavy precipitation, the risk of a cholera outbreak increases significantly, especially in areas where the drinking water source and sanitation infrastructure are poorly maintained or unavailable. The empirical observations and relationships between satellite-sensed precipitation and temperature for the 2008 Zimbabwe cholera outbreaks were extended to five other countries in the Sub-Saharan Africa region (Mozambique, South Sudan, Rwanda, Central African Republic, and Cameroon). The study showed that anomalously above-average air temperatures, followed by above-average precipitation at least 1 month in advance, were significantly correlated with the first reporting of cholera outbreaks at all sites.
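The provincial averaging and anomaly-trigger logic described above can be sketched as follows, assuming the satellite precipitation and temperature grids have already been averaged over each province into a monthly table; the trigger rule, file, and column names are hypothetical illustrations rather than the published method.

```python
# Sketch of the 'provincial analysis' trigger: monthly climatological anomalies
# of province-mean temperature and precipitation, flagging months where an
# above-average temperature anomaly is followed within 1-2 months by
# above-average precipitation. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("province_monthly.csv", parse_dates=["month"]).set_index("month")

def monthly_anomaly(series):
    """Anomaly relative to the long-term mean for each calendar month."""
    clim = series.groupby(series.index.month).transform("mean")
    return series - clim

t_anom = monthly_anomaly(df["air_temperature"])
p_anom = monthly_anomaly(df["precipitation"])

# Warm anomaly this month, positive precipitation anomaly 1-2 months later.
alert = (t_anom > 0) & ((p_anom.shift(-1) > 0) | (p_anom.shift(-2) > 0))
print(df.loc[alert].index.strftime("%Y-%m").tolist())
```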

The results demonstrate that satellite measurements of climatic and environmental variables such as precipitation, temperature, terrestrial water storage, and coastal chlorophyll can be meaningful predictors over a range of space and time scales and can be effective in monitoring ecological conditions conducive to the pathogen V. cholerae and in developing a cholera prediction model with several months of lead time. Such an approach also shows the potential of newer missions with an increased ability to monitor hydrological and environmental conditions, such as river elevation from the Surface Water and Ocean Topography (SWOT) mission and soil moisture from SMAP. The findings and understanding derived from South Asia and Sub-Saharan Africa may serve as the basis for the development of "climate informed" early warning systems for preempting epidemic cholera outbreaks and prompting effective intervention in vulnerable regions.

2.3 Vector-Borne Diseases

Epidemics of vector-borne diseases still cause millions of deaths every year. Malaria remains a major global health problem, with an estimated three billion people at risk of infection in over 109 countries, 250 million cases annually, and one million deaths. Malaria is highly sensitive to climate variations, so climate information can either be used as a resource, for example in the development of early warning systems (DaSilva et al. 2004), or must be accounted for when estimating the impact of interventions (Aregawi et al. 2014).

In Ethiopia, the determinants of malaria transmission are diverse and localized (Yeshiwondim et al. 2009), but altitude (linked to temperature) is a major limiting factor in the highland plateau region, and rainfall in the semiarid areas. A devastating epidemic caused by unusual weather conditions was documented in 1958, affecting most of the central highlands between 1600 and 2150 m elevation, with an estimated three million cases and 150,000 deaths (Fontaine et al. 1961). Subsequently, cyclic epidemics of various dimensions have been reported from other highland areas at intervals of approximately 5–8 years, with the last such epidemic occurring in 2003. Most of these epidemics have been attributed to climatic abnormalities, sometimes associated with El Niño, although other factors such as land-use change may also be important.

Endemic regions of visceral leishmaniasis (VL) exist within East Africa, with a geographic hotspot in the northern states of South Sudan. This region experiences seasonal fluctuations in cases that typically peak during the months of September through January (SONDJ) (WHO 2013; Gerstl et al. 2006; Seaman et al. 1996). In the northern states of South Sudan alone, VL epidemics have recently been observed, with a reported 28,512 new cases from 2009 to 2012 (Ministry of Health—Republic of South Sudan 2013). Without proper treatment, mortality in South Sudan is high, with deaths approaching 100,000 during one multi-year epidemic in the late 1980s and early 1990s (Seaman et al. 1996).

The habitat of the VL vector P. orientalis is determined by specific ecological conditions, including the presence of specific soils, woodlands, and mean maximum daily temperature (Thomson et al. 1999; Elnaiem et al. 1999; Elnaiem 2011). Research has also documented associations between environmental factors and VL that may contribute to outbreaks of the disease, including relative humidity (Salomon et al. 2012; Elnaiem et al. 1997), precipitation (Gebre-Michael et al. 2004; Hoogstraal and Heyneman 1969), and the normalized difference vegetation index (NDVI) (Elnaiem et al. 1997; Rajesh and Sanjay 2013). Additionally, Ashford and Thomson (1991) suggested the possibility of a connection between a prolonged flooding event in the 1960s and the corresponding 10-year drop in VL within the northern states of South Sudan. Although the importance of environmental variables in relation to the transmission dynamics of VL has been established, the lack of in situ data within the study region has led to inconclusive results regarding these relationships. By exploiting the advantages of sustained and consistent Earth monitoring via NASA's Earth observations, the relationship between environmental factors and the spatiotemporal distribution of VL in the northern states has been demonstrated (Sweeney et al. 2014), showing how Earth observations can be used for mapping the risk of leishmaniasis.

Recently, methodologies have been developed that use remote sensing to monitor climate variability and environmental conditions and their impacts on the dynamics of infectious diseases. These methods and results are described in the following sections.

2.3.1 Climate and Environmental Factors: How Do They Help?

To date much of the debate has centered on attribution of past changes in disease rates to climate change, and the use of scenario-based models to project future changes in risk for specific diseases. While these can give useful indications, the unavoidable uncertainty in such analyses, and contingency on other socioeconomic and public health determinants in the past or future, limit their utility as decision support tools. For operational health agencies, the most pressing need is the strengthening of current disease control efforts to bring down current disease rates and manage short-term climate risks, which will, in turn, increase resilience to long-term climate change. The WHO and partner agencies are working through a range of programs to (i) ensure political support and financial investment in preventive and curative interventions to reduce current disease burdens; (ii) promote a comprehensive approach to climate risk management; (iii) support applied research, through definition of global and regional research agendas, and targeted research initiatives on priority diseases and population groups (Campbell-Lendrum et al. 2015).

2.3.1.1 Risk Maps

A risk map can aid in understanding the relationship between disease and climate. Figure 1 shows a risk map produced to understand the relationship between climate and malaria in Eritrea. This analysis indicates that malaria incidence varies across a rather sharp spatial gradient in Eritrea and that this variation is driven by climatic factors. Malaria incidence is high in the western part of Eritrea and peaks in September–October, immediately after the July–August rainy season. Conversely, in the highlands of Eritrea (the central green part of the map), malaria incidence is low because of low temperatures at high altitudes. On the east coast, there are some areas where malaria peaks in January because of rainfall that occasionally occurs in December.

Fig. 1 A climate risk map showing the spatiotemporal stratification of malaria incidence in Eritrea at the district level. Source: Ceccato et al. (2007)
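A minimal sketch of the kind of seasonal stratification behind such a map is given below: from monthly case counts per district, find each district's climatological peak month and group districts by peak season. This is an illustrative assumption, not the method of Ceccato et al. (2007), and the input file and column names are hypothetical.

```python
# Stratify districts by the calendar month of their climatological malaria peak.
import pandas as pd

inc = pd.read_csv("eritrea_district_incidence.csv",
                  parse_dates=["month"])  # hypothetical columns: month, district, cases

monthly = inc.pivot_table(index="month", columns="district", values="cases",
                          aggfunc="sum")
# Climatological mean per calendar month, then the month of maximum incidence.
peak_month = monthly.groupby(monthly.index.month).mean().idxmax()

season = peak_month.map(lambda m: "Sep-Oct peak (after main rains)" if m in (9, 10)
                        else "Jan peak (coastal rains)" if m == 1 else "other")
print(season.value_counts())
```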

2.3.1.2 Early Warning System

The WHO has developed a framework for creating an early warning system for malaria (DaSilva et al. 2004), which can also serve as a framework for floods, droughts, other natural disasters, and other events that have a relationship with climate and environmental factors. The framework is composed of four components (a minimal sketch of how they might be combined follows the list):

  1. Vulnerability assessment

  2. Climate forecasting

  3. Monitoring of climate and environmental factors, including precipitation, temperature, and the presence of vegetation or water bodies that influence mosquito development

  4. Case surveillance
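The skeleton below is an assumption about how these four components might be wired into a single district-level alert, not a WHO implementation; all thresholds and field names are hypothetical placeholders.

```python
# Hypothetical skeleton combining the four early-warning components.
from dataclasses import dataclass

@dataclass
class DistrictStatus:
    vulnerability: float          # 1. vulnerability assessment score (0-1)
    forecast_rain_anomaly: float  # 2. seasonal climate forecast (mm anomaly)
    observed_rain_mm: float       # 3. monitored rainfall over the last dekad
    observed_tmean_c: float       # 3. monitored mean temperature
    case_anomaly: float           # 4. surveillance: cases relative to seasonal baseline

def alert_level(d: DistrictStatus) -> str:
    """Combine the four components into a simple tiered alert (illustrative thresholds)."""
    climate_risk = (d.forecast_rain_anomaly > 0 and d.observed_rain_mm > 50
                    and d.observed_tmean_c > 18)
    if d.case_anomaly > 1.5 and climate_risk:
        return "epidemic response"
    if climate_risk and d.vulnerability > 0.6:
        return "heightened surveillance"
    return "routine"

print(alert_level(DistrictStatus(0.7, 25.0, 80.0, 21.0, 0.9)))
```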

2.3.2 Accessing Quality Data Through Earth Observations

Decision-makers and researchers often face a lack of the quality data required to optimally target intervention and surveillance of vector-borne diseases. These decisions are critical, as they affect the lives of many people: "bad data create bad policies". Climate data and information, whether station- or satellite-generated, can increasingly be accessed freely online. Station data, of varying quality, can typically be obtained from a country's National Meteorological Service (NMS). However, station data are not always available. Some of the station data provided by the NMS are freely available through the Global Telecommunication System (GTS) but often lack the required spatial resolution.

Satellites provide raw global data that are continuously archived. In many cases the raw data may be free, but not all interfaces allow free access to their archived data. Sources for satellite-generated climate data are varied, and a selection is provided below. The following are likely to be the most useful of the freely available satellite-based estimates.

2.3.2.1 Precipitation
  • Global Precipitation Climatology Project (GPCP) combines satellite and station data. The monthly data extend from 1979 onwards. This product has a low (2.5°) spatial resolution, but is of interest when creating long time series to understand trends in past precipitation.

  • Climate Prediction Center (CPC) Merged Analysis of Precipitation (CMAP), similar to GPCP, combines satellite and station data to produce 5-day and monthly aggregations at a 2.5° spatial resolution.

  • CPC MORPHing technique (CMORPH) provides global precipitation estimates at very high spatial (8 km) and temporal (30 min) resolution. This product is suitable for real-time monitoring of rainfall, provided a long history is not required, as data are only available from January 1998.

  • Tropical Rainfall Measuring Mission (TRMM) provides monthly estimates of precipitation in the tropics at 0.25° spatial resolution. They are available from January 1998, with a latency of about a month or more. The TRMM mission ended in 2015.

  • Global Precipitation Measurement (GPM) provides estimates of precipitation globally. They are available from March 2014 to present at 0.1° spatial resolution. The GPM is an extension of the TRMM rain-sensing package.

  • The Enhancing National Climate Services (ENACTS) program combines all available rain gauge data from the National Meteorology Agencies of Ethiopia, Gambia, Ghana, Madagascar, Mali, Rwanda, Tanzania, and Zambia with satellite data for the last 30 years, at high spatial resolution (4 km) and a 10-day temporal resolution. ENACTS is expected to expand to other countries in Africa.

  • Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) data are produced using a similar technique to that of ENACTS. CHIRPS provides daily rainfall from 1981–present at a 0.05° resolution and covers 50°S–50°N. More details of CHIRPS are given in Sect. 2.3.3.

2.3.2.2 Temperature

In many parts of the world, the spatial distribution of weather stations is limited and the dissemination of temperature data is variable, limiting their use for real-time applications. This paucity of information can be compensated for by using satellite-based methods. The estimation of near-surface air temperature (Ta) is useful for tracking vector-borne diseases; for example, Ta affects the transmission of malaria in the highlands of East Africa (Ceccato et al. 2012). Air temperature is commonly obtained from synoptic measurements at weather stations. However, the derivation of Ta from satellite-derived land surface temperature (Ts) is far from straightforward. Studies have shown that it is possible to retrieve high-resolution Ta data from the MODIS Ts products (Vancutsem et al. 2010; Ceccato et al. 2010). Ts is available from MODIS dating back to ~2000 and from the Visible Infrared Imaging Radiometer Suite (VIIRS) since 2012 at a 1 km spatial resolution. Separate estimates for daytime and nighttime temperatures are available, from which daily maximum and minimum air temperature estimates can be derived.
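One common way to make this derivation concrete, sketched below under the assumption that a set of matched station observations and MODIS nighttime Ts values is available, is to calibrate a simple linear relationship between Ts and station air temperature and then apply it where stations are absent. This is an illustrative approach consistent with, but not identical to, the methods cited above; the file and column names are hypothetical.

```python
# Calibrate a linear Ta(Ts) relationship against station data, then apply it to a grid.
import numpy as np
import pandas as pd

m = pd.read_csv("matched_station_modis.csv")  # hypothetical columns: ts_night_c, ta_min_station_c

# Least-squares fit: Ta_min ~ a * Ts_night + b
a, b = np.polyfit(m["ts_night_c"], m["ta_min_station_c"], deg=1)
print(f"Ta_min ~ {a:.2f} * Ts_night + {b:.2f}")

# Apply the calibration to a gridded nighttime LST field (hypothetical values, degC).
ts_night_grid = np.array([[12.5, 14.0], [16.2, 18.1]])
ta_min_grid = a * ts_night_grid + b
print(ta_min_grid)
```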

2.3.2.3 Vegetation

Remote sensing can be used to distinguish vegetated areas from other surface covers. Various vegetative properties can be gleaned from indices such as the NDVI, including, but not limited to, leaf area index, biomass, and greenness. Practitioners can access data on vegetation cover through the following sources (a minimal NDVI computation sketch follows the list):

  • Global NDVI from legacy NASA sensors is available from NASA’s Global Inventory Monitoring and Modeling Studies (GIMMS) for the period from 1981 to date. The dataset has been shown to be valid in representing vegetation patterns in certain regions (but not everywhere) and should be used with caution (Ceccato 2005).

  • MODIS NDVI and enhanced vegetation index (EVI) products are available for 16-day periods from April 2000 at 250 m resolution. The MODIS NDVI is an updated extension of the GIMMS Global NDVI record.
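The NDVI itself is a simple band ratio, (NIR − Red)/(NIR + Red); the sketch below assumes red and near-infrared surface reflectance values (e.g., from a MODIS or Landsat scene) are already loaded as NumPy arrays.

```python
# Compute NDVI from red and near-infrared surface reflectance arrays.
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype="float64")
    red = np.asarray(red, dtype="float64")
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (nir - red) / (nir + red)
    # Guard against zero-reflectance pixels.
    return np.where(nir + red == 0, np.nan, index)

# Hypothetical 2x2 reflectance values.
nir = np.array([[0.45, 0.50], [0.30, 0.05]])
red = np.array([[0.10, 0.08], [0.12, 0.04]])
print(ndvi(nir, red))
```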

2.3.3 Improving Data Quality and Accessibility

To address the spatial and temporal gaps in climate data, as well as the lack of quality-controlled data, approaches are being developed based on the idea of merging station data with satellite and modeled data. Some of these efforts are also developing platforms through which the quality-improved data can be accessed, manipulated, and integrated into the programs of national-level stakeholders and international partners.

2.3.3.1 ENACTS Approach

The International Research Institute for Climate and Society (IRI) has been working with national meteorological agencies in Africa (Ethiopia, Gambia, Ghana, Madagascar, Mali, Rwanda, Tanzania, and Zambia) to improve the availability, access, and use of climate information by national decision-makers and their international partners. The approach, ENACTS, is based on three pillars (Dinku et al. 2011, 2014a, b):

  1. Improving data availability: Availability of climate data is improved by combining quality-controlled station data from the national observation network with satellite estimates for rainfall and elevation maps and reanalysis products for temperature. The final products are datasets with 30 or more years of rainfall and temperature for a 4 km grid across the country.

  2. Enhancing access to climate information: Access to information products is enhanced by making information products available online.

  3. Promoting the use of climate information: Use of climate information is promoted by engaging and collaborating directly with potential users.

By integrating ground-based observations with proxy satellite and modeled data, the ENACTS products and services overcome issues of data scarcity and poor quality, introducing quality-assessed and spatially complete data services into national meteorological agencies to serve stakeholder needs. One of the strengths of ENACTS is that it harnesses all local observational data, incorporating high-definition information that globally produced or modeled products rarely access. The resulting spatially and temporally continuous data sets allow for the characterization of climate risks at a local scale. ENACTS enables analysis of climate data at multiple scales to enhance malaria control and elimination decisions.

2.3.3.2 CHIRPS Approach

A similar approach has been taken by CHIRPS, a near-global rainfall dataset covering 1981 to present (http://pubs.usgs.gov/ds/832/pdf/ds832.pdf). CHIRPS incorporates 0.05° resolution satellite imagery with in situ station data to create gridded rainfall time series for trend analysis and seasonal drought monitoring. Two CHIRPS products are produced operationally: a rapid preliminary version and a later final version. The preliminary product, which uses only a single station source, the GTS, is available for the entire domain 2 days after the end of a pentad. The final CHIRPS product takes advantage of several other station sources and is complete sometime after the 15th of the following month. Final monthly, dekad, pentad, and daily products are calculated at that time.
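The pentad and dekad bookkeeping used by such products can be illustrated with a short aggregation sketch. The daily rainfall series below is synthetic, and the convention shown (six pentads and three dekads per month, with the last period absorbing the extra days) is an assumption consistent with common practice rather than a description of the CHIRPS processing code.

```python
# Aggregate a synthetic daily rainfall series into pentad and dekad totals.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
daily = pd.Series(rng.gamma(shape=0.5, scale=8.0, size=365),
                  index=pd.date_range("2020-01-01", periods=365, freq="D"))

# Pentads: six per month (days 1-5, 6-10, ..., 26 to end of month).
pentad = daily.groupby([daily.index.month,
                        np.minimum((daily.index.day - 1) // 5, 5)]).sum()

# Dekads: three per month (days 1-10, 11-20, 21 to end of month).
dekad = daily.groupby([daily.index.month,
                       np.minimum((daily.index.day - 1) // 10, 2)]).sum()

print(pentad.head(8))
print(dekad.head(6))
```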

2.3.4 Examples of Analysis

2.3.4.1 Inundation Products for Leishmaniasis

Surface inundation datasets have been developed to examine inundation dynamics (McDonald et al. 2011) using the QuikSCAT and Advanced Microwave Scanning Radiometer for EOS (AMSR-E) active/passive microwave datasets over the period 2004–2009 for South Sudan (Schroeder et al. 2010). Three environmental variables (NDVI, precipitation, and inundation) aided in determining whether wet or dry years were more conducive to the transmission of leishmaniasis.

Inundation during April–July also exhibited a strong inverse relationship with VL cases in SONDJ. Results are typified by the Lankien Medical Center analysis where below average inundation during April displays an inverse relationship with VL cases in the following SONDJ (Fig. 2).

Fig. 2 Monthly inundation anomalies (AMJJ) and VL cases summed over SONDJ for Lankien Medical Center in Jonglei State
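The inverse relationship illustrated in Fig. 2 can be tested with a simple lagged rank correlation between April–July inundation anomalies and the VL case totals for the following September–January; the sketch below assumes a small annual table whose file and column names are hypothetical.

```python
# Rank correlation between AMJJ inundation anomalies and the following SONDJ VL cases.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("lankien_inundation_vl.csv")  # hypothetical columns: year, amjj_inund_anom, sondj_vl_cases

rho, p = spearmanr(df["amjj_inund_anom"], df["sondj_vl_cases"])
print(f"Spearman rho = {rho:+.2f}, p = {p:.3f}  (negative rho -> inverse relationship)")
```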

2.3.4.2 Water Bodies Products for Malaria

Using Landsat images at 30 m resolution, it is possible to map small water bodies where mosquitoes will breed and transmit diseases such as malaria, dengue fever, chikungunya, West Nile Fever and Zika (Pekel et al. 2011). This is done by combining the middle infrared channel (sensitive to water absorption), the near-infrared channel (sensitive to bare soil and vegetation canopy), and the red channel (sensitive to chlorophyll absorption). In Fig. 3, malaria data are overlaid on a Landsat image for the Rift Valley in Ethiopia. Villages with low malaria transmission are located either in the highland areas where temperature is the limiting factor or in the dry Rift Valley. Villages with high malaria transmission are located in the dry valley where green vegetation and water bodies are present.

Fig. 3 Landsat image over the Rift Valley in Ethiopia, represented with a false color combination of the middle infrared, near-infrared, and red channels. Green indicates vegetation, blue shows water bodies, and pink–brown shows bare soils. Green dots represent villages with low malaria transmission, yellow and orange dots represent villages with medium to medium-high malaria transmission, and red dots represent villages with high malaria transmission
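The band logic described above (water absorbs strongly in the middle- and near-infrared, while bare soil and vegetation do not) can be sketched as a simple per-pixel water mask. The reflectance thresholds below are illustrative assumptions, not the Pekel et al. (2011) classifier.

```python
# Flag likely open-water pixels from Landsat-style reflectance arrays.
import numpy as np

def water_mask(red, nir, mir, nir_max=0.10, mir_max=0.05):
    """Boolean mask of likely open-water pixels (illustrative thresholds)."""
    # Water is dark in both infrared bands and darker in NIR than in red.
    return (nir < nir_max) & (mir < mir_max) & (nir < red)

# Hypothetical 2x2 reflectance values (water in the lower-right pixel).
red = np.array([[0.12, 0.20], [0.15, 0.06]])
nir = np.array([[0.35, 0.30], [0.28, 0.03]])
mir = np.array([[0.25, 0.22], [0.20, 0.02]])
print(water_mask(red, nir, mir))
```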

2.3.4.3 Temperature for Malaria

Nighttime Aqua MODIS land surface temperature (LST) in the highlands of Ethiopia close to the Rift Valley is overlaid in Fig. 4 with altitude contours, showing that at the same altitude, temperature can be either favorable for mosquito breeding (>16 °C) or unfavorable (<16 °C). This information is important for the Ministry of Health to target malaria control measures.

Fig. 4 MODIS nighttime LST showing difference of temperatures according to the elevation in the highlands of Ethiopia

Time series of MODIS LST can show how temperature varies in the highlands and therefore how it affects the transmission of malaria. Integrating precipitation and temperature into a vectorial capacity model (VCAP) allows Ministries of Health in Africa to assess the risk of malaria transmission in the continent's epidemic zones (Ceccato et al. 2012).
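The classical vectorial capacity formula that such models build on can be sketched as a function of temperature. In the sketch below, the sporogonic-cycle approximation (roughly 111 degree-days above 16 °C for P. falciparum) is a standard textbook value, while the mosquito density, biting rate, and survival probability are illustrative placeholders rather than the VCAP parameterization.

```python
# Classical vectorial capacity as a function of temperature (illustrative parameters).
import math

def vectorial_capacity(temp_c, m=10.0, a=0.3, p=0.9):
    """VC = m * a^2 * p^n / (-ln p), with n the sporogonic duration in days.

    temp_c : mean air temperature (degC)
    m      : mosquito density per human (placeholder)
    a      : human biting rate per mosquito per day (placeholder)
    p      : daily mosquito survival probability (placeholder)
    """
    if temp_c <= 16.0:
        return 0.0                   # parasite development effectively stops
    n = 111.0 / (temp_c - 16.0)      # sporogonic cycle length in days
    return m * a**2 * p**n / (-math.log(p))

for t in (15, 18, 22, 26):
    print(f"T = {t:2d} degC -> VC = {vectorial_capacity(t):.2f}")
```

Because transmission potential rises steeply once temperature clears the development threshold, even the coarse 16 °C contour shown in Fig. 4 is operationally useful for targeting control measures.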

3 Conclusion and Way Forward

The challenge issued by the 2007 Decadal Survey (NRC 2007): "Addressing these societal challenges requires that we confront key scientific questions related to … transcontinental air pollution … impacts of climate change on human health, and the occurrence of extreme events, such as severe storms, heat waves, …" frames a powerful approach for building a future in which Earth observation (EO) data are used to greatly improve quality of life globally through advances in air quality and human health monitoring, forecast, and warning systems. Key to this approach is mobilizing the capacity-building community that relies on EO data, through better EO data and associated analytical tools, modeling, and training. The participants in a 2015 'NASA E2 Capacity Building Workshop' noted that it was time for the EO-based capacity-building community to broaden the focus of current EO application programs (such as NASA's Applied Sciences program) to tackle these critical issues through application of EO-based (orbital and non-orbital) science. The participants posed the following key questions related to health and air quality:

  • How can we better adapt to the impact of climate change on changing vector- and water-borne disease burden on vulnerable populations?

  • If capacity to build EO-based health monitoring improves around the world, how do we measure the societal impact in terms of quality of life and lives saved?

  • How can we identify the most impactful intervention strategy for endemic and epidemic diseases in order to design EO-based decision-making tools?

  • How can the use of small satellites, unmanned aerial systems, and crowd sourcing programs (citizen science) assist in building and improving more relevant health and air quality monitoring tools that currently use data from conventional satellites?

  • What types of disease-relevant and region-specific EO tools are needed to empower the health community?

  • Recognizing the inherent nexus between water and water-borne disease, how can we facilitate greater interaction between technical experts in the water and health communities?

The participants also made the following recommendations related to health and air quality:

  • A greater focus is needed on understanding how EO systems can best address the impact of climate change on future disease burden.

  • Recognizing the strong connections of water resources (availability) with water-borne diseases, water community technical experts that use EO systems and data should partner more effectively with the traditional health community.

  • There needs to be greater investment in small satellites and citizen science programs (volunteered geographic information) for health monitoring.

  • Programs need to be in place that facilitate clearer communication and trust building between the health stakeholder community and scientists who use EO data for capacity building of health institutions around the world.

  • In an effort to build durable capacity of EO systems, space agencies and other national or global organizations should identify strategic partners from philanthropic and private sector organizations with overlapping priorities that rely on monitoring of environmental and Earth science data in their operations.

NASA's HAQ program has developed a vision statement defining where it aspires to be in 2030 in terms of global capacity. The program will issue the following health 'grand challenges' to the community:

Malaria: Risk characterization models will be deployed regionally. A unified dynamic malaria risk model would be a major achievement for end users worldwide and would provide economic savings through scale and the elimination of duplicative models.

Vector-borne diseases: The World Health Organization and other health officials will have access to daily global risk maps of infectious diseases associated with viral hemorrhagic fevers to aid preparedness.

Monitoring and forecasting Harmful Cyanobacterial Blooms (HABs): Cyanobacterial harmful algal blooms are a concern for water supplies due to their potential to produce toxins in lakes and estuaries. In cooperation with NOAA, new satellite-derived products will be developed for the issuance of daily HAB bulletins to pertinent US end users.

The air quality ‘grand challenges’ for 2030 are:

Accurate ground-level aerosol, constituent, and ozone measurements will routinely be obtained from a combination of remote sensing, in situ observations, and models. Total columnar aerosol and ozone estimates are routinely made from satellite sensors, but these have limited value for determining human exposure to harmful pollutants.

The HAQ program will have established strong relationships with federal, state, local, and international partners to identify unique applications of satellite observations and realize their operational use. These applications will provide critical components for integration with forecast models and decision support systems. NASA’s participation in health and air quality applications research and related transition to operations activities currently performed with EPA, NOAA, CDC, and others will fill a significant niche in national capabilities and will be vital components in future domestic and international programs.

In other words, by 2030, the goal is for EO products developed under the HAQ program to be widespread, regularly accessed, and indispensable to the public and end users. The HAQ program will be known as the ‘go to’ program for information about vector-borne and infectious disease risks, environmental health risks, and dangerous air pollution episodes.