1 Introduction

There is a clear imbalance in the literature between the modeling studies aimed at investigating the effects of climate change on crop growth and yields (e.g., Rosenzweig and Parry 1994; Tubiello et al. 2000; Challinor et al. 2010) and the available studies on expected changes in the future distribution of plant diseases (e.g., Anderson et al. 2004; Ghini et al. 2008). A possible explanation is the scarce availability of process-based models for plant disease simulation, whereas crop growth models are widely used in research and operational contexts. As underlined by many authors (e.g., Butterworth et al. 2010; Chakraborty and Newton 2011), this knowledge gap is an issue modelers should urgently address, because even today the damage caused by diseases and insect pests accounts for approximately 50 % of the losses of the eight most important food and cash crops worldwide (Oerke 2006), underlining the need to assess their future impacts. There is evidence that climate change could strongly influence the effects of plant diseases and pests on crop production (e.g., Goudriaan and Zadoks 1995; Garrett et al. 2006), through the altered spread of some species and the introduction of new pathogens and vectors, leading to uncertain dynamics of plant epidemics. An overall increase in the infection patterns of fungal plant pathogens in response to CO2 enrichment, N deposition, and changes in temperature and rainfall regimes is documented in the review by Tylianakis et al. (2008), whereas Parmesan (2006) predicts their expansion into new geographic areas together with heterogeneous responses to environmental conditions. In a field experiment in which 16 plant species were grown under varying levels of CO2, N treatment, and species richness, Mitchell et al. (2003) demonstrated that decreased plant diversity markedly increased the infected leaf area, whereas higher CO2 and N increased pathogen load in C4 and C3 grasses, respectively. The warming experiment of Roy et al. (2004), in which six plant species were grown in plots whose soil was heated and dried over the growing period, showed an increased impact of fungal pathogens due to the lengthening of the growing season: almost all the plants grown under warmer conditions experienced an increase in the amount of damage and in the number of pathogenic species along a phenological gradient. Furthermore, climate change could influence the physiology and the degree of resistance of host plants, as well as modify the growth patterns and the development rates of the biological cycles of plant pathogens (Chakraborty and Datta 2003). This could lead to an increase in the number of infection events, which in turn could require more chemical applications, thus reducing the environmental sustainability of many cropping systems (e.g., Hannukkala et al. 2007). Moreover, since plant disease epidemics often increase following a compound interest model, a slight increase in the length of the growing season in currently cold areas may have a large impact on inoculum load. These projections are in line with the available studies on climate change impacts on pathosystems. Travers et al. (2010) observed lower expression of the genes involved in disease resistance in big bluestem in response to simulated precipitation change; Chakraborty et al. (2000) observed higher fecundity of Colletotrichum gloeosporioides under increased atmospheric CO2 concentration; Bergot et al. (2004) predicted an expansion of Phytophthora cinnamomi in Europe as a result of higher winter survival by using a physiologically based approach.

Studies aimed at estimating the future spread of plant pathogens require clearly identified goals and explicitly communicated limits. In pest risk assessment studies, the main objective is the evaluation of the environmental suitability for a specific pathogen. The key aspect to analyze in these studies is whether climate conditions are conducive to the fulfillment of the infection process, which represents the first phase of the establishment of an epidemic (Magarey and Sutton 2007). At the same time, the estimate of the number of potential infection events can give a quantitative measure of plant disease pressure on crops. This can support a timely understanding of the challenges farmers will face in the coming years, leading, in the most critical cases, even to an evaluation of whether a crop should be cultivated in a specific area. The aim of this paper is to assess the changes in the number of potential infection events of six pathogens of wheat, rice (Fig. 1), and grape in Europe under climate change projections, via the spatially distributed application of a generic infection model driven by weather data and by parameters with a clear biological meaning.

Fig. 1 Rice field in the Lombardy region affected by blast disease and Echinochloa crus-galli

Although the susceptibility of the host plant to fungal diseases varies with phenological development, we did not couple the infection model with a phenological model. This choice allowed us to evaluate the ecological suitability of the environment for the fungal pathogens over the whole year, rather than focusing on specific pathosystems, which would also require accounting for alternate hosts. The role of such plants is recognized as crucial for the establishment of an epidemic, yet it is unknown under future weather conditions.

This paper represents the first attempt to provide estimates of future plant disease pressure over large areas. The information provided concerns the possible critical situations European farmers and policy stakeholders will have to deal with in the future, also in terms of comparisons across regions and of changes in known spatial patterns.

2 Materials and methods

2.1 Meteorological data

The database of daily weather data used in this study was specifically built within the AVEMAC project (Donatelli et al. 2012a, b) for the use of biophysical simulation models in climate change studies. This database was derived from the bias-corrected ENSEMBLES dataset of Dosio and Paruolo (2012) with the future projections of the A1B emission scenario given by the HadCM3 General Circulation Model (GCM) nested with the HadRM3 Regional Climate Model (RCM; the realization is denoted as METO-HC-HadRM3Q0-HadCM3Q0 in the ENSEMBLES project; van der Linden and Mitchell 2009). This represents a “warm” realization of the A1B emission scenario. The A1 scenario family is based on future human activities involving rapid economic growth, a fast diffusion of new and efficient technologies, and broad sociocultural interactions worldwide. Specifically, the A1B scenario implies a balanced emphasis on all energy sources (IPCC 2007). Three target horizons were considered in this work: 1993–2007 (baseline), 2025–2034 (2030), and 2045–2054 (2050). Climate studies typically consider a sample of 30 years around the target horizon to characterize a given variable or to derive other data (e.g., fungal pathogen infections), so that short-term random fluctuations, such as daily weather variations, do not dominate the outputs derived from the GCM simulations. In this study, projected time windows of 30 years around the two considered horizons (i.e., 2030 and 2050) would have largely overlapped, making the separation between the two horizons meaningless. When considering only 20 years (thereby avoiding overlap), the sample size could become too small to assume that short-term weather fluctuations do not dominate over the trend. Indeed, 3 or 4 years much warmer than the trend have stronger consequences on the average values of fungal infections within a period of 20 years than within a period of 30 years. To solve this problem, the ClimGen stochastic weather generator (Stockle et al. 1999) was used to increase the sample size corresponding to each horizon. A set of 15 years from the GCM-RCM runs was used around each reference year (e.g., 2030 ± 7 years, i.e., from 2023 to 2037), increasing the robustness of the estimate used to characterize a time period. The weather generator uses these data to derive monthly parameters summarizing the distribution of each weather variable for each grid cell. These parameters were then used to generate a set of 15 synthetic years for every grid cell, which reproduce the characteristics of the 15-year period.
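Conceptually, this step amounts to fitting per-month distribution parameters for each weather variable and grid cell from the 15 available years and then sampling synthetic daily series from them. The following Python sketch illustrates the idea with simple monthly normals; it is not a reproduction of the ClimGen algorithms (which, for instance, model precipitation occurrence explicitly), and all function names and the normal-distribution assumption are ours.

```python
import numpy as np

def monthly_parameters(daily_values, months):
    """Fit per-month mean and standard deviation from a multi-year daily series.
    daily_values and months are NumPy arrays of equal length; months holds the
    calendar month (1-12) of each day. Illustrative only: ClimGen fits richer,
    variable-specific distributions."""
    return {m: (daily_values[months == m].mean(), daily_values[months == m].std())
            for m in range(1, 13)}

def generate_synthetic_year(params, calendar_months, rng):
    """Draw one synthetic year of daily values from the fitted monthly normals.
    calendar_months holds the month (1-12) of each day of the synthetic year."""
    return np.array([rng.normal(*params[m]) for m in calendar_months])

# Hypothetical usage for one variable in one grid cell:
# rng = np.random.default_rng(1)
# params = monthly_parameters(tmax_2023_2037, month_index_2023_2037)
# synthetic_years = [generate_synthetic_year(params, month_index_365, rng)
#                    for _ in range(15)]
```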

The hourly agro-meteorological variables needed as input for the potential infection model were generated starting from daily data. The model developed by Kim et al. (2002) was adopted to estimate leaf wetness duration, since it proved robust when applied under a wide range of climate conditions (Bregaglio et al. 2011). To run this model, hourly data of air temperature, air relative humidity, wind speed, and dew point temperature are needed. Hourly air temperature was derived from daily maximum and minimum air temperature using the approach proposed by Campbell (1985). Dew point temperature was kept constant within the day (Running et al. 1987; Glassy and Running 1994) and computed according to Linacre (1992). Relative humidity was estimated according to the best performing approach tested under heterogeneous climatic conditions by Bregaglio et al. (2010), as the ratio of actual (Allen et al. 1998) to saturation (ASAE 1988 method) vapor pressure. Hourly wind speed was generated from daily wind speed data using the stochastic approach developed by Mitchell et al. (2000).
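As an illustration of this processing chain, the sketch below shows one possible implementation of the hourly air temperature and relative humidity steps. It uses a simple sinusoid as a stand-in for the Campbell (1985) scheme (with an assumed time of daily maximum) and the FAO-56 Tetens expression for both vapor pressures in place of the ASAE (1988) saturation formula, so the values are indicative only.

```python
import math

def hourly_temperature(t_min, t_max, hour, peak_hour=14.0):
    """Hourly air temperature (deg C) from daily extremes via a simple sinusoid.
    A stand-in for the Campbell (1985) approach used in the paper; peak_hour is
    an assumed time of the daily maximum."""
    mean = 0.5 * (t_max + t_min)
    amplitude = 0.5 * (t_max - t_min)
    return mean + amplitude * math.cos(math.pi * (hour - peak_hour) / 12.0)

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure (kPa), Tetens formula in its FAO-56 form; used
    here in place of the ASAE (1988) expression cited in the text."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def hourly_relative_humidity(temp_c, dew_point_c):
    """Relative humidity (%) as the ratio of actual to saturation vapor pressure,
    with the dew point held constant over the day as described in the text."""
    rh = 100.0 * saturation_vapor_pressure(dew_point_c) / saturation_vapor_pressure(temp_c)
    return min(rh, 100.0)
```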

For each time frame and for each species, simulations were carried out on the 25 × 25-km spatial units of the MARS database according to the actual crop distributions (European Communities 2008).

2.2 Model and parameterizations

The generic potential infection model developed by Magarey et al. (2005) was used in this study. It proved to be sensitive to diverse parameterizations and to effectively respond to input data variability, thus being suitable for climate change impact assessments (Bregaglio et al. 2012). It simulates the response of a generic fungal pathogen to both air temperature and leaf wetness duration with an hourly time step, by means of parameters with a clear biological meaning.

The air temperature response is modeled according to Yan and Hunt (1999) (Eq. 1):

$$ f(t)=\left(\frac{T_{\max}-T}{T_{\max}-T_{\mathrm{opt}}}\right)\left(\frac{T-T_{\min}}{T_{\mathrm{opt}}-T_{\min}}\right)^{\left(T_{\mathrm{opt}}-T_{\min}\right)/\left(T_{\max}-T_{\mathrm{opt}}\right)} $$
(1)

where f(t) (0–1; unitless) is the temperature response function; T (°C) is the hourly air temperature; T min, T max, and T opt (°C) are the minimum, maximum, and optimum temperatures for infection, respectively. The air temperature response is then scaled to the wetness duration requirement according to Eq. 2:

$$ W(t)=\begin{cases} \dfrac{\mathrm{WD}_{\min}}{f(t)} & \text{if } \dfrac{\mathrm{WD}_{\min}}{f(t)}\leq \mathrm{WD}_{\max} \\ 0 & \text{elsewhere} \end{cases} $$
(2)

where W(t) (hours) is the wetness response function, i.e., the leaf wetness duration requirement adjusted for temperature; WDmin (hours) is the minimum leaf wetness duration for infection; f(t) (0–1; dimensionless) is the temperature response function (Eq. 1); and WDmax (hours) is the maximum leaf wetness duration requirement. The model then takes into account the impact of a dry period via a critical interruption value (D50), as reported in Eq. 3:

$$ W_{\mathrm{sum}}=\begin{cases} W_1+W_2 & \text{if } D<D50 \\ W_1,\,W_2 & \text{elsewhere} \end{cases} $$
(3)

where W sum is the sum of the surface wetting periods and W 1 and W 2 are two wet periods separated by a dry period (D, hours). According to this equation, if D ≥ D50 the model considers the two wet periods as separate. At each hour, the model adds a cohort of spores if the leaf is wet and f(t) > 0, and it considers that an infection event has occurred if the value of W sum for a cohort falls between WD min and WD max. A detailed model description is provided by Bregaglio et al. (2012).
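To make the hourly bookkeeping explicit, the following Python sketch implements a simplified reading of Eqs. 1–3 and of the cohort logic described above. All names are ours, the weighting of wet hours by f(t) and the termination of a cohort after a dry spell of D50 hours are simplifying assumptions, and the sketch is not the reference implementation of Magarey et al. (2005) or Bregaglio et al. (2012).

```python
from dataclasses import dataclass

def temperature_response(t, t_min, t_opt, t_max):
    """Temperature response f(t) in [0, 1] following Yan and Hunt (1999), Eq. 1."""
    if t <= t_min or t >= t_max:
        return 0.0
    exponent = (t_opt - t_min) / (t_max - t_opt)
    return ((t_max - t) / (t_max - t_opt)) * ((t - t_min) / (t_opt - t_min)) ** exponent

@dataclass
class Cohort:
    wet_sum: float = 0.0  # accumulated, f(t)-weighted wet hours (W_sum)
    dry_run: int = 0      # consecutive dry hours since the last wet hour (D)

def count_infection_events(hourly_temp, hourly_wet, p):
    """Count potential infection events from hourly air temperature (deg C) and
    leaf wetness (bool) series, given a parameter dictionary p with keys
    t_min, t_opt, t_max, wd_min, wd_max, and d50."""
    cohorts, events = [], 0
    for temp, wet in zip(hourly_temp, hourly_wet):
        ft = temperature_response(temp, p["t_min"], p["t_opt"], p["t_max"])
        if wet and ft > 0.0:
            cohorts.append(Cohort())      # a new spore cohort starts each suitable wet hour
        survivors = []
        for c in cohorts:
            if wet:
                c.wet_sum += ft           # assumption: wet hours weighted by f(t), cf. Eq. 2
                c.dry_run = 0
            else:
                c.dry_run += 1
            if c.dry_run >= p["d50"]:
                continue                  # Eq. 3: a dry spell of D50 hours or more ends the cohort
            if c.wet_sum >= p["wd_min"]:
                if c.wet_sum <= p["wd_max"]:
                    events += 1           # wetness requirement fulfilled: one infection event
                continue                  # the cohort is no longer tracked once resolved
            survivors.append(c)           # still accumulating toward WD_min
        cohorts = survivors
    return events
```

Called, for instance, with one synthetic year of hourly data for a grid cell and the parameterization of one of the pathogens in Table 1, such a function would return the annual number of potential infection events used to build the maps discussed below.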

Six air-borne pathogens were simulated (Table 1), whose thermal and moisture requirements were derived from the literature. Since the aim was to apply the model over large areas, no specific calibration was performed in this study, and the average of the parameter values found in the literature was used as model input. Although this implies that neither the genetic diversity nor the evolutionary potential of the pathogen populations is considered in this study, this choice allowed the model to provide quantitative estimates of fungal infections based on reliable published data.

Table 1 Pathogens, crops, and alternate/alternative hosts, model parameterizations, and related sources of information. In the case of multiple values for the same parameter, the average was given as input to the model. T min, T opt, and T max are the minimum, optimum, and maximum temperatures for fungal pathogen infection (degrees Celsius); WD min and WD max are the minimum and maximum surface wetness periods required for an infection event (hours); D50 is the number of dry hours that ends an infection event (hours)

The choice of the pathogens was aimed at identifying relevant biotic stressors on key cultivated herbaceous (i.e., wheat and rice) and woody (i.e., grape) species in Europe, characterized by heterogeneous geographical distributions and by different thermal requirements. Two pathogens were selected for each crop in order to analyze the dynamics of the pressure of multiple organisms on the same crop. Simulations were carried out over the entire year and by considering the current geographic distribution of the three species (European Commission-Joint Research Centre MARS database), in order to focus on the relationships between the pathogen and the environment. This choice is supported by the evidence that all the simulated pathogens can also develop on wild plants and weeds (Table 1), allowing them to survive from one growing season to the next and to contribute to the establishment of an epidemic on the cultivated crop. The role of these bridging hosts is recognized as crucial both for producing additional inoculum for the following season and for harboring the dormant stages of the pathogens (Dinoor 1974), and it is likely to remain significant under future climate conditions (Dobson 2004).

3 Results and discussion

Results are presented as percentage difference of the total number of potential infection events simulated in the 2030 and 2050 time frames with respect to the baseline results.
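For clarity, the percentage difference mapped in the following figures is assumed to be computed, for each grid cell, as

$$ \Delta N\,(\%)=100\times\frac{N_{\mathrm{scenario}}-N_{\mathrm{baseline}}}{N_{\mathrm{baseline}}} $$

where N is the total number of potential infection events simulated in the cell over the time frame considered.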

3.1 Wheat

Projections for the 2030 time frame for Puccinia recondita (brown rust, Fig. 2a) showed a general increase in the number of infection events throughout Europe, more pronounced in the key wheat-producing areas of Northern Germany, Great Britain, and Benelux, and in some areas of Spain, France, and the coastal part of Croatia (+100 %). The remaining areas of Europe will experience a smaller increase in disease pressure, with small areas in the Mediterranean part of Morocco in which the pathogen will not change its infection pattern. Moving to the 2050 time frame (Fig. 2b), simulations indicate a marked worsening of the disease pressure on wheat, with the number of infections increasing by more than 100 % in most of the wheat-producing countries, except in Italy, Southern Spain, and the countries of Eastern Europe. As already discussed for the 2030 time frame, Moroccan regions will be less affected by the increase in infection events.

Fig. 2 Differences in the number of potential infection events simulated in the A1B climate scenario compared to the 1993–2007 reference scenario (%); P. recondita in the 2030 (a) and 2050 (b) time frames shows a general increase in the number of infections; P. striiformis in the 2030 (c) and 2050 (d) time frames maintains the current infection levels

Simulations performed for Puccinia striiformis (yellow rust) highlighted a completely different scenario, showing for the 2030 time frame (Fig. 2c) a stationary number of potential infections for most regions of Central Europe (−5 to +5 %), with the exception of a few scattered areas in France. A slight increase in the number of infection events (+5 to +20 %) was observed in Italy, Southern Spain, and the South-Eastern countries, especially Hungary, Croatia, and Serbia. According to the simulations, Poland, the European part of Russia, and some areas in Morocco will experience a lower number of infections (−5 to −20 %). For the 2050 scenario (Fig. 2d), the disease pressure is forecast to increase markedly in the Central and Eastern parts of Europe, whereas in France and Italy it is expected to decrease.

The large differences between the responses simulated for the two pathogens can be attributed to their very different thermal requirements, especially concerning the optimum conditions for infection development. P. recondita, which has an optimum temperature of 25 °C, will be markedly favored by warmer conditions, especially in the European countries characterized by a continental climate, whereas P. striiformis, which is more adapted to a temperate environment, will not clearly benefit from the temperature increase, except in the European countries with a colder climate. The reduced suitability of the climate for yellow rust under increased temperatures is more pronounced in the Mediterranean countries for the 2050 time frame, whereas the decrease in the number of potential infections simulated for 2030 in Russia and Germany can be due to the high suitability of the current climate for yellow rust development (Schröder and Gabriele 2001; Tian et al. 2004). In these areas, even a limited increase in air temperature can strongly affect the completion of the infection events. This confirms the results achieved by the ENDURE EU-FP6 project, in which expert judgments and experimental trials were carried out in eight European countries to investigate the impact of biotic constraints on wheat, indicating the highest average yield losses in Germany (−2.5 % in the period from 2003 to 2007).

The projections for P. recondita, which currently causes significant yield losses in Europe (ranging from 3 % in Germany to 15 % in France; ENDURE EU-FP6 project; Jørgensen et al. 2010), indicate that this pathogen could become even more dangerous in the coming years, whereas P. striiformis pressure is expected to remain unchanged or even to decrease.

3.2 Rice

Results of the simulations carried out for Bipolaris oryzae (brown spot) in the 2030 time frame (Fig. 3a) indicate a marked increase in the number of potential infection events compared to the baseline in almost all the European rice districts, and especially in the Italian one (+100 %). The only exceptions are the Hungarian district, for which unchanged disease pressure was simulated, and some scattered areas of Portugal, which showed a decreased number of infections. For the 2050 time frame (Fig. 3b), simulations depicted even more favorable conditions for the pathogen. The only rice districts with a substantially unvaried number of infection events are located in Spain.

Fig. 3 Differences in the number of potential infection events simulated in the A1B climate scenario compared to the 1993–2007 reference scenario (%); B. oryzae in the 2030 (a) and 2050 (b) time frames; P. oryzae in the 2030 (c) and 2050 (d) time frames. The number of infection events increases for both pathogens, especially the former

According to the simulations performed for Pyricularia oryzae (blast disease) for the 2030 time frame (Fig. 3c), the increase in disease pressure will be lower than that depicted for the other rice pathogen, although a homogeneous increase in the number of infection events is predicted throughout Europe (+5 to +20 %), in some cases even above 20 % (Italy and Hungary). As for B. oryzae, projections for 2050 (Fig. 3d) show rising disease pressure across the whole of Europe, even if the increase in the number of infections remains below 100 %.

Both pathogens are expected to intensify their pressure on the crop, although with different severity. The greater intensification of the number of potential infections simulated for B. oryzae is explained by the different requirements of the two tropical pathogens in terms of air temperature and leaf wetness duration. According to the data available in the literature, B. oryzae has higher optimum (27.5 °C) and maximum (35 °C) temperatures for infection development than P. oryzae (25 and 32 °C, respectively). This implies that the former would encounter more favorable conditions, especially in the 2050 time frame. As regards their leaf wetness duration requirements, B. oryzae needs a longer time to fulfill the infection requirements (10 versus 4 h), whereas it is less sensitive than P. oryzae to a dry interruption event (13 versus 4 h).

These results suggest that B. oryzae could replace P. oryzae as the most dangerous rice pathogen in Europe. Indeed, in Italy, the largest rice-producing country in the European Union, P. oryzae is currently the pathogen with the greatest impact, requiring chemical control on 75 % of the Italian rice acreage (Food Chain Evaluation Consortium 2011). The simulated intensification of brown spot pressure is in agreement with Moletti et al. (2011), who underlined that the disease has recently been increasing in Italy, also because its development is favored by the Akiochi nutritional disorder, which is in turn promoted by rising temperatures.

3.3 Grape

In general, the simulations performed with Plasmopara viticola (downy mildew) in the 2030 time frame (Fig. 4a) clearly indicate a slightly increased disease pressure all over Europe (+5 to +20 %), with the only exceptions being some areas in Spain, Germany, France, and Italy, where the number of infection events remains at the level of the reference scenario. According to the simulations carried out for 2050 (Fig. 4b), the number of events is expected to remain unchanged with respect to that simulated for 2030 in most of the European grape-producing areas, with a small increase only in Northern Spain. In other zones in which grape is intensively cultivated (France, Southern Italy, and Portugal), the simulated disease pressure is expected to remain at the current levels. Results for Botrytis cinerea (bunch rot) showed a very heterogeneous geographical pattern, with a high increase in the number of infections (+70 %) in the 2030 time frame (Fig. 4c) along the coastal parts of the Mediterranean countries and in the French Aquitaine region. Other parts of Europe (e.g., Tuscany, most of the Spanish regions, and the Balkans) are forecast to maintain a level of pressure similar to the current one, whereas scattered areas in Hungary, Romania, France, and Portugal will likely experience a decrease in the number of potential infection events, even above 100 %. For the 2050 time frame (Fig. 4d), a similar scenario is depicted, with a slight general intensification in the number of events in the Balkans and in Portugal. As already discussed for the 2030 time frame, large areas of Europe will experience unvaried disease pressure (e.g., Spain, Central Italy, and Greece), whereas in other regions the number of infections is expected to decrease compared to the baseline (e.g., some areas of Germany and France).

Fig. 4 Differences in the number of potential infection events simulated in the A1B climate scenario compared to the 1993–2007 reference scenario (%); P. viticola in the 2030 (a) and 2050 (b) time frames shows a general slight increase; B. cinerea in the 2030 (c) and 2050 (d) time frames presents heterogeneous geographical patterns

The responses of the two grape pathogens to climate change are very different. Simulation results showed an overall stationary disease pressure of downy mildew for the 2030 time frame, probably due to the broad range of temperatures in which the pathogen can develop (i.e., T min = 1 °C; T max = 30 °C) and to its low requirements in terms of leaf wetness duration (two wet hours are enough for the fulfillment of an infection event). The different situation depicted for B. cinerea is probably due to the narrower range of temperatures suitable for its development (i.e., T min = 10 °C; T max = 35 °C), with an optimum temperature of 20 °C, combined with the minimum wetness duration requirement fixed at 4 h. In the coastal areas of the European Mediterranean countries, characterized by a warmer climate, the generated scenario produced more humid conditions, which in turn led to environmental conditions more suitable for pathogen development. In the areas of Europe in which the number of infections remained unvaried or decreased, the lower number of leaf wetness hours in the climate scenario considered and the increase in air temperatures above the optimum for infection of B. cinerea played a major role. Simulations suggest that this pathogen, which often imposes price penalties even if only 3–5 % of the grapes are affected (Hill et al. 2010) because of the associated decline in both grape yield and wine quality (Nair and Hill 1992), will continue to be a relevant problem for European farmers, although with different dynamics. This could also lead to the need to modify chemical control management in some areas by introducing new active ingredients, given the great ability of this pathogen to quickly adapt to new chemicals by developing resistant strains (Rosslenbroich and Stuebler 2000).

4 Conclusions

Assessing the dynamics of disease pressure under climate change scenarios is crucial to understanding the challenges farmers will face in the coming years, and in the most critical cases it can even lead to re-evaluating whether a crop should be cultivated in a specific area. The results of this study indicate a general increase in infection events for most of the pathogens considered, in particular for P. recondita and B. oryzae. For the other pathogens, projections are more heterogeneous in both space and time. On the whole, moving from the 2030 to the 2050 time frame, an increase in the number of potential infection events is expected. These results are in line with the available studies indicating an increase of plant fungal infections under warmer conditions (e.g., Dale et al. 2001; Harvell et al. 2002) and with the heterogeneity of pathogen responses observed in experimental trials (Mitchell et al. 2003; Roy et al. 2004). This study thus represents a first attempt to provide quantitative estimates of such dynamics, via the application of a simple model based on parameters with a clear physiological meaning. Policy makers can use the outcomes of this study to become aware of possible future challenges in terms of disease pressure, and consequently of chemical control, when planning regional or local policies. Researchers could also be interested in refining the model parameterization by testing different ecotypes of the simulated pathogens, or even different pathogens.

The limits of this study can be summarized as follows: the lack of consideration (1) of the evolutionary potential of the pathogen populations, which can lead to better adapted pathotypes, and (2) of the tight interactions between the pathogens and the host plants during epidemic development, since this study focuses on the infection phase. This also means that other key aspects that can determine the establishment of an epidemic were not considered, such as the timing of pathogen movement into and away from the infected areas and the local amount of overwintering inoculum, which should be analyzed to give an integrated assessment of plant disease impacts under climate change scenarios. However, the simulation of these aspects would require specific models and detailed information on the epidemiology of the pathogens; including these processes in this study would therefore have added a layer of complexity that could obscure the original message. Other limits are the inherent uncertainty related (3) to the model parameterization and (4) to the reliability of the air humidity estimates, which carry an uncertainty linked to the weather scenario: realizations of the same emission scenario may differ substantially in the spatial patterns of precipitation while showing a similar air temperature increase.

This exploratory study showed that the response of plant pathogens to climate scenarios can differ across Europe and requires a case-by-case evaluation. A future development of this work will be the implementation of the potential infection model in a modeling solution capable of evaluating the whole crop-disease interaction. This will provide an operational tool to be coupled with cropping system models, improving the evaluation of the impact of climate change scenarios on crop production levels. Such a modeling solution would be a clear step forward with respect to traditional studies, which do not dynamically consider biotic constraints to production, and would also enable more articulated adaptation studies.