Introduction

Plants represent the main source of energy and nutrients for pathogens and herbivores, and they compete with other plants for light, nutrients and water. Both biotic and abiotic conditions change in time and space, and most plant enemies are mobile, but plants lack the rapid movements that would allow them to escape from their enemies or to forage for more favourable conditions. How can plants cope with the threats they face in a changing environment? The answer is simple: they change themselves.

Phenotypic plasticity is the ability of an organism to express different phenotypes in response to changing environmental conditions (Sultan 2000; Agrawal 2001; Van Buskirk and Steiner 2009). An environmental factor that is particularly difficult to anticipate is the presence of enemies. Plants have, thus, evolved two major defence strategies to cope with this threat: resistance and tolerance. Resistance refers to all plant traits that reduce encounters with the enemy of the plant, whereas tolerance describes the capacity of a plant to minimise fitness losses without reducing encounter rates (Núñez-Farfán et al. 2007; Oliver et al. 2009).

Within the concept of resistance, plants have two general options: direct and indirect defences. Well-known examples of traits that directly affect the enemy comprise thorns, spines, trichomes, toxic or repellent secondary compounds, pathogenesis-related proteins and compounds that interfere with the development of the enemy. Alternatively, plants can make use of tritrophic interactions and produce traits such as extrafloral nectar (EFN) and volatile organic compounds (VOCs), which enhance the presence of carnivores, thereby reducing herbivore pressure and ultimately providing an ‘indirect defence’ (Price et al. 1980; Heil 2008).

Most of the underlying traits exhibit a certain level of phenotypic plasticity, because they are not expressed constitutively but in response to a first encounter with herbivores or pathogens (Karban and Baldwin 1997). Local infection or infestation likely indicates future attack on the rest of the plant, and most resistance traits are, therefore, induced systemically, in as yet unaffected organs. However, a plant-wide resistance that is inevitably expressed after every local attack comes at the risk of investing in a defence that is not required. This problem is reduced by a second layer of phenotypic plasticity in the alarm responses of plants: priming. Primed tissues show no enhanced level of phenotypic resistance, but they express resistance faster and more strongly once they are infected or infested (Goellner and Conrath 2008). Similarly, tolerance normally requires the re-allocation of resources and other general adjustments in the plant’s morphological and biochemical phenotype, which are induced in response to herbivore attack. Tolerance does, thus, also rely to a large degree on phenotypic plasticity.

The present review is focused on plastic responses of plants to encounters with their enemies and discusses the resulting ecological and evolutionary benefits. Although phenotypic plasticity also comprises responses to changes in the abiotic environment, I focus on inducible traits, which enhance their expression level in response to an attack by a biological enemy. I discuss the most recent developments in this field and point out which questions remain open and will require more attention in the future.

Direct defences

Anatomical and morphological traits that affect the resistance to herbivores and that are induced in response to herbivory are, for example, trichomes (Agrawal 1998; Traw and Bergelson 2003; Björkmann et al. 2008), spines and thorns (Gowda 1997), leaf thickness and/or cell wall content (Gomez et al. 2008), extrafloral nectaries (Mondor and Addicott 2003; Pulice and Packer 2008) and even climbing by lianas, which helps these plants to reach spaces with lower enemy pressure (González-Teuber and Gianoli 2008). Examples of chemical resistance traits that respond to enemy attack comprise traumatic resin accumulation in conifer stems (Miller et al. 2005) and secondary compounds such as alkaloids (Baldwin 1988), glucosinolates (Agrawal 1998), terpenoids (Gershenzon and Dudareva 2007), or phenolic compounds (Ayres et al. 1997; Shirley 1998). Plants can also harm herbivores by producing proteinase inhibitors, which reduce protein digestion by the insect (Green and Ryan 1972; Schilmiller and Howe 2005), or fend them off with repellent odours (Bernasconi et al. 1998; de Moraes et al. 2001; Kessler and Baldwin 2001). Likewise, phytoalexins (Hammerschmidt 1999) and pathogenesis-related (PR) proteins (Van Loon et al. 2006) underlie induced plant resistance to pathogens (Heidel and Dong 2006; Traw et al. 2007).

Inducing such direct resistance traits can represent a major alteration of the phenotype of a plant. Total alkaloid content in damaged wild tobacco plants was more than five times higher than in undamaged controls (Baldwin 1988), thorn numbers of Acacia plants differed by 25% between plots with and without herbivory by large mammals (Huntzinger et al. 2004), insect feeding increased trichome numbers on Lepidium virginicum by ca 60% (Agrawal 2000b), induced young leaves of damaged willows had three times as many trichomes as controls (Björkmann et al. 2008) and the amounts of proteinase inhibitors can increase tenfold in response to induction (Brown 1988). Changes can also be of a qualitative rather than quantitative nature. For example, caterpillar feeding increased the chemical diversity of glucosinolates in L. virginicum by ca. 50%, although the overall amount remained stable (Agrawal 2000b). The activities of the PR-enzymes peroxidase and chitinase in induced plants were more than three times higher than in controls (Dietrich et al. 2004), phytoalexin content can increase in response to infection from undetectable amounts to more than 10% of the total dry weight (Ebel 1986) and up to 10% of the soluble protein fraction in cultured tobacco leaf tissue consisted of a single PR-protein (Felix and Meins 1985), although most PR-proteins are virtually absent from healthy plants (Van Loon et al. 2006). Since induced resistance requires a shift from primary to secondary metabolism (Schwachtje and Baldwin 2008; Heil and Walters 2009), the induction process might alter the expression of hundreds or even thousands of genes (Maleck et al. 2000; Hermsmeier et al. 2001; Cheong et al. 2002; Kessler and Baldwin 2002). In summary, resistance expression significantly alters a plant’s transcriptome and, thus, its morphological, chemical and metabolic phenotype.

Indirect defences

Whereas direct resistance can act against both herbivores and pathogens, indirect defences have as yet mainly been described in the context of plant-herbivore interactions. Plants damaged by herbivores increase the emission of VOCs, which purportedly signal the presence of prey to predators or parasitoids, or they increase the secretion of EFN, which attracts ants and other carnivores to the plant and thereby enhances predation pressure on herbivores (for reviews see Dicke 1999; Tumlinson et al. 1999; Heil 2008).

The herbivory-induced release of VOCs has been described for species representing a broad spectrum of life histories and belonging to unrelated families (Gershenzon and Dudareva 2007; Unsicker et al. 2009; Heil and Karban 2010). It is therefore most likely that the induction of VOCs after herbivore feeding is taxonomically and ecologically very common. EFN is known from more than 300 genera of plants and its defensive effect has been demonstrated in hundreds of field studies (Heil 2008; Chamberlain and Holland 2009). EFN secretion usually represents an induced response to leaf damage (see Heil 2008, 2009 and references cited therein). The only exception reported so far concerns Mesoamerican Acacia plants, which secrete EFN constitutively because they are engaged in obligate, symbiotic mutualisms with defending ants (Heil et al. 2004). Such so-called myrmecophytes (obligate ant-plants) provide ants with nesting space and food resources and gain constitutive inhabitation by defending ants (Heil 2008). Obligate myrmecophytes have evolved independently in all tropical regions and appear to represent the only case of a constitutive indirect defence. However, their evolutionary stabilisation also requires a certain level of phenotypic plasticity on the side of both plant and associated ant (see below).

Changes as dramatic as those described above for direct defences also apply to the expression of indirect defences. Most VOCs are produced de novo after herbivore damage (Pichersky and Gershenzon 2002; Gershenzon and Dudareva 2007), which thus completely changes the olfactory phenotype of a plant and, insofar as the animal species concerned respond to plant volatiles, the fauna on and around the induced plant. For example, patches of tomato plants that were experimentally damaged using Spodoptera littoralis caterpillars to induce their direct and indirect defences contained significantly more carnivorous lady beetles and fewer herbivorous flea beetles than control patches (Rodríguez-Saona and Thaler 2005). Similarly, the numbers of ants present on the euphorbiaceous species Macaranga tanarius increased almost three-fold in response to the induction of EFN secretion, while herbivore numbers decreased ca. ten-fold on the same plants (Heil et al. 2001). A further level of phenotypic plasticity results, thus, from the inherent interactions among plants and animals. In fact, plants attract or repel large groups of animals and thus actively control (parts of) the fauna that is present on them or in their immediate surroundings. I suggest that the fauna that is under control of the plant should be regarded as a component of its ‘extended phenotype’.

Both parasitoids and ants have high capacities of associative learning. For example, parasitoids quickly learn to associate odours with food supply or oviposition success (Turlings et al. 1990; Petitt et al. 1992; Stapel et al. 1997) and also use landmarks for host localisation (Van Nouhuys and Kaartinen 2008). Ants chemically mark trails that lead to attractive food sources or quickly recruit their nestmates (Hölldobler and Wilson 1990). The carnivores involved in indirect defence of plants are phenotypically plastic themselves and can intensify or cease their response, depending on the benefits obtained. Plants have the capacity to make use of this plasticity of the animals for their own purposes. In consequence, the system consisting of interacting and phenotypically plastic plants and animals exhibits a high degree of flexibility in its responses to environmental changes. Even the induction state of the surrounding vegetation affected numbers of herbivorous and carnivorous animals on induced tomato plants (Rodríguez-Saona and Thaler 2005).

Indirect defences might have evolved as induced traits mainly to enhance the reliability of the underlying signals from the point of view of the animal (see below). As long as the phenotypic plasticity on both sides has a genetic basis, these systems are subject to co-evolutionary processes. For example, mutualisms are assumed to be evolutionarily destabilised by non-reciprocating ‘parasites’ or ‘cheaters’ (Bronstein 2001) but reciprocal and phenotypically plastic changes can allow an adjustment of rewards and services (Agrawal 2001). Iterated interactions among phenotypically plastic partners can stabilise mutualisms (Agrawal 2001; Heil et al. 2009) and, thus, indirect defence strategies.

Tolerance and (over)compensation

Even functioning defence mechanisms do not completely prevent damage. The survival and fitness of plants depend, thus, also on the capacity to overcome the effects of tissue removal (Juenger and Lennartsson 2000). Tolerance is the ability to maintain fitness in the presence of stress by minimizing the decline in fitness from the level achieved in a favourable environment to that produced under stressful conditions (Simms 2000; Núñez-Farfán et al. 2007; Oliver et al. 2009). Even overcompensation (that is, increased fitness of damaged as compared with undamaged plants) has been reported, leading to the assumption of ‘plant-herbivore mutualisms’ (Agrawal 2000a).

Because tolerance, by definition, does not affect encounters with the enemies of the plant (but see Utsumi et al. 2009 for a recent study that challenges this view) and should, therefore, have no direct effect on their fitness, tolerance is not likely to be subject to coevolution between plants and their enemies (Espinosa and Fornoni 2006). The underlying mechanisms comprise increases in photosynthetic capacity, growth rates and nutrient uptake by roots, mobilisation of stored reserves, activation of dormant meristems and related changes in allocation patterns, altered flowering phenology and so on (see Núñez-Farfán et al. 2007 for a review). Altered branching patterns have also been reported in response to bacterial infections of Arabidopsis (Korves and Bergelson 2003), and rates of photosynthesis increased in uninfected leaves of bean plants that were infected by rust fungi (Murray and Walters 1992). Unfortunately, little more is known about the mechanisms that allow plants to tolerate pathogen infection.

Evidence for and against compensation and the significance of certain tolerance mechanisms is mixed. Several studies reported an increased photosynthetic capacity, or rate, in leaves that had been re-grown after herbivore feeding (Heichel and Turner 1983; Morrison and Reekie 1995; Johnston et al. 2007) but others found no or negative effects of leaf damage on photosynthesis (Aldea et al. 2006; Nabity et al. 2006; Reddall et al. 2007). The latter observations are in line with the general negative effects of the wound hormone, jasmonic acid, on the expression of genes that are related to the primary metabolism, including photosynthesis (Creelman and Mullet 1997).

Resistance and tolerance have often been predicted to be subject to evolved trade-offs. Both allocation costs (Pilson 2000) and genetic costs (Simms and Triplett 1994) of tolerance have been reported, and tolerance appears more important for less resistant genotypes than for highly resistant ones and vice versa. Imagine a completely tolerant genotype that is not affected at all by enemies: for this genotype, no investment in resistance would make sense (Núñez-Farfán et al. 2007 and references cited therein). Interestingly, these trade-offs were found in some systems (Fineblum and Rausher 1995; Mauricio et al. 1997; Strauss and Agrawal 1999; Boege 2005) but not others (Agrawal et al. 1999; Cipollini 2007; see Núñez-Farfán et al. 2007 for additional references) and may in fact even be subject to environmental influences. In a transplantation experiment between two geographically separated populations of Datura stramonium, Fornoni et al. (2003) found negative correlations between the two traits in one native population and in plants from the second population that had been transplanted to the site of the first population. By contrast, in plants from the second population at its native site no genetic correlation occurred between resistance and tolerance, whereas plants from the first population grown at the second site even exhibited a positive genetic correlation between the two traits (Fornoni et al. 2003).

Plant growth rates and fitness after leaf damage range from severely reduced to overcompensating, studies on the underlying physiological mechanisms have revealed contradictory results, and the evidence for and against a trade-off between resistance and tolerance is mixed as well. What causes these contradictory observations? A central difficulty arises from the quantification of tolerance, which applies when the effect of a certain level of stress is less drastic than what would be expected without the tolerance mechanism. How can the ‘expected’ effect size be determined? Whereas overcompensation is obvious as soon as herbivory or infection increases rather than decreases plant fitness, tolerance can only be quantified in a comparative manner, that is, by comparing the supposedly tolerant genotype with other, supposedly less tolerant plant genotypes, or with the same genotype under different conditions. Although attempts have been made to quantify tolerance more directly, the most common estimate of tolerance is the slope of a regression between fitness and damage, as long as a linear relation between the two factors can be assumed: then, a horizontal line (slope = 0) indicates complete tolerance and a steeper slope indicates little or no tolerance (Fornoni and Núñez-Farfán 2003). However, the relation between fitness and damage is likely not linear in most cases, and even in the few cases in which a linear relation can be assumed, data from many genotypes are required. Tolerance remains, thus, an essentially comparative value that is difficult to express in absolute numbers.
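The regression estimate described above can be sketched in a few lines of code. The genotype labels and fitness-damage values below are purely hypothetical and serve only to illustrate how, under the assumption of a linear fitness-damage relation, the slope separates a tolerant from an intolerant genotype:

```python
def tolerance_slope(damage, fitness):
    """Estimate tolerance as the slope of an ordinary least-squares
    regression of fitness on damage (cf. Fornoni and Nunez-Farfan 2003).
    A slope near zero indicates (nearly) complete tolerance; a strongly
    negative slope indicates little tolerance. Assumes a linear
    fitness-damage relation, which rarely holds exactly in nature."""
    n = len(damage)
    mean_d = sum(damage) / n
    mean_f = sum(fitness) / n
    # covariance and variance terms of the least-squares slope
    s_df = sum((d - mean_d) * (f - mean_f) for d, f in zip(damage, fitness))
    s_dd = sum((d - mean_d) ** 2 for d in damage)
    return s_df / s_dd

# Hypothetical data: seed set of two genotypes at increasing
# proportions of leaf area removed.
damage_levels = [0.0, 0.2, 0.4, 0.6]
tolerant = tolerance_slope(damage_levels, [100, 99, 101, 98])    # near-flat
intolerant = tolerance_slope(damage_levels, [100, 80, 61, 39])   # steep loss
```

Note that the resulting slope is only meaningful relative to other genotypes measured under the same conditions, which echoes the point made above that tolerance is an essentially comparative quantity.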

Another difficulty is that most studies employed clipping treatments rather than live herbivores. Only two out of more than 20 studies reviewed by Strauss and Agrawal (1999) did not rely on simulated herbivory, and little has changed since then. Plant responses to clipping are, however, not likely to adequately represent how the plant behaves after natural herbivory. First, clipping is usually applied only once, whereas herbivory is normally a continuous process whose intensity increases with the successful development of the herbivore. Second, plants monitor general damage through ‘damaged-self recognition’, that is, the perception of chemical motifs that indicate a disintegrated plant cell (Heil 2009), while more specific herbivores are recognised via the perception of herbivore-derived elicitors or other herbivore-associated molecular patterns (Mithöfer and Boland 2008). When plants are clipped, few or none of these elicitors are set free and the plant is deprived of its major mechanism for the sensing of herbivory. Because tolerance mechanisms are mediated via signalling cascades similar to those that control resistance (Schwachtje et al. 2006) and respond to herbivore saliva (Zhang et al. 2007), clipping treatments are unlikely to elicit realistic plant responses.

Moreover, compensation or even overcompensation at the vegetative level does not necessarily increase fitness (Brody et al. 2007; Martinkova et al. 2008; Huhta et al. 2009) and negative effects on future reproduction represent a cost of compensation that is easily overlooked (Ruiz et al. 2006). Observations made solely during vegetative growth or over a single season are, thus, often not confirmed by studies that measure fitness more directly or over several reproductive phases.

Finally, the capacity of a plant to tolerate damage depends on its developmental stage and on the current environmental conditions (Wise and Abrahamson 2007). Ontogenetic stage appears particularly critical, as it affects the decision of whether and how to compensate for damage (Boege 2005), but abiotic factors also determine to which degree a plant can tolerate damage. The ‘limiting resource model’ considers (i) which resources are limiting plant fitness in the absence of herbivores and (ii) the effects of herbivory on the plant’s capacity to acquire and store limiting resources. This model makes good predictions concerning the capacity of a plant to tolerate herbivory under certain nutrient conditions. For example, growth responses of Scots pine (Pinus sylvestris L. nevadensis) to clipping depended on light availability and plant age (Hodar et al. 2008), whereas the capacity of Pimpinella saxifraga to tolerate flower herbivory or grazing was significantly impaired under resource-poor conditions (Huhta et al. 2009).

In summary, there is no doubt that plants to some degree can tolerate herbivory and that both specific anatomical traits and costly phenotypic plasticity underlie this capacity. Classical examples are grasses, which due to the basal localization of their meristems are well adapted to tolerate grazing, and the re-allocation of sugars to roots, which benefits future nutrient uptake and thus facilitates re-growth after herbivore damage (Schwachtje et al. 2006).

Tolerance has a genetic basis and therefore shows variability both within and among species (Núñez-Farfán et al. 2007). This genetic variability represents a prerequisite for the evolution of tolerance or any other type of phenotypic plasticity (Schlichting and Levin 1986). In fact, there is increasing evidence for herbivores exerting selective forces on tolerance (Strauss and Agrawal 1999; Juenger and Bergelson 2000; Agrawal 2005). Tolerance mechanisms and their effects on the evolution of plant-herbivore interactions need to be considered in order to understand how the phenotypic plasticity of a plant improves its survival rate and fitness in the presence of pathogens and herbivores (Roy and Kirchner 2000; Núñez-Farfán et al. 2007).

Preparing for future damage

Plants have evolved phenotypically plastic defences because enemy pressure is difficult to anticipate. Several cues might, however, allow plants to reliably pre-empt upcoming attack, and three phenomena indicate that plants can indeed make use of these cues: (i) resistance induction by egg deposition, (ii) plant-plant signalling and (iii) priming of resistance expression.

A concrete ‘sign of danger’ is the deposition of eggs onto the plant surface, since herbivorous larvae will most likely emerge from such eggs. ‘Early herbivore alert’ by insect egg deposition (Hilker and Meiners 2006) and resulting resistance induction has been described in trees such as elm (Ulmus minor) and pine (Pinus sylvestris), although the active compounds remain to be identified. By contrast, the resistance elicitors that are released from pea weevil oviposition fluid have been well characterised (Doss et al. 2000).

Which other cues can be used by plants to pre-empt encounters with their enemies? Attack on single plant organs usually means that other organs of the same plant and the neighbouring plants are in danger as well. Plants can, therefore, respond to volatile cues that are released from damaged neighbours. Although the first descriptions of ‘talking trees’ (Baldwin and Schultz 1983; Rhoades 1983) were heavily criticised, subsequent studies by many groups convincingly showed resistance expression to herbivores that is induced by airborne cues (Heil and Karban 2010 and references therein). Much less is known about induced resistance to pathogens in intact plants by volatiles released from infected neighbours. Air from virus-infected tobacco did, however, successfully elicit systemic acquired resistance (SAR) to infection in intact plants (Shulaev et al. 1997), exposure to VOCs such as trans-2-hexenal, cis-3-hexenal or cis-3-hexenol enhanced resistance of Arabidopsis against the fungal pathogen, Botrytis cinerea (Kishimoto et al. 2005), and volatiles released from lima bean expressing SAR induced pathogen resistance in neighbouring plants (Yi et al. 2009). These observations make it likely that plant-plant communication commonly affects pathogen resistance as well.

As mentioned above, local attack on a single plant organ normally means that the rest of the same plant is at higher risk as well. Most plant responses to local damage are, therefore, expressed systemically, in as yet undamaged organs. However, local responses may suffice to kill the attacking insects or pathogens, and herbivores may eventually leave the plant for other reasons (such as, for example, wind, the presence of mating partners on other plants, an arrival of enemies that is not mediated by the plant and so on). A plant-wide resistance that is fully expressed after every local attack comes at the risk of an investment in a defence that is not required. Plants therefore possess another layer in their phenotypically plastic alarm response: priming, which prepares tissues to respond more rapidly and/or effectively to subsequent attack (Goellner and Conrath 2008). Interestingly, several volatiles prime tissues for faster responses to herbivory or infection and thus might also mediate their cost-effective preparation for the upcoming attack. In fact, self-priming by herbivore-induced volatiles has been described in the context of airborne within-plant signalling for lima bean, poplar and blueberry (Frost et al. 2007; Heil and Silva Bueno 2007; Rodríguez-Saona et al. 2009) and volatiles released from infected lima beans primed the expression of pathogenesis-related gene 2 (PR-2) in intact leaves (Yi et al. 2009). These observations suggest a two-step regulatory system in which self-priming by airborne signals prepares the systemic tissues for a rapid response, whereas full activation of resistance requires confirmation by a vascular long-distance signal or arrival of the plant enemy (Heil and Ton 2008).

Evolutionary benefits of plastic defence expression

All the changes that I have discussed above require that a first, inducing damage happens. Induced resistance strategies are, however, hampered by the intrinsic problem of a time-lag between enemy encounter and the activation of a functioning resistance. The existence of physiological tolerance mechanisms also demonstrates that plants ‘could do better’. Why do plants wait for attack before they start to defend themselves?

The evolution of induced tolerance could be favoured when herbivore pressure is so common that no adaptation to an herbivore-free environment can take place. For the evolution of induced resistance, mainly three explanations have been presented: the ‘moving target’ theory, fitness costs and signal reliability. First, counter-adaptations by plant enemies will evolve more easily when they are exposed to a constant selection pressure exerted by a constitutive resistance trait. Inducible strategies, which add a significant level of variability to resistance expression at the population level, should thus reduce the danger that counter-resistant enemy genotypes evolve (Gardner and Agrawal 2002).

Second, monitoring enemy pressure and inducing resistance only when it is actually required appears to be a cost-saving strategy (Heil and Baldwin 2002), although being plastic per se can cause significant fitness costs (Van Buskirk and Steiner 2009). In fact, fitness benefits exerted by induced defences have been reported in the context of both resistance to herbivores (Agrawal 1998; Baldwin 1998; Heil 2004) and resistance to pathogens (Heidel and Dong 2006; Traw et al. 2007). Costs can result from the allocation of limited resources to resistance traits, and are then termed allocation costs (Heil and Baldwin 2002), or from negative effects on other interactions of the plant with its environment, and are then termed ecological costs (Heil 2002). Many studies have convincingly shown significant resistance costs. In consequence, the fitness of a plant under enemy-free conditions is likely lower when it expresses resistance as compared with a plant that does not express resistance (for reviews see, e.g., Heil and Baldwin 2002; Cipollini et al. 2003; Heil and Walters 2009).

A second, independent line of evidence for the relevance of allocation costs of resistance comes from the observation that plants can reduce resistance expression when being exposed to competitors. Because plants compete for light, water, space and nutrients, the successful anticipation of competition appears highly adaptive. Light passing through a canopy becomes enriched in far-red wavelengths. Plants perceiving these light cues ‘anticipate’ competition before they actually become shaded and tests with mutant cucumber plants demonstrated a trade-off between the shade-avoidance response and defence expression (McGuire and Agrawal 2005). Similarly, Nicotiana longiflora plants suppressed their resistance induction even under current attack when far-red light signalled the presence of competitors (Izaguirre et al. 2006). Plants use far-red sensing to monitor the presence of other plants and they are able to adjust their actual defensive efforts according to the presence of competitors.

While there is now overwhelming evidence for allocation costs of induced direct resistance, much less is known about indirect defences. Physiological costs of VOCs are assumed to be low (Hoballah et al. 2004) and allocation costs of EFN secretion have as yet not been studied, although the observation that fertilisation increased extrafloral nectary numbers of Vicia faba provides indirect evidence for allocation costs of EFN (Mondor et al. 2006). Particularly those resistance traits that include the exchange of information among plants and animals can, however, also be exploited by arthropods other than the mutualists and then cause ecological costs (Dicke 1999; Heil 2002). For example, herbivore-induced VOCs can be used by herbivores or parasitic plants as host-finding cues (see Heil and Karban 2010 for references) and Lepidoptera or other insects with herbivorous larvae can be attracted to EFN (Beach et al. 1985). However, I am not aware of a study that quantified the resulting fitness effects. In summary, reducing allocation costs by expressing resistance traits only when they are needed appears a convincing explanation for the phenotypic plasticity of direct resistance. By contrast, evidence for costs of indirect defences is fairly anecdotal.

Allocation costs do not appear to play a major role in the evolution of indirect defences, but most of them are expressed only after herbivore damage. Which further factors might have favoured this general trend? In the case of VOCs, signal reliability appears to be one important aspect. VOCs are not a resource per se but an advertisement of the presence of prey. Thus, the plant-predator mutualism would be evolutionarily unstable if plants attracted carnivores in the absence of herbivores. Parasitoids in particular learn quickly which signals to use for prey localisation (see above) and avoid unreliable signals. EFN, by contrast, contains water, carbohydrates and amino acids (González-Teuber and Heil 2009) and thus represents a valuable resource on its own. Still, most plants increase EFN secretion rates dramatically when they are damaged (Heil 2009 and references therein). Few insects live only on EFN, and most EFN visitors are predators that also search for insect prey. The reliable indication of inducing herbivores might, thus, be a factor that also favoured the evolution of EFN as an indirect defence.

Finally, as discussed above, reciprocal responses among phenotypically plastic partners appear a necessary prerequisite for the evolutionary stabilisation of mutualisms in general and, therewith, of indirect defence strategies (Agrawal 2001; Heil et al. 2009). Plants constitutively producing cues that signal the presence of herbivores to carnivores would ‘cheat’ their mutualists, while plants constitutively producing true rewards for defending animals without monitoring their defensive efficacy would be at high risk of being cheated themselves. The plastic expression of indirect defensive traits appears to be an evolutionary necessity.

Conclusions and open questions

Plants express phenotypic plasticity to cope with unpredictable changes in their environment. Resistance and tolerance are expressed only when required, and accompanying large-scale shifts in the primary metabolism help to minimize the resulting effects on future growth. The level of plasticity that underlies these responses is illustrated by (1) the complexity of the underlying signalling cascades, (2) the high number of genes of both the primary and the secondary metabolism that are up- and downregulated during a response to enemy attack, (3) the phenomenon of priming, that is, epigenetic processes which prepare plants for future resistance expression, and (4) the capacity of plants to suppress resistance expression when competition or resource limitation would make this response too costly. An even higher degree of flexibility is achieved by indirect defences, where phenotypically plastic reciprocal responses of plants and arthropods allow optimised adjustments of resource investment by the plant and of service provisioning by the animals. Particularly in this situation, the dramatic plasticity in the ‘extended phenotype’ of the plant (the proportion of the surrounding fauna that responds to plant traits) appears a prerequisite for the evolutionary stabilisation of the defensive interaction, which otherwise could become exploited. Tolerance adds a further level of plasticity, as it allows plants to reduce the negative effects of damage once it has happened.

Several central questions remain, however, open and require more scrutiny if we aim at a complete understanding of how plants use phenotypic plasticity to cope with enemy attack. (1) Most studies have as yet focused on either tolerance or resistance, although both phenomena likely interact at several levels. (2) Most studies investigated plant responses only under subsets of those conditions to which plants in nature have to respond simultaneously. Other studies employed artificial damage rather than herbivory, thereby depriving plants of most of the signals used to monitor natural damage. Plants use numerous internal and external cues to monitor encounters with their enemies and competitors, and they are usually challenged simultaneously by more than one enemy. Future studies will have to design more realistic setups that make it possible to study the—likely nonlinear—responses of plants to multiple threats and their consequences for the expression and evolution of phenotypic plasticity. (3) Most studies on indirect defence via tritrophic interactions have been conducted with highly simplified laboratory systems, which usually consisted of one species each of plant, herbivore and carnivore. Under natural conditions, VOCs and EFN are likely to affect dozens of animal species, which can be engaged in positive, neutral or negative interactions with the plant. Studies that are suitable to understand how far the induction of VOC emission or of EFN secretion can change this ‘extended phenotype’ of a plant are required to fully appreciate these changes and their potential effects on plant fitness. (4) Finally, although many physiological mechanisms of tolerance have been identified, the central question of why tolerance must be induced, although plants obviously ‘could do better’, remains unresolved. What are the hidden costs of tolerance? Do plants ‘pay’ for the underlying re-allocation with reduced future fitness or reduced capacities to express induced resistance at the same time? Answering these questions will require long-term experiments in which plants that express different levels of tolerance are investigated in the undamaged and the damaged stage over their whole lifetime, in order to realistically quantify the net fitness of the plants.

In summary, much has been learned about the mechanisms that underlie plant resistance to biological enemies. Similarly, several physiological mechanisms have been identified that lead to the phenomenon of tolerance. We have the tools to investigate these phenomena at the mechanistic level, and we now need to take these tools to the field and investigate to what degree these mechanisms affect plant survival, plant fitness and the interactions of plants with their competitors, enemies and mutualists under ecologically realistic conditions.