Introduction

In many countries, the number of non-native, invasive pathogen and insect species continues to increase, and these species can have significant negative effects on the health and biodiversity of native forest ecosystems, urban forests and forest plantations, which in turn can have large economic impacts (Lovett et al. 2016; Pimentel et al. 2000; Roy et al. 2014; Campbell and Schlarbaum 2014). Chestnut blight, Dutch elm disease, white pine blister rust, and emerald ash borer are examples of diseases and insects that have caused high mortality rates, leading to ecological and economic impacts in the United States. Although non-native pathogens and insects are particularly problematic for native tree species, there are also notable examples where native pathogens or insects are responsible for high mortality or damage (Alfaro et al. 2013; Cubbage et al. 2000; La 2009; Zhang et al. 2010). In addition, commercial plantations of fast-growing exotic tree species such as eucalyptus are now under increasing threat from pathogens and insects (Alfenas et al. 2012; Wingfield et al. 2008, 2013).

Once destructive non-native pathogens or insects have become successfully established, they can become permanent residents of native forest ecosystems. Utilizing naturally occurring genetic resistance provides a solution that fosters the continuous coevolution between the affected tree species and the damaging agent that is vital for long-term success. However, management often focuses on detailed monitoring of spread and on short-term solutions such as eradication or containment, with little early effort to examine and utilize the genetic resistance that may be present and that could offer a more permanent solution where these other methods are unsuccessful. Genetic resistance has the advantage of being a natural alternative to chemicals or other costly management methods that may have to be continuously repeated or may have detrimental side effects on the environment. Traditionally, resistance programs were developed primarily for tree species of commercial interest, but when the ecosystem benefits of trees and the associated services they provide in natural forests are threatened by insects and diseases, managers should consider such programs for non-commercial species as well.

Despite current high interest in developing genetic resistance in forestry, few programs have successfully translated this interest into applied resistance development in forest trees (Yanchuk and Allard 2009; FAO 2015; Sniezko 2006; Sniezko et al. 2012a; Telford et al. 2015). However, knowledge can be gleaned from the small number of genetic resistance programs that have had operational successes (Alfaro et al. 2013; Alfenas et al. 2012; Schmidt 2003; Sniezko 2006; Sniezko et al. 2012b, c, 2014). Applied resistance programs will be successful only if they incorporate four major components: (1) research, (2) tree improvement, (3) planting, and (4) sustained management commitment (Fig. 1), with corresponding support from both policy-makers and the public. The advent of new technologies, such as genomic resources, may pave the way to increased efficiency in implementing programs in new species when used in concert with applied breeding components. Presented here are some considerations and thoughts for organizations contemplating the development and/or implementation of a resistance program.

Fig. 1

Components necessary for development of a successful applied disease or insect resistance program in forestry

Necessary information: is there genetic resistance?

Forest health monitoring programs in many countries facilitate the detection of insect and disease epidemics at an early stage. This is the point where forest health professionals, managers and policymakers can begin to contemplate the full potential impact of a disease or insect into the future. Many early projections of a species' fate in an epidemic assume little or no genetic resistance. However, to fully understand the situation and consider all options, information on genetic resistance in the host species is critical. Investigating whether genetic resistance exists, what types of resistance are present, and how frequently resistance occurs in natural populations provides information on the potential full impact of the disease or insect epidemic over time, as well as on the probability of implementing a successful genetic resistance program to mitigate damage or restore damaged populations. Susceptibility/resistance of tree species to a pathogen or insect can encompass a wide range of responses, from extreme susceptibility through degrees of partial resistance to complete resistance. Each system will be somewhat different, and the types of resistance and their utility in the short term and long term can be weighed accordingly.

In native forests, monitoring as an epidemic progresses, or surveying once high mortality has occurred, often identifies a very low percentage of trees of the host species that survive infestation (or are less affected in some way) by a destructive pathogen or insect. These trees are good candidates, but the resistant phenotype needs to be confirmed through a combination of relatively quick short-term screening methods followed by longer-term field evaluations (Koch et al. 2012, 2015; Koch and Carey 2014; Sniezko et al. 2014). To develop such screens for host resistance to disease, it is important to understand the biology of the disease, including the role of any disease vectors or predisposing factors necessary for successful infection. Host resistance may be to the insects that vector the pathogen or whose feeding activities allow the pathogen access into the host tree, as is the case for trees that are resistant to beech bark disease (Koch et al. 2010). In the case of Dutch elm disease, resistance to the insect vector has been identified in some trees, while other trees have resistance to the fungal pathogen (Ghelardini and Santini 2009; Smalley and Guries 1993). Separate protocols are required to distinguish between these two phenotypes, and decisions would be needed on whether to incorporate one or both phenotypes into the breeding program. Pathogens can often be spread by multiple insect vectors, complicating breeding for insect resistance, and in such cases screening for pathogen resistance is favored (Gibbs 1978; Juzwik et al. 2016). Screening for resistance to fungal or bacterial pathogens typically involves the development of an artificial inoculation technique to transfer the pathogenic agent directly into the tree tissues targeted for infection. Screening techniques range from direct injection of fungal spores or bacterial colonies, to transfer of infected tissue from a susceptible tree to a test tree, to higher-throughput spore inoculations of foliage or roots of seedlings or rooted cuttings (e.g. Hansen et al. 2012; Sniezko et al. 2011). Protocols must be refined for each host/pathogen system.

Resistance to pathogens is often categorized as either complete (qualitative) or incomplete (quantitative), but the dichotomy between these is not always clear and much more detailed study is needed (Poland et al. 2009; Kovalchuk et al. 2013). Complete resistance is often the effect of a single major gene. It is relatively easy to screen for complete resistance to some fungal pathogens, such as Cronartium ribicola (J.C. Fisch. in Rabh), the cause of white pine blister rust, because screening can often be done on very young seedlings and the ratio of canker-free (or surviving) individuals is often moderate to high relative to the susceptible controls (e.g. Kinloch et al. 2003; Sniezko et al. 2012b). Unfortunately, single-gene resistance can often be overcome by pathogens capable of evolving rapidly (Dowkiw et al. 2012; Kinloch et al. 2004; McDonald and Linde 2002). Quantitative resistance is the result of the actions of many different genes, and is therefore less likely to be overcome by evolution of the pathogen. The presence of quantitative resistance is sometimes masked when major gene resistance is also present, but some programs have successfully modified seedling screening assays to enable identification of the full range of resistance phenotypes. Such screening trials can provide clues about the variety of resistance responses present in a population and their potential inheritance (e.g. Sniezko et al. 2014). While resistance prevents infection or allows plants to limit pathogen growth and development, plants can also survive disease through tolerance of the damage caused by infection, without impacting the pathogen (Miller et al. 2005; Horns and Hood 2012). Breeding for tolerance might also be considered (Schafer 1971), but trade-offs, such as an increased incidence of the pathogen in the population, need to be considered (Robb 2007).
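As a minimal Mendelian illustration (hypothetical, and not specific to any one pathosystem), the expected fraction of canker-free seedlings under single dominant-gene resistance follows directly from the parental genotypes, which is why moderate-to-high survival ratios in young seedling screens are consistent with major gene resistance:

```latex
% Expected canker-free fraction under a single dominant resistance allele R
% (assumes a heterozygous resistant seed parent and no escapes or virulent races)
\[
Rr \times rr \;\rightarrow\; \tfrac{1}{2}\,Rr : \tfrac{1}{2}\,rr
\;\Rightarrow\; \text{about } 50\% \text{ canker-free}
\]
\[
Rr \times Rr \;\rightarrow\; \tfrac{1}{4}\,RR : \tfrac{1}{2}\,Rr : \tfrac{1}{4}\,rr
\;\Rightarrow\; \text{about } 75\% \text{ canker-free}
\]
```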

Development of screening methods to identify host plant resistance to insects requires understanding the biology of the various phases of the host-insect interaction, as resistance can be manifested in a variety of different phenotypic traits that act at different points throughout the insect life cycle. These are typically broken down into (1) antixenotic traits, which deter or repel insect herbivory and/or oviposition, (2) antibiotic traits, which reduce insect survival, fitness, and development, and (3) tolerance, which refers to the ability of the plant to withstand or recover from insect damage (relative to susceptible plants) without negative impact on the insect (Smith and Clement 2012). Evidence of multiple types of host resistance traits has been reported in green ash trees that have survived in natural stands under long-term attack by emerald ash borer (Koch et al. 2015). Regardless of whether a pathogen or insect problem is being addressed, breeding strategies that incorporate all types of available resistance should be considered to ensure the development of durable resistance.

In the field, trees that appear to be phenotypically resistant may in fact be susceptible, and are simply ‘escapes’ due to stochastic environmental processes or random chance. Once screening methods are developed, controlled inoculations (or infestations) can be used to confirm resistance of candidate trees in natural forests through direct field assay (inoculation/infestation) of the mature tree or through testing of clonally propagated replicates of, or the seedling progeny of, candidate resistant trees. Screening seedlings can provide additional information on the type of resistance present and the mode of inheritance. Such screening methods can be used to search for sources of resistance in genetic field trials previously established for other purposes (e.g. tree improvement trials to examine variation in growth and adaptability). Although such field trials can be very useful, they are also limited by the range of genetic material, number of locations and other factors (Boshier and Buggs 2015). Systematic seedling screening, such as that used for major gene resistance (MGR) to white pine blister rust in Pinus monticola Dougl. ex D. Don (Kinloch et al. 2003), can be used to assess the frequency of resistance in the surviving trees or it can be used proactively to detect resistance and assess its frequency over the species range, ahead of the arrival of the disease agent (Schoettle and Sniezko 2007).

When short-term seedling trials indicate the presence of sufficient genetic resistance in the parent trees, seed collected directly from these trees can sometimes be immediately useful without further breeding (Fig. 2). The level of resistance of seedlings will depend on the type of resistance and its inheritance from the parent trees. If resistance is from one or two major genes of large effect, seed from parents is likely to have significant levels of resistance (Kinloch et al. 2003; Koch and Carey 2005; Sniezko et al. 2012b, 2014). However, in other cases where there is partial resistance, or the nature of the resistance is more uncertain, the level of resistance may or may not be high enough to use immediately. In these cases, breeding and advanced-generation seed orchards may be required to achieve the desired level and frequency of resistant progeny. Genetic tests, typically done by assessing the performance (resistance) of either open-pollinated progeny or progeny generated through controlled crosses using a specific mating design, can be used to assess the breeding value of the parents and the amount of genetic gain (Zobel and Talbert 2003).
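For orientation, the expected response to such selection is usually summarized with the standard quantitative-genetics expressions below; these are textbook relationships (not results from any particular program), with open-pollinated families treated approximately as half-sib families:

```latex
% Breeder's equation: expected genetic gain per cycle of selection
% i = selection intensity, h^2 = narrow-sense heritability,
% sigma_P = phenotypic standard deviation of the resistance score
\[
\Delta G = i\,h^{2}\,\sigma_{P}
\]
% Approximate breeding value of a parent from its open-pollinated (half-sib)
% family mean, expressed as a deviation from the trial mean
\[
\widehat{BV}_{\text{parent}} \approx 2\,(\bar{x}_{\text{family}} - \bar{x}_{\text{trial}})
\]
```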

Fig. 2

From resistance testing to restoration planting of whitebark pine (Pinus albicaulis Engelm.) in 5 years. a Example of large variation among 12 seedling families (in 10-tree row plots) 23 months post-inoculation in resistance to white pine blister rust, from a seedling trial at Dorena Genetic Resource Center, Oregon, U.S.A. Two-year-old seedlings were inoculated with the blister rust fungus (Cronartium ribicola) in September 2013 and will be followed for up to 5 years to evaluate the level and types of resistance. b Whitebark pine restoration planting at Crater Lake National Park, Oregon, photographed in 2016 (planted in 2009), using results from earlier seedling inoculation trials

The term ‘durable resistance’ was originally defined for crop cultivars and was by its nature a retrospective assessment: ‘durable resistance to a disease is resistance that remains effective during its prolonged and widespread use in an environment favorable to the disease’ (Johnson 1984). Trees are perennial, long-lived organisms, and thus durable resistance is needed to ensure the trees survive long enough to meet ecologic goals in natural forests, and economic or amenity goals in plantations or urban forests. When the goal is protecting or restoring forest ecosystems, resistant trees need to last for hundreds of years or more. Some types of resistance, e.g. MGR, are generally thought to be less durable than quantitative resistance; however, every pathosystem is different (McDonald and Linde 2002) and both types of resistance can potentially be useful.

Seedling or clonal inoculation trials can be a relatively fast method to examine a large number of genotypes for resistance, but they need to be followed up with long-term field trials to confirm the efficacy and durability of resistance under various field environments over time. Some longer term disease resistance field trials show encouraging results (Figs. 3, 4), but more extensive trials over longer time periods are needed to fully ascertain what types of resistances are durable in each case. Parent trees confirmed as resistant in seedling trials can also be used as sentinels to monitor changes in the efficacy of resistance.

Fig. 3

The proportion of western white pine (P. monticola) trees from 12 families with stem symptoms over time in a field trial in western Oregon. Family 4 is a susceptible control; Families 11 and 12 have major gene resistance, but a virulent race of the white pine blister rust pathogen that overcomes this resistance is known to be present in this area (graph from Sniezko et al. 2012c)

Fig. 4

Durability of resistance (survival) to Phytophthora lateralis over time for 16 Port-Orford-cedar families at Foggy Eden trial in western Oregon (graph adapted from Sniezko et al. 2012d)

Forest tree species often grow over a wide geographic range that can include elevational spans of >2000 m, and thus experience widely different temperature and moisture gradients. Populations of trees developed for genetic resistance therefore also need to retain the adaptive traits required by the species to thrive across these vastly different ranges and environments (e.g. growth rate, drought and cold hardiness). This requires preservation of an adequate amount of genetic diversity in the production population and the absence of negative correlations between resistance and other adaptive traits. Because genetic resistance to many of the damaging non-native insects and pathogens often occurs at very low frequency, hundreds or thousands of parent trees (or their progenies) might have to be evaluated to find enough resistant trees to maintain adequate genetic diversity. In addition, the geographic range of the species may need to be divided into breeding zones to produce trees capable of surviving local environmental conditions while retaining stable resistance across varied environments (e.g. Sniezko et al. 2012b; White et al. 2007). The number of resistant parents needed will be influenced by the program objectives, the species, and the cycles of breeding anticipated to be necessary for achieving a suitable level of genetic resistance. In forest trees, usually 100–1000 trees are selected per breeding zone for each initial breeding population and generally at least 20–50 trees for the production population (White et al. 2007).

Disease hazard can also vary across geographic ranges and environments, and at some sites the hazard for a disease such as white pine blister rust can be so extreme that it would be difficult to successfully establish the species, especially in the early generations of breeding. Partial or quantitative resistance, although considered the best candidate for durable resistance, can mean that small, young seedlings receive dozens or even hundreds of infections when planted on extreme-hazard sites, and the small size of a seedling may make it more likely to die than larger trees with the same partial resistance. Defining site hazard will aid in developing reforestation or restoration strategies with resistant seedlings or clonal materials.

For many important forest tree species, common garden studies, including long-term field progeny or provenance trials, have been used in tree improvement programs to help delineate breeding zones and develop planting guidelines for adaptation. These existing (or new) trials can serve multiple roles, by allowing examination of inherent differences in disease development among populations in current conditions and monitoring changes over time and space. Such temporal data can provide information to validate site hazard (for pathogens) and insights that one-time surveys of different areas with differing genetic backgrounds will not.

Research and monitoring of long-term field trials and restoration plantings are required to provide data to confirm the durability and stability of resistance as well as the adaptability of resistant populations. Such field trials can also serve as invaluable permanent sentinels over the geographic range of a species to detect not only changes in efficacy of resistance, but also the spread or intensification of epidemics from the insect or pathogen and to monitor for other abiotic or biotic factors affecting forest health. Maintaining sufficient levels of genetic diversity in production populations will allow the species the best opportunity to continue to evolve and adapt to additional insect and disease outbreaks as they arise and even to novel environmental conditions brought about by a changing climate.

Beyond research: implementing an applied program

An applied resistance program for forest tree species can provide a long-term solution, but it can also be a long-term endeavor, and it is important to consider this from the start. The biology of the tree species can be a major factor, especially if multiple generations of breeding will be involved, because some species do not reach reproductive maturity for decades. Despite this, a small number of programs have had proven successes in relatively short periods of time (Sniezko et al. 2012b; Koch 2010). Ultimately, society and program managers will need to weigh the value of the affected species and the resources that can be committed. Some programs will entail selecting many hundreds to many thousands of trees to test for resistance; others will require extensive surveying efforts to find the infrequently occurring healthy surviving trees. From these, a subset of parent trees or progeny selections will be utilized for orchard development or further breeding to increase the level of resistance or combine different resistances. In addition, vegetative propagation (when feasible) can be used to preserve resistant parent trees in clone banks, since the rare and valuable resistant parent trees in the forest are at risk from fire and other events. Such clone banks can be managed as seed production areas to increase resistance through natural regeneration.

Unlike many agricultural or horticultural systems, in forestry the resistant product is usually not a single cultivar but the genetically diverse progeny that result from inter-mating of a population of resistant parents. Most disease- or insect-resistant plant materials for reforestation or restoration will come from seed collected either in seed orchards or from parent trees in the existing native forests. Establishment of seed orchards usually means a time delay before resistant seed is available, from a few years to more than 20 years, depending on species. With careful orchard management, seed of some species can be produced within 5 years or less (Sniezko et al. 2012b). Grafted replicates of select parent trees from natural forests can often produce seed earlier than seed orchards derived from seedling selections (Koch and Heyd 2013). In some species, such as Port-Orford-cedar [Chamaecyparis lawsoniana (A. Murr.) Parl.], prolific seed production can be achieved very quickly and can be managed in containerized orchards (Sniezko et al. 2012b), but for many other species it will take longer and conventional field-based orchards will be needed. Seed orchards typically have an advantage over seed collected from parent trees in natural forests because all pollination is from resistant parents, increasing the level and frequency of resistance in the resulting seed.

Sporadic funding makes it difficult to provide the continuity of staff and technical expertise required to successfully move a resistance program forward and provide the desired outputs. At the beginning of a breeding program, it is best to have an outline, even a best-case scenario, of what it will take to deliver a defined level and frequency of resistance for actual utilization. This should involve managers and others outside of research discussing realistic options. Too often, a research component is set up with little thought as to whether the capacity exists, once genetic resistance is developed, for an operational program to produce resistant seed or seedlings of sufficient quality and quantity to be of use to land managers. Involving the applied operations personnel at the outset would be beneficial. In the Pacific Northwest, the U.S. Forest Service's Dorena Genetic Resource Center has been in operation for 50 years and acts as the crucial applied operational tree improvement link between basic research on resistance and the planting of resistant seedlings for several species.

To achieve efficiency in cost and effort, cooperation among various national or regional programs should be considered. Separate facilities and orchards will be needed in some cases, but joint research endeavors can often increase efficiencies and avoid some redundancies. Resulting cost savings can be reinvested to advance the applied tree improvement phase of the program. National forestry agencies often take the lead in long-term breeding programs, with universities carrying out some of the necessary research; however, there are other successful models in which universities play key roles in the long-term operational side of breeding. Geneticists and tree breeders are essential personnel in resistance programs, but pathologists or entomologists also play a vital role, particularly in the research phase, developing methods to distinguish various host phenotypes (e.g. hypersensitive reaction and ‘slow rusting’ for pathogens; effects on feeding, oviposition and larval development for insects) that can eventually be developed into higher-throughput, systematic screening methods. This requires a significant understanding of the basic biology of the pathogen or insect and its interaction with the host tree.

From an organizational and management perspective, having the capacity to implement an applied resistance program is key to success. Resistance programs follow the same general protocols and require the same infrastructure and skilled professionals as operational tree improvement programs for growth and wood quality traits in species of commercial importance. It is important to note that in the last several decades the number of forest geneticists and tree improvement specialists in the United States has declined, along with critical infrastructure such as nurseries and growing facilities (Wheeler et al. 2015; Campbell and Schlarbaum 2014). This decline in personnel and infrastructure will limit the ability to efficiently develop applied resistance programs in the future as new pathogen and insect threats emerge.

Resistance is often the only remaining solution after other management options have failed. The long-term solution genetic resistance provides does not require the use of insecticides and other chemicals or the release of genetically modified organisms in our forests. Programs to develop genetic resistance in non-commercial species, such as the high-elevation white pines, are a relatively new development, and research shows that public support exists for investing in maintaining healthy forests, including management programs aimed at improving the health of these forests in the presence of white pine blister rust (Meldrum et al. 2013). The demonstration value of field trials can be key to conservation education and garnering public support. A recently established combination restoration planting and genetic trial of whitebark pine (P. albicaulis Engelm.) at Crater Lake National Park (Fig. 2) is in a central area visited by hundreds of thousands of tourists and resource managers annually.

Restoration and reforestation considerations

The objective for restoration of threatened forest ecosystems is to increase population level resistance to a degree that ensures establishment of a self-sustaining population while preserving genetic diversity. Restoration or reforestation requires a ready supply of resistant seed of the desired species, and designated land appropriate for the species. Two of the main questions are (1) is the available level of resistance sufficient to meet land management objectives? and (2) are there sites of high or extreme hazard that should be avoided at this stage?

In most cases, the seed used to produce seedlings will be open-pollinated seed from parent trees in natural stands or seed orchards. The seed will be collected from numerous trees to help maintain genetic diversity, and from seed orchards appropriate for different geographic areas or breeding zones. A common misconception is that all progeny of a seedlot will be resistant and survive, but this is usually not the case. Depending on the type of resistance (MGR or quantitative resistance) and the degree of breeding and selection performed prior to selection of seed orchard parents, individual seedlings from a seedlot can vary from susceptible to highly resistant. However, the survival expected from these seedlots, or the population-level survival, will be significantly higher than that from unselected parent trees in the natural forest. Planting densities should be adjusted in anticipation of the levels of mortality and damage that are expected. For this reason, it is recommended that seedlots be tested to validate the expected level of resistance and genetic variation.
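The simple sketch below (with invented class frequencies and survival rates, not data from any program) illustrates how a seedlot-level survival expectation can be formed from a mix of resistance classes and then used to adjust planting density toward a target stocking:

```python
# Illustrative only: expected seedlot survival from a hypothetical mix of
# resistance classes, and a planting density adjusted for anticipated losses.

seedlot_composition = {
    # class: (fraction of seedlings in the seedlot, expected survival under challenge)
    "major-gene resistant": (0.40, 0.90),
    "partially resistant":  (0.35, 0.55),
    "susceptible":          (0.25, 0.10),
}

expected_survival = sum(frac * surv for frac, surv in seedlot_composition.values())
print(f"Expected population-level survival: {expected_survival:.0%}")  # ~58%

# If the objective is, say, 250 established trees per hectare, plant more
# seedlings to compensate for the expected mortality.
target_stocking = 250
planting_density = target_stocking / expected_survival
print(f"Plant about {planting_density:.0f} seedlings per hectare")
```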

Even if resistant seed is available, another major challenge lies in the actual large-scale restoration of tree species across the portion of the range that has been impacted by invasive diseases or insects. The cost, time and logistics of such large-scale restoration can be daunting and problematic. A more likely scenario in these cases might be to have focal areas that emphasize restoration and serve as islands of resistance that can aid future natural regeneration. This strategy, also known as "applied nucleation", has shown early promise in some restoration experiments (Corbin and Holl 2012). As with other restoration programs and attempts to control non-native invasive insects or pathogens, public awareness is often limited, and efforts to educate the public should be undertaken to avoid conflicts and to garner the support necessary for these activities (Poudyal et al. 2016; Stanturf et al. 2012).

The objective of commercial plantations and urban plantings differs from restoration of natural forests, but most tree plantings for reforestation or urban plantings share the same expectation of long-term survival. However, there are differences, such as the need for a higher level and frequency of resistance in commercial plantings, since moderate mortality or stem damage can have adverse impacts on economic returns or on the amenity value of urban plantings. In commercial plantations of species with shorter rotations, there may be more risk-taking, such as acceptance of lower levels of genetic diversity as a trade-off for an emphasis on higher growth rates or other economic traits. In some cases, such as eucalyptus species used in commercial plantations, vegetative propagation may be used and planting of resistant clones may be utilized when uniformity of economic traits is valued over genetic diversity (Alfenas et al. 2012; Gonçalves et al. 2013; Wingfield et al. 2013).

Successes to date

The following examples, from North America, Hawaii, and New Zealand, are not meant to be a comprehensive list, but are selected to illustrate some of the successful resistance programs that can be found worldwide. It is important to note that in most cases efforts are ongoing and evolving to further enhance genetic resistance in populations of these species:

White pines and white pine blister rust resistance (native tree, non-native pathogen)

Nine species of white pines are present in the U.S. and Canada, and all are susceptible to white pine blister rust (WPBR) (King et al. 2010; Tomback and Achuff 2010). To varying extents, all nine species are currently being evaluated for genetic resistance or have operational resistance programs in place (Lu and Derbowka 2009; King et al. 2010; David et al. 2012; Sniezko et al. 2011, 2014). Some of these programs have been ongoing for 50 years, and progeny of thousands of parent trees have been evaluated for resistance. The most extensive work in western North America is with western white pine (WWP), sugar pine (P. lambertiana Dougl., SP), eastern white pine (P. strobus L., EWP) and, more recently, whitebark pine (WBP), and has resulted in the availability and use of resistant seed for reforestation and restoration (Waring and Goodrich 2012; David et al. 2012) (Fig. 2). The level of resistance available in the progeny of the most resistant parents varies among the species, and within a species it can vary among geographic areas, but the frequency of resistance in the original natural populations is generally very low. Four of the white pine species have complete resistance from a major gene (Kinloch and Dupper 2002; Sniezko et al. 2016) as well as partial resistance that is presumably controlled by several to many genes, while in the remaining five species there may be only partial resistance (Hoff et al. 1980; Sniezko et al. 2008).

The most extensive data on the types, levels and limitations of resistance are available for WWP and SP (Sniezko et al. 2012c, 2014; Kinloch et al. 2003, 2004, 2012; Kinloch and Dupper 2002). Strains of the pathogen with virulence against the major genes for resistance in both sugar pine (Cr1) and western white pine (Cr2) have been documented in some areas (Kinloch et al. 2004; Kolpak et al. 2008; Sniezko et al. 2014). In essence, this means there has been a shift in allele frequencies in pathogen populations, providing further caution about relying solely on this type of resistance. Seed orchards provide much of the seed now being used for these two species, and breeding to increase the level and mix of resistance (major gene and quantitative) in WWP and SP continues. A series of trials to more fully evaluate the frequency and level of genetic resistance in several of the other species is currently underway. In the U.S., the U.S. Forest Service takes the lead in developing genetic resistance to WPBR, but a wide range of partners and cooperators, including other federal agencies, state and county groups, universities, tribes, private companies and individuals, as well as Canadian agencies, have provided key assistance with aspects of the programs.

Testing for WPBR resistance in WBP in the Pacific Northwest part of its range began in 2002 (Sniezko et al. 2007). The WBP resistance programs have been able to capitalize on decades of extensive rust resistance work in WWP and SP, as well as trained personnel and existing U.S. Forest Service infrastructure from those programs. This has fast-tracked the operational evaluation of resistance in WBP. Fortunately, early seedling screening trials showed that some populations of WBP have much higher levels and frequency of partial resistance to WPBR than generally found in natural populations of WWP and SP. This relatively high level of resistance provided immediate opportunity for restoration plantings of WBP without having to wait for breeding or seed orchard production. Instead, seed could be directly collected from parent trees in natural stands that had produced the best performing seedlots in the seedling trials. The first restoration plantings of this species in the Pacific Northwest using the resulting resistant seedlings were established in 2006 and 2009 (Fig. 2).

WBP has been proposed for listing as ‘endangered’ under the U.S. Endangered Species Act (U.S. Fish and Wildlife Service 2011). The identification of genetic resistance to WPBR is one of the factors that recently led to a downgrading of the listing priority number (LPN) from 2 to 8 for WBP in 2015 (U.S. Fish and Wildlife Service 2015), but the status of the species is reviewed annually. Restoration with resistant seedlings will increase the level of blister rust resistant WBP on the landscape in some areas, but the high cost of restoration in these high-elevation ecosystems, coupled with the current lack of approval to plant in designated wilderness areas, will preclude planting in many areas. Strategic planning will continue to be necessary to identify planting areas that may serve as focal areas or resistant islands, allowing WBP to spread naturally throughout threatened high-elevation ecosystems in future generations. In areas with the highest levels of resistant parents, management activities that encourage natural regeneration of WBP may be successful in restoring populations where the WPBR pathogen is present. The parent trees noted as resistant will also serve to monitor the durability of rust resistance in the field.

Port-Orford-cedar and Phytophthora lateralis (native tree, non-native pathogen)

Originally, researchers had concluded that there was little or no genetic resistance to P. lateralis Tucker & Milbrath in Port-Orford-cedar (POC), but later inoculation trials identified some promising resistant candidate trees (Hansen et al. 1989), and further investigations confirmed the presence of resistance (Oh et al. 2006; Sniezko 2006; Sniezko et al. 2012b). The operational phase of the program to develop resistant populations of POC began in 1996. In the early stages of the applied program, only MGR was thought to be present. However, anomalies in the performance of some seedling families, noted in the summary of data from early years of operational screening, led to modification of the protocol by extending the duration of the seedling evaluation period following inoculation. This modification helped confirm that quantitative resistance also existed. Rooted cuttings have also been used in greenhouse screening trials, but their results have been somewhat problematic, so the use of seedling families is preferred for POC (Sniezko, unpublished). This serves as a caution that although the use of vegetative propagation of identical genotypes in resistance testing has potential advantages, its reliability needs to be confirmed in each system.

Due to funding, infrastructure, personnel and the biology of POC, this inter-agency, inter-regional program has been one of the fastest moving applied resistance programs in forest trees (Sniezko et al. 2012b). Within a few years of initiation of the program, thousands of parent trees from throughout the range of POC were tested, resistant parents were identified, and containerized seed orchards were established for a few breeding zones. The first resistant seedlots for planting were produced in 2000. In the first greenhouse seedling inoculation trials, the bulked orchard lots showed ~50% survival versus ~10% for random selections from the forest (Sniezko et al. 2012b). This resistant orchard seed is now being used by land managers in Oregon and California for reforestation and restoration. POC was once a highly valued species for urban plantings, and the resistance now available may help re-establish its prominence as a landscape tree. The resistance program continues breeding efforts to increase the level of resistance in progeny selected for use as parents in advanced-generation orchards and to increase the number of parents in some of the seed orchards that now cover 13 breeding zones. Field trials have been established to monitor the durability of resistance and its efficacy over a range of sites, and early results are encouraging (Fig. 4, Sniezko et al. 2012d).

Port-Orford-cedar was another species of concern due to the impacts of invasive disease, and was listed as ‘vulnerable’ on the International Union for Conservation of Nature (IUCN) Red List of Threatened Species in 2000. The species' status was downgraded to ‘near threatened’ as of 2013, with the anticipation that it will be listed as a species of ‘least concern’ within 10 years if current conservation actions, including planting resistant seedlings, are successful and maintained (Farjon 2013). The genetic resistance program, and the subsequent use of the resulting resistant seed, brings cautious optimism that resistant POC will meet the reforestation and restoration needs of land managers. As with the white pine blister rust resistance programs, a wide array of partners and cooperators have contributed to the current level of success by providing candidate trees for screening/testing, providing field sites for trials and clone banks, and providing a university seedling inoculation facility.

Sitka spruce and white pine weevil resistance (native tree, native insect)

A notable recent success with insect resistance is the development of Sitka spruce (Picea sitchensis [Bong.] Carr.) with resistance to the white pine weevil (Pissodes strobi Peck) (Alfaro et al. 2013). This weevil damages the terminal leader and can cause significant stem deformation or the loss of formerly vigorous trees to competing vegetation. Over the last 30–40 years, reforestation of Sitka spruce has declined significantly in Oregon, Washington and British Columbia because of extensive weevil damage in young plantations. Fortunately, supportive funding and research efforts over two decades have led to the development of successful screening techniques for resistance. Genetic studies confirmed significant gain in progeny of 88 different combinations of resistant parents and moderate (individual) to high (family) heritabilities for weevil resistance (Moreira et al. 2012). Seed orchards of resistant trees have been established in British Columbia, and an increase in planting of Sitka spruce has taken place, including in areas of moderate and high weevil hazard (Alfaro et al. 2013). In British Columbia, a program to develop resistance to white pine weevil has also been successful in interior spruce (King et al. 1997). These spruce populations consist largely of hybrid swarms between P. glauca and P. engelmannii (De La Torre et al. 2014). A weevil-resistant seed orchard has been established for interior spruce in Vernon, BC. This orchard currently has 53 parents (2230 ramets), and approximately 40 million seedlings are planted annually with seed from the orchard (Barry Jaquish, personal communication). In high-hazard areas, seedlings from the current orchard show approximately 30% less weevil damage than wild stand seedlots, and the level of resistance is expected to increase significantly as roguing of the orchard continues and new resistant clones are added (Barry Jaquish, personal communication).

Loblolly pine and fusiform rust resistance (native tree, native pathogen)

Loblolly pine (P. taeda L.) is the most widely planted tree species in the U.S. (McKeand 2015; McKeand et al. 2003). The large economic impact of fusiform rust (caused by the fungus Cronartium quercuum f.sp. fusiforme) was the impetus for the development of genetic resistance breeding programs. Like some of the white pine programs, this program has been ongoing for more than 50 years. The fusiform rust resistance program is led by university tree improvement cooperatives in the southern U.S., but the membership of industry stakeholders has been a key factor in its success. A central resistance screening facility managed by the U.S. Forest Service was established in 1972 and is utilized by the tree improvement cooperatives for screening seedling families for genetic resistance (Cowling and Young 2013). Substantial progress in breeding has been made (Sniezko et al. 2014), and nine Fr genes for resistance have been identified, with some evidence of at least four more (Amerson et al. 2015). Resistant seedlings of loblolly pine and other southern pines have now been planted on millions of acres (Schmidt 2003). Current research is focused on gaining a more complete understanding of the complex resistance and virulence in this pathosystem to help guide breeding efforts and seedling and clonal deployment in the widely planted loblolly pine (Amerson et al. 2015).

Pinus radiata and Dothistroma pini resistance (non-native plantation tree, non-native pathogen)

Dothistroma ‘red band needle blight’, caused by Dothistroma pini, has attacked Pinus radiata (radiata pine) in New Zealand since arriving there in the 1960s (Bradshaw 2004). Wilcox (1982) established the existence of heritable resistance to Dothistroma. Subsequently, a project of field testing and assessment resulted in the identification of resistant material and the production of a seed orchard of radiata pine with improved Dothistroma resistance (Carson 1989; Carson and Carson 1986, 1989). Concurrent research and development efforts identified aspects of the mechanisms of resistance (Franich et al. 1986) and the gains available from early field screening (Carson et al. 1991). Forest growers with plantations in regions with high Dothistroma hazard, comprising roughly one-third of New Zealand's 1.5 million hectares of P. radiata plantations, have since established stands with either open-pollinated or control-pollinated orchard seed for which Dothistroma resistance has been a primary selection objective. More recently, researchers have screened and developed production clones of radiata pine with high levels of Dothistroma resistance (Carson et al. 2015), and these are being increasingly planted in commercial forests in New Zealand's Central North Island region.

Koa and koa wilt resistance (native tree, origin of pathogen unknown)

The successful restoration and reforestation of Acacia koa Gray, an ecologically, culturally and economically important species in Hawaii, is negatively affected by the pathogen Fusarium oxysporum f. sp. koae, the cause of the vascular wilt disease of koa (koa wilt). An applied program to develop genetic resistance was started in 2003 and has made substantial progress (Dudley et al. 2015), including the establishment of the first seed orchards, the delineation of 11 provisional seed zones, and the release of the first seed (with confirmed levels of resistance) for reforestation and restoration. Data from seedling screening trials and the first field trials suggest that survival on infected sites may be expected to exceed 60% in plantings using seed from the best parents, compared to 30% or less survival for unimproved control seedlings. This rapidly developing resistance program will need continued monitoring of trials and plantings to further evaluate the durability of resistance. Since 2011, resistance screening in five seed zones has been completed and the establishment of seedling seed orchards is currently underway. Additional seedling screening to identify additional parent trees will be needed in all seed zones to increase genetic diversity. Research to better understand the resistance phenotype, and genetic studies to estimate and understand the inheritance of resistance, could benefit this program. The rapid progress of the koa wilt resistance program was facilitated by an applied focus at an early stage, the availability of forest genetics and pathology technical support from other programs, an in-place facility for tree improvement activity, and funding support from state and federal sources.

American beech and beech bark disease resistance (native tree, non-native insect, native pathogen)

Beech bark disease has been killing American beech (Fagus grandifolia) trees in North America since the 1890s (Ehrlich 1934). The disease is initiated by feeding of the invasive beech scale insect (Cryptococcus fagisuga), which causes the development of small fissures in the bark, providing an entryway for bark canker fungi (most commonly Neonectria faginata Castlebury or Neonectria ditissima Samuels and Rossman; Castlebury et al. 2006), whose many cankers may coalesce as they grow, weakening and even girdling the tree. By the 1980s, beech trees with resistance to the beech scale insect had been identified (Houston 1982, 1983), but the program to develop genetic resistance in American beech was not started until 2002 (Koch and Carey 2004, 2005). Trees with resistance to the insect are protected because, without the scale insect, there is no point of entry for the fungus and infection does not occur. Genetic studies screening full- and half-sibling families for resistance to the scale insect suggested the involvement of as few as two genes and demonstrated the highest gain when both parents were resistant (Koch 2010; Koch and Carey 2005). Methods to efficiently propagate resistant parents vegetatively were developed (Carey et al. 2013), and a collaborative, multi-agency effort has resulted in the establishment of five regional American beech seed orchards, with four others in progress (Koch 2010; Koch and Heyd 2013). As in other successful programs, the partners in this effort were key to success, providing a cost-effective pipeline for identifying resistant parent trees. State and National Forest personnel surveyed natural forests for candidate trees and then received training to test each tree by setting up an egg bioassay in the field (Koch and Carey 2014). A genetic linkage map has been constructed, and a genome-wide association study has identified markers associated with resistance that may help further expedite breeding efforts (Irina Calic and David Neale, pers. comm.).

New diseases and insects will come: resistance breeding to the rescue?

New diseases or insects will continue to impact trees. One of the newest is 'rapid 'ōhi'a death', first documented 5 years ago and affecting 'ōhi'a (Metrosideros polymorpha Gaudich.), Hawaii's most common and widespread tree. The potential loss of such a keystone species could be an ecological disaster for Hawaii. The pathogen responsible has been identified as Ceratocystis fimbriata (Keith et al. 2015). At this early stage, little or no information is available on genetic resistance, but research is underway. The approaches used in the successful resistance programs featured here could be used to greatly increase the current knowledge base and provide potential options for land managers. Another prominent example is emerald ash borer (EAB), which has already killed hundreds of millions of ash trees in North America. In 2007, EAB was found in Moscow, Russia, and is predicted to continue to move into south-central Europe, where European common ash (Fraxinus excelsior) is already threatened by ash dieback disease caused by the fungal pathogen Hymenoscyphus pseudoalbidus (Straw et al. 2013; Valenta et al. 2015). Research has shown that genetic resistance to this pathogen exists (McKinney et al. 2014), and there is preliminary evidence indicating that some genotypes of F. excelsior may be less susceptible to EAB (J. Koch, personal communication). Researchers are optimistic that resistance to both of these threats can provide the solution to help restore ash as a prominent species (McKinney et al. 2014). The key will be to take this beyond the research phase and into the restoration phase.

Promising new technologies and common misconceptions

Breeding can be a long-term process in forest trees, with some species taking a decade or more to become reproductive. Not surprisingly, several avenues of research are aimed at reducing the amount of time it takes to breed for desired traits such as resistance. For example, manipulation of cultural conditions to induce early flowering or overcome seed dormancy can help shorten the breeding cycle of many woody plants, including forest trees (Van Nocker and Gardiner 2014). High levels of phosphorus have been shown to promote early flowering and increase the number of flowers produced in chestnut, and high-intensity, high-dose light treatment can induce flowering in American chestnut by 6 months of age (Baier et al. 2012). The application of gibberellins, notably GA3, has been used for many years to induce strobili production in some conifers (Pharis and Kuo 1977), and has been used extensively to enhance breeding efforts in very young Chamaecyparis lawsoniana for the Phytophthora lateralis resistance program (Elliott and Sniezko 2000). Transgenic approaches to manipulate flowering have been developed in many woody plants; because the transgene effect of early flowering is dominant, it is needed in only one parent and can be selected against in the progeny so that the final selected genotype is not transgenic (Van Nocker and Gardiner 2014). An exciting new technology using viral vectors containing genes that control flowering has been successfully used to promote early flowering in both apple and pear trees (Yamagishi et al. 2016). In this approach, a simple heat treatment prevents most of the resulting seed from carrying the virus. Successful application of this technology in forest trees has the potential to reduce breeding cycles from several decades to a few years.

Molecular markers such as SSRs (simple sequence repeats) and SNPs (single nucleotide polymorphisms) are already important tools that can improve the efficiency of conventional breeding programs through a variety of applications, including the evaluation of genetic diversity in breeding populations, the confirmation and tracking of identity, parentage and relatedness, and the assessment of pollen flow/contamination in seed orchards (Neale and Kremer 2011; Porth and El-Kassaby 2014; Liu et al. 2016). Advances in high-throughput sequencing technology now make it relatively inexpensive to obtain millions of markers dispersed throughout the genome. Sequence-based genotyping methods (whole-genome resequencing or genotyping-by-sequencing) provide high-throughput genotyping capabilities at low cost. The ability to achieve such dense genome coverage can provide information on genetic variation relevant to a desired phenotype through the development of indirect selection techniques, including marker-assisted selection (MAS) and genomic selection (GS). These two tools have the potential to streamline the conventional breeding process by allowing the breeder to use markers to “pre-select” trees at a young age or to directly select parent trees from natural stands. This pre-selection minimizes the number of trees whose phenotypes need to be carefully confirmed over a range of time and environments through bioassays and/or field plantings, thereby lowering both the cost and time investments of conventional breeding programs.
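As a small illustration of one of these routine marker applications, the sketch below uses invented SSR genotypes to flag probable pollen contamination in orchard seed by simple allele exclusion; real analyses use many loci, account for mutation and genotyping error, and typically rely on likelihood-based parentage software:

```python
# Simplified pollen-contamination check by allele exclusion at SSR loci.
# Genotypes are invented for illustration.

# Alleles present among all orchard parents, per SSR locus (allele sizes in bp)
orchard_allele_pool = {
    "ssr_01": {152, 156, 160},
    "ssr_02": {201, 205},
    "ssr_03": {118, 122, 126, 130},
}

def contamination_loci(offspring_genotype):
    """Return loci where the offspring carries an allele absent from the orchard.

    Such alleles must have come from outside pollen (ignoring mutation and
    genotyping error, which real pipelines account for)."""
    return [locus for locus, alleles in offspring_genotype.items()
            if not alleles <= orchard_allele_pool[locus]]

seedling = {"ssr_01": {152, 156}, "ssr_02": {205, 209}, "ssr_03": {118, 126}}
print(contamination_loci(seedling))  # ['ssr_02'] -> allele 209 not found in the orchard
```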

Markers linked to traits for use in traditional MAS are identified through the development of a genetic linkage map, which relies on analyzing patterns of segregation of markers from parents to progeny to identify the number of linkage groups and place markers in an ordered fashion on each group. Markers that are closer together are more likely to be inherited together in the progeny. QTL (quantitative trait locus) analysis identifies markers inherited with a trait of interest, indicating that the marker is closely located to the region(s) of the genome (QTL) that influence the expression of the trait. Despite an enormous amount of investment in work identifying QTLs and associated markers, very few have been validated as useful for indirect selection of a desired trait and implemented within a breeding program for either crops or trees (Xu and Crouch 2008; Isik 2014; Muranty et al. 2014). A recent literature search found that the majority of published papers that used the words “marker assisted selection” in the text were actually reporting QTL mapping studies whose findings have the potential to be developed into applied MAS should they be validated (Xu and Crouch 2008). However, the lack of validation is among the largest hurdles to translating QTL analysis into operational MAS (Neale and Kremer 2011). Many QTL studies in forest tree species have been done using small families, which means that mapping precision is not high and the effects of the identified QTLs are likely to be over-estimated, but the lack of validation makes it difficult to estimate a false discovery rate. A lack of genetic resources, such as additional populations (especially where no breeding program exists), may also contribute to the lack of validation, since such populations would allow validation through comparative mapping studies. Other reasons behind the lack of implementation of MAS include a lack of funding to support initial development and a lack of a perceived cost benefit (Xu and Crouch 2008; Ru et al. 2015).
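At its core, a single-marker QTL test is simply a regression of the trait on marker genotype within the mapping family, repeated for every mapped marker with genome-wide significance control; the sketch below runs one such test on simulated data (no real genotypes or disease scores are implied):

```python
# Minimal single-marker QTL association test within one mapping family,
# using simulated data. A real analysis scans all mapped markers and applies
# genome-wide thresholds (e.g., permutation-based).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200                                  # progeny in the mapping family
genotype = rng.integers(0, 3, size=n)    # marker coded as 0/1/2 copies of one allele
qtl_effect = 0.8                         # simulated effect of a linked QTL
disease_score = qtl_effect * genotype + rng.normal(0, 1.5, size=n)

# Regress the phenotype on marker dosage; a small p-value suggests linkage
# between the marker and a locus affecting the disease score.
result = stats.linregress(genotype, disease_score)
print(f"estimated effect per allele copy: {result.slope:.2f}")
print(f"p-value: {result.pvalue:.2e}")
```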

Partial or quantitative resistance is a highly complex trait attributed to many QTLs of small effect. QTL mapping can identify multiple loci involved in resistance, which requires the development of MAS for each individual locus. Each locus may have only a small effect, and many individual loci of small effect may not be detected at all. So even if MAS is developed and implemented for each identified locus, the desired level of resistance may not be achieved. In addition, markers identified in one mapping family may not be transferable to another, because they are often only closely located to the genes responsible for the trait of interest and are not the actual causal polymorphisms found within those genes (Nilausen et al. 2016). Fine mapping and the use of markers based on highly conserved sequences, such as EST-SSRs (expressed sequence tag SSRs), can improve transferability across populations, but can increase up-front costs. The programs that do employ MAS in trees are limited to domesticated fruit- and nut-producing species and are typically tracking a single locus, or a small number of loci with very large effect, including major gene resistance (Sathuvalli et al. 2011; Ru et al. 2015). A common use of MAS in crops and fruit trees is pyramiding multiple major-effect resistance alleles, which is difficult using traditional phenotyping methods (Ru et al. 2015; Muranty et al. 2014). In tree species where pathogens such as white pine blister rust have been documented to overcome single-gene resistance, breeding programs are currently focused on combining both major gene and quantitative resistance to develop durable resistance. A potentially helpful application of MAS in such programs would be for within-family selection when the goal is pyramiding major genes for resistance with quantitative resistance. A recently identified SNP in WWP was confirmed to be highly efficient for MAS of seedlings that are either homozygous or heterozygous for the WPBR Cr2 resistance gene (Liu et al. 2017), and has the potential to be applied in gene-pyramiding breeding strategies. This finding also highlights the success of integrated genomics approaches in developing tools to accelerate breeding when done in conjunction with the genetic resources an established breeding program can provide.
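A sketch of how within-family marker-assisted pre-selection could support such pyramiding is given below; the marker name, genotype codes and seedling records are hypothetical stand-ins for a validated resistance-linked SNP of the kind described for Cr2:

```python
# Hypothetical within-family marker-assisted pre-selection for gene pyramiding.
# Seedlings carrying the allele linked to the major resistance gene are retained,
# so later inoculation trials can focus on ranking their quantitative (partial)
# resistance. Marker name and data are invented for illustration.

seedlings = [
    {"id": "fam07-001", "cr_linked_snp": 2},   # copies of the resistance-associated allele
    {"id": "fam07-002", "cr_linked_snp": 0},
    {"id": "fam07-003", "cr_linked_snp": 1},
    {"id": "fam07-004", "cr_linked_snp": 1},
    {"id": "fam07-005", "cr_linked_snp": 0},
]

# Keep carriers of the major-gene allele; selections made later among these
# carriers, based on inoculation results, should combine both kinds of resistance.
carriers = [s["id"] for s in seedlings if s["cr_linked_snp"] > 0]
print(carriers)  # ['fam07-001', 'fam07-003', 'fam07-004']
```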

Given the lack of breeding programs for many forest tree species, and of the genetic resources such programs provide, there has been considerable interest in using genome-wide association studies (GWAS), also known as linkage disequilibrium mapping, to identify associations between markers and traits (such as resistance) across large natural populations. Linkage disequilibrium refers to the nonrandom association between alleles at different loci. The closer two loci are physically, the more likely they are to be in linkage disequilibrium, whether within a mapping family or across a large natural population. In theory, genome-wide association mapping captures the recombination events that have occurred throughout the evolutionary history of the population, rather than being limited to the single generation of recombination between the parents of a traditional mapping population. The approach relies on large population sample sizes and very high marker density. The phenotype of each individual must be known, and one drawback of using natural populations is that phenotypes are often assessed simply from the health of the individual at the time of sampling (a single time point); such assessments lack the replication and repeated observation over time and across environments that are typically part of the phenotyping process in breeding programs. The ability to measure the true phenotypic variation in a population is critical for accurate detection of associations (Ingvarsson and Street 2011). Another potential problem with GWAS is confounding caused by unrecognized population structure or relatedness. Although GWAS is a promising approach to identifying markers associated with resistance, to date very few of the genotype-phenotype associations reported in plants have been verified in independent studies. Strict guidelines for experimental design, including replication and validation of association genetics studies, are being devised in human genetics (Chanock et al. 2007) and will be needed in plant studies as well before GWAS can be translated into applied breeding and management (Calic et al. 2016; Ingvarsson and Street 2011).
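
The sketch below (Python; simulated genotypes with one assumed causal SNP, and subpopulation allele-frequency differences standing in for population structure) illustrates one common way to run a single-SNP association scan while partially correcting for structure by including leading principal components of the marker matrix as covariates. It is a toy example of the general approach, not a reproduction of any cited study.

```python
# Toy GWAS-style scan with principal-component adjustment for population structure.
import numpy as np

rng = np.random.default_rng(1)
n_trees, n_snps = 300, 1000

# Two subpopulations with slightly different allele frequencies (structure).
pop = rng.integers(0, 2, n_trees)
freqs = np.where(pop[:, None] == 0, 0.3, 0.5)
genotypes = rng.binomial(2, freqs, size=(n_trees, n_snps)).astype(float)

# Phenotype: one causal SNP plus a population-level shift plus noise.
causal = 10
phenotype = 0.8 * genotypes[:, causal] + 1.0 * pop + rng.normal(0, 1, n_trees)

# Leading principal components of the centered genotype matrix capture structure.
centered = genotypes - genotypes.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pcs = centered @ vt[:2].T

def snp_t_stat(snp, y, covariates):
    """t-statistic for the SNP term in y ~ intercept + covariates + snp."""
    X = np.column_stack([np.ones_like(y), covariates, snp])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    sigma2 = resid @ resid / dof
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[-1] / np.sqrt(cov[-1, -1])

t_stats = np.array([snp_t_stat(genotypes[:, j], phenotype, pcs) for j in range(n_snps)])
print("Top SNP:", int(np.argmax(np.abs(t_stats))), "(simulated causal SNP:", causal, ")")
```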

Genomic selection (GS) methods have been developed that do not require identification of specific marker-trait associations or estimation of the relative effects of individual QTLs on the trait of interest. Instead, GS relies on phenotyping and high-density genotyping of a sufficiently large sample of the breeding population (the training population) that most loci contributing to a quantitative trait are linked to one or more markers. The effects of all markers are then estimated simultaneously (unlike in MAS) and used to predict genomic breeding values in a test population without the need for phenotypic data (Resende et al. 2012; Jannink et al. 2010).
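
A minimal sketch of the idea is given below, assuming simulated data and a ridge-regression (RR-BLUP-style) model rather than any particular published pipeline: all marker effects are shrunk and estimated jointly in a phenotyped training set and then used to compute genomic estimated breeding values for untested candidates. The population sizes, marker number, and penalty value are arbitrary assumptions for illustration.

```python
# Hedged sketch of genomic selection via ridge regression on all markers at once.
import numpy as np

rng = np.random.default_rng(2)
n_train, n_test, n_markers = 500, 100, 2000

# Simulated marker matrix (0/1/2 allele counts) and many small true effects.
M = rng.binomial(2, 0.5, size=(n_train + n_test, n_markers)).astype(float)
true_effects = rng.normal(0, 0.05, n_markers)
y = M @ true_effects + rng.normal(0, 1.0, n_train + n_test)

M_train, y_train = M[:n_train], y[:n_train]
M_test, y_test = M[n_train:], y[n_train:]

# Ridge solution: beta = (M'M + lambda*I)^-1 M'y, with a centered response.
lam = 100.0
y_centered = y_train - y_train.mean()
A = M_train.T @ M_train + lam * np.eye(n_markers)
beta = np.linalg.solve(A, M_train.T @ y_centered)

# Genomic estimated breeding values for the untested candidates.
gebv = M_test @ beta
accuracy = np.corrcoef(gebv, y_test)[0, 1]
print(f"Predictive correlation in the test set: {accuracy:.2f}")
```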

Although GS is routinely used in dairy cattle and other animal breeding programs (Hayes and Goddard 2010) and has been successful in crop breeding programs (Lorenzana and Bernardo 2009), in forest trees such strategies have only been tested in simulation studies (Grattapaglia and Resende 2011; Iwata et al. 2011) and preliminary trials (Zapata-Valenzuela et al. 2013; Resende et al. 2012). Even though these studies in forest trees show promise in smaller populations, the results should be interpreted cautiously, and additional proof-of-concept studies are needed (Plomion et al. 2016). Successful application of GS to forest tree breeding will ultimately require correlating thousands or even hundreds of thousands of markers with a desired phenotype. This can only be accomplished using sufficiently large training populations that have undergone at least some breeding and have been carefully phenotyped for the desired trait, resources that only the most advanced and well-funded forest tree breeding programs can produce (Jannink et al. 2010; Isik 2014; Zapata-Valenzuela et al. 2013). Analysis of the economic viability of incorporating GS is also needed (Plomion et al. 2016).

When new invasive threats are identified, immediate funding is directed at efforts to delineate and eradicate the insect or disease. Where eradication is unsuccessful, or was never feasible, funding is directed at the development of management strategies, which may include host resistance (Campbell and Schlarbaum 2014). However, funding for host resistance is in some cases weighted heavily toward basic research on the underlying mechanisms of resistance rather than toward developing a breeding program for resistance. One reason for this disproportionate funding of basic research over applied breeding is the decline in federal, state, industry, and university forest genetics and tree improvement capacity, and in the long-term funding that supports it, in the United States. To sustain long-term breeding activities, scientists have had to piece together short-term grants. Such grants generally favor basic research that can be completed within a grant cycle and are not designed to develop and sustain long-term breeding programs (Wheeler et al. 2015).

This funding gap can result in a delay, or the complete absence, of pedigreed families with well-characterized phenotypes, which are the most appropriate material for basic research aimed at delineating mechanisms of resistance. To compete within a short-term granting cycle, researchers sometimes have no choice but to rely on trees readily available from commercial sources that lack known genetic structure and diversity. Unfortunately, even the extensive data generated by attractive new ‘omics’ technologies (genomics, transcriptomics, proteomics, metabolomics) cannot overcome limitations of experimental design (Zivy et al. 2015).

As an example, consider the resistance research conducted since the emerald ash borer invasion, one of the most devastating invasions of urban and natural forests in the United States in recent history, was discovered in 2002. Since 2003 there have been eight Emerald Ash Borer Research & Technology Development Meetings. Using the cumulative meeting proceedings to estimate research activity, approximately 7% of the research during this period was devoted to host resistance. Of that host resistance research, 61% was basic research to identify resistance mechanisms, 14% was aimed at using transgenics to develop resistant ash, and only 7% (0.5% of the total research) was on breeding for resistance (Mastro et al. 2004, 2005a, b, 2007, 2008; Lance et al. 2010; Parra et al. 2011; Buck et al. 2015). Without available genetic resources, basic research focused on using ‘omics’ techniques to compare constitutive expression levels of genes (transcriptomics), proteins, and metabolites between a commercially available cultivar of an EAB-resistant ash species and several susceptible species, each represented by a single cultivar or seedlot (Eyles et al. 2007; Bai et al. 2011; Whitehill et al. 2011, 2014). Genes, proteins, or phenolic compounds present at higher levels in the resistant cultivar than in the susceptible cultivars were identified as potentially involved in resistance, with the hope that they could be developed into useful biomarkers to support breeding programs or the development of transgenic resistant ash. The functional roles of these candidate resistance compounds, genes, and proteins have not yet been validated, nor have they been confirmed to have utility for selecting for resistance in a breeding program. A recent review of the limitations of transcriptomics emphasized the need for validation, concluding that transcriptomics alone is not an effective way to identify candidate genes associated with specific phenotypes of interest, in part because mRNA abundance does not always correlate with protein abundance, activity, or function (Feder and Walser 2005). As research expanded to include additional species, phenolic compounds initially reported to be specific to an EAB-resistant species (Eyles et al. 2007) were also found in a susceptible species, indicating that variation in these compounds was most likely due to evolutionary divergence rather than to differences in EAB resistance (Whitehill et al. 2012). This highlights the limitations of studies with limited sample size and diversity when the carefully phenotyped, pedigreed families of a breeding program are not yet available.

Identifying, validating, and translating candidate biomarkers into useful tools for breeders is a long-term undertaking and should not be viewed as a shortcut or a replacement for traditional breeding. In fact, none of the breeding programs presented above as examples of success used any high-tech tools, relying instead on traditional phenotyping methods. Careful consideration of the desired downstream applications of basic research is warranted, especially when a fully funded breeding program is lacking. If the goal is to develop sources of resistant material for reforestation, establishing a breeding program should be the first priority. If the desired end result of basic research is the development of indirect selection strategies to accelerate breeding, that research is most efficient and most likely to succeed when conducted within the context of a breeding program. Attempting to launch a resistance breeding program by starting with basic research to identify mechanisms of resistance and translate them into breeding tools is putting the cart before the proverbial horse. Although such research can be helpful, the success of a breeding program does not require knowledge of the underlying molecular basis of resistance. Instead, the genetic resources developed by a breeding program should serve as the focal point that drives successful basic research and provides immediate avenues for validating its findings.

Another driving force behind investing in basic research over breeding programs may be the common misconception that the results will identify genes that can be used to develop a transgenic plant, which is often promoted as a quicker, less expensive route than breeding. However, Strauss et al. (2010) estimated the time to develop a transgenic crop and meet the regulatory requirements for commercial release in the U.S. at between 15 and 27 years. Moreover, this estimate assumed use of a gene that had already been identified and successfully used in another transgenic plant, so it does not include the time devoted to the fundamental research needed to identify and characterize the function of the gene. In crop plants, a survey of the six top companies reported a timeframe of 7 to 24 years, with an average of 13.1 years, to discover, develop, and obtain regulatory authorization for a new transgenic plant, at an average cost of $136 million (McDougall 2011). Even once a transgenic plant is successfully developed, resistance to insects or pathogens based on a single transgene can still be overcome (Tabashnik et al. 1990; Metz et al. 1995; Johnson et al. 1978). This potential lack of durable resistance is especially problematic in forest trees, which remain in place for decades or much longer and are not replaced annually as crop species are. Development and deployment of a transgenic forest tree will require at least a similar investment of time and money and will face similar technical hurdles, possibly greater ones given the challenge of achieving sustained transgene expression through the various life stages of a long-lived tree. Despite decades of significant investment by the forest industry in research on genetically modified trees, no transgenic forest tree has yet received regulatory approval for commercial planting in North America. The steep costs of overcoming technical hurdles and of gaining public acceptance and regulatory approval have caused a dramatic reduction in this field of research (Wheeler et al. 2015).

Of the few remaining research programs in the U.S. working to develop transgenic forest trees, many are focused on resistance to invasive insects and diseases (Bo et al. 2013; Palla and Pijut 2015; Merkle 2016). American chestnut [Castanea dentata (Marsh.) Borkh.], once a dominant canopy tree in the eastern U.S., has been reduced by the fungal pathogen Cryphonectria parasitica to persisting only as seedlings or shrubs; the pathogen is so pervasive and lethal that this is a rare case in which a traditional resistance program alone may not be able to provide a solution for restoration (Steiner et al. 2016). Instead, a breeding program focused on hybridizing with resistant Asian chestnut species was undertaken (Burnham et al. 1986), and over the past decade significant effort has also gone into using genetic transformation to develop a resistant transgenic American chestnut. While transgenic refers to the movement of a gene between unrelated species, cisgenic refers to the movement of a gene between closely related species capable of interbreeding, and cisgenic approaches are therefore perceived as more acceptable to the public. Genetic modification of American chestnut has successfully introduced both cis- and transgenes. Initial reports of American chestnut expressing a cisgenic candidate resistance gene indicate that only moderate levels of blight resistance were achieved, while the best results were obtained with a transgene from wheat. The researchers suggest that, because of the quantitative nature of resistance, pyramiding multiple cisgenes and transgenes may be required to achieve an acceptable level of resistance (Nelson et al. 2014). In addition, breeding the best transgenic trees with the best trees from the American Chestnut Foundation’s 33-year-old hybrid breeding program could provide a way to stack diverse resistance genes, increasing the chances of producing durable resistance (Powell 2016). Once an acceptable level of resistance is achieved with a single transgene or a combination of genes, in chestnut or any other forest tree species, it will still be necessary to incorporate genetic diversity to maintain adaptive capacity across multiple stresses and environments (Steiner et al. 2016). That can only be accomplished by incorporating such a transgenic tree into a traditional breeding program, meaning that even after a resistant transgenic plant has been developed, time, resources, and infrastructure must still be invested in an operational breeding program. Clearly, transgenic technology alone cannot replace traditional breeding programs, though it may supplement existing ones. Indeed, successful deployment of a transgenic tree for restoration will depend on the existence of a breeding program, with the possible exception of industrial forestry plantations or horticultural cultivars, where deploying material with limited genetic diversity may be appropriate.

Genome editing, or genome engineering, refers to the direct manipulation of genome sequences in ways that alter gene expression and the traits controlled by those expression patterns. With new genome editing technologies such as the CRISPR-Cas9 system, gene sequences can be deleted, inserted, or replaced, and multiple genomic sites can be targeted at once, making it a particularly powerful tool (Kushalappa et al. 2016; Puchta 2016). These technologies are now being touted as a potential tool for fast-tracking the development of resistance through genetic engineering (Kushalappa et al. 2016; Haggman et al. 2016), but even in crop species the approach is in its infancy (Scheben and Edwards 2017). Using genome editing to develop resistance requires extensive knowledge of the gene networks and regulatory elements responsible for the resistant phenotype in order to determine which genes to edit and which sequences to target. Such knowledge is often lacking even in crop plants (Scheben and Edwards 2017), and acquiring it would require a substantial amount of research that would benefit from the genetic resources of a breeding program. Clearly, a system such as CRISPR-Cas9, which can perform directed mutagenesis of specified regions of the genome, is a valuable tool for studying the function and regulation of candidate genes. Should genome editing of targeted candidate genes ever yield a tree with resistance to an insect or disease, it will, as with transgenics, not replace traditional breeding programs but instead depend on them to incorporate the genetic diversity needed for operational deployment. Genetic modification of trees, whether through transformation (transgenics) or genome editing, should be thought of as a tool for basic research on gene function that may also supplement or enhance a breeding program (assuming regulatory approval), but it should not be pursued in lieu of a breeding program if the goal is restoration.
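
For readers unfamiliar with how editing targets are chosen, the toy sketch below (Python; the sequence fragment is made up, and only the first step of SpCas9 target identification is shown, scanning for "NGG" PAM sites adjacent to 20-nucleotide protospacers) illustrates why detailed sequence knowledge of candidate genes is a prerequisite; real guide design also screens for off-targets, GC content, and secondary structure.

```python
# Illustrative only: list candidate SpCas9 protospacers (20 nt 5' of an NGG PAM)
# in a hypothetical fragment of a candidate gene.
def find_spcas9_targets(seq: str, guide_len: int = 20):
    seq = seq.upper()
    targets = []
    # A PAM of the form NGG must sit immediately 3' of the protospacer.
    for i in range(guide_len, len(seq) - 2):
        if seq[i + 1 : i + 3] == "GG":
            targets.append((i - guide_len, seq[i - guide_len : i], seq[i : i + 3]))
    return targets

fragment = "ATGGCTTACCGGTTAGACCATGGCAAGTTCCTGGATCGGAACCTTAGCTGG"
for start, protospacer, pam in find_spcas9_targets(fragment):
    print(f"pos {start:3d}  protospacer {protospacer}  PAM {pam}")
```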

Tree species will continue to face attack by insects and pathogens. Genetic resistance provides a tool to manage these threats and maintain forest health. Rapid response to identify and preserve rare resistant trees, together with general genetic conservation efforts, can be critically important when trees are threatened by invasive species; once these valuable resources are lost, they cannot be replaced. Long-term strategic planning is critical to the success of resistance programs. It is important to remember that most, if not all, successful resistance programs in forest trees to date have not used high-technology tools but have relied wholly on traditional tree improvement methods. For this reason, decisions to allocate resources to tool development should be made carefully, taking into account a cost-benefit analysis, the probability of success, linkage to an operational breeding program, and the estimated time until full implementation.

Summary

Insects and pathogens will continue to cause damage and mortality in the world’s forests. Natural genetic resistance represents a first line of defense for trees, but it is often rare when non-native insects or pathogens are the damaging agents. Developing resistance is a management tool that, used effectively, can provide a solution, but too often its use is constrained while other options, such as eradication or containment of the pathogen or insect, remain the primary focus. In many cases, an earlier, more focused effort to find, evaluate, and deploy resistance would avoid long delays in restoring and maintaining healthy forests. Initial research focused on ascertaining whether genetic resistance to the damaging agent is present in the affected species, and on developing the basic screening technologies needed to assay populations, would help fast-track the development of applied breeding programs. Such work can also provide the initial pedigreed and phenotyped families needed to make basic research on mechanisms of resistance productive, research that in turn might be translated into tools to further accelerate breeding. Traditional resistance breeding and tree improvement remain the path to developing resistance as a management tool, with or without new technologies. New programs need to build on lessons learned from past and current successful programs and to look for new ways to increase efficiency. New technologies, once fully developed and validated, offer potentially dramatic gains in breeding program efficiency, provided they are developed under the auspices of a breeding program.