Western white pine (Pinus monticola) and sugar pine (P. lambertiana) are the dominant white pine species in western North America: they include the largest trees, cover the greatest area, and produce the most valuable timber. Both have been severely damaged in most parts of their ranges by white pine blister rust (caused by Cronartium ribicola), which was introduced into a nursery in Vancouver, British Columbia, early in the last century. The disease then spread inexorably, south into the Coast and Cascade Ranges and east into the northern Rocky Mountains, reaching the southern Sierra Nevada in the 1960s. It continues to spread; the history of the epidemic is well documented (Mielke 1943; Smith 1996; Maloy 1997).

Because of their high ecological and economic importance, both species were subjects of early genetic programs started in the Northwest after World War II to improve resistance through selection and breeding. Progress has been summarized periodically (Bingham 1983; Kinloch and Byler 1981; Kinloch and Davis 1996; McDonald et al. 2004).

The need to understand genetic interactions between different rust populations and different resistance mechanisms in the two host species, for prudent deployment of resistance genes, was recognized early on. In 1969, R.T. Bingham, who began the selection and breeding work on western white pine at the U.S. Forest Service’s Intermountain Forest and Range Experiment Station at Moscow, Idaho, designed a study to test these interactions, entitled “Distribution of White Pine Blister Rust Resistance Genes and Rust Races in Forest Service Regions 1, 5, and 6”. The study was implemented through reciprocal exchange of host materials among the three Forest Service regions having primary responsibility for management of western white pine and sugar pine. Area coverage for these species was: Region 1, Rocky Mountains of northern Idaho and eastern Washington, western white pine only; Region 5, Siskiyou Mountains and Sierra Nevada of California, sugar pine only; and Region 6, Cascade Range and Siskiyou Mountains of Oregon and western Washington, both western white pine and sugar pine. The idea was to challenge a small group of families from known parents, highly selected for resistance in each region, with the native inoculum in their own and the other two regions, and to look for possible genetic differences in the way local inoculum from the different regions interacted with the same host genetic material. Selections were chosen to be representative of the main types of resistance recognized in operational programs at the time. These included major genes, both dominant and recessive, that appeared to confer virtual immunity, as well as more complexly inherited kinds and levels of partial resistance (PR; also known as “slow rusting resistance”, and designated “low-level resistance” in Bingham’s study plan). Artificial inoculation of 2-year-old seedlings was anticipated in the original plan, but as facilities for this were not available in California at the time, a field “disease garden” approach, already being used effectively to progeny test sugar pine candidate selections (Kinloch and Davis 1996), was taken. In the event, this was a fortunate decision, because poor survival, especially of sugar pine, at the other two locations compromised those tests, and the data were never published. In this paper, we report results of almost 30 years of observations on the performance of these families of both species in the field at a single location in California.

Materials and methods

Parents were selected based on progeny performance in previous tests involving greenhouse, nursery, or field inoculations. Major gene resistance (MGR) had been implicated (and subsequently demonstrated) by Mendelian segregation in certain sugar pine and western white pine families from California and Oregon (Cr1 and Cr2, respectively; Kinloch and Littlefield 1977; Kinloch et al. 1999; Kinloch and Dupper 2002), and two recessive resistance genes conferring high resistance or immunity had been hypothesized in Idaho western white pine selections (Hoff and McDonald 1971; McDonald and Hoff 1970). In addition, certain parents of both species were thought to transmit PR, expressed as low infection frequency, increased proportions of infections that aborted (bark reactions; Hoff 1986; Kinloch and Davis 1996), or both mechanisms, resulting in reduced infection and mortality rates in their progeny (Kinloch and Byler 1981). Although these were collectively designated “low level” resistance in Bingham’s study plan, subsequent work by his group in the early 1970s clearly implicated two pairs of recessive genes as responsible for most of the resistance observed in western white pine in the Idaho program (Hoff and McDonald 1971; McDonald and Hoff 1970). Putative genotypes of all parent selections and the expected segregation of their families are given in Table 1.

Table 1 Identities, sources, and putative genotypes of sugar pine and western white pine parent trees, and expected ratios (susceptible/resistant) of their full-sib families in a blister rust disease garden

All families except controls were from controlled pollinations. Open-pollinated seed lots from woods-run or known susceptible parent trees served as controls (two from each species and region). Altogether, there were 15 sugar pine families, including controls, eight from California and seven from Oregon, and 16 western white pine families, including controls, evenly divided between Idaho and Oregon.

Seedlings were grown in tar paper containers for 1 year, then transplanted to a permanent U.S. Forest Service field testing site in the Siskiyou Mountains near Happy Camp, California, in the early spring of 1971. Seedlings were planted in a randomized complete block design with three replications of two blocks. Each block comprised each family of sugar pine and western white pine in 12-tree row plots. The site was on a nearly level bench, at an elevation of 2,600 feet, not far from the site where blister rust was first discovered in California. Ribes species are native to the area, but were augmented at the site by transplanting bushes from nearby natural stands and interplanting them at intervals between rows of test seedlings. The most prevalent species used was R. sanguineum, because of its size, rapid development, susceptibility to rust, and relative ability to retain infected leaves during summer droughts.

Seedlings were observed for characteristic blister rust needle spots in the spring of 1973, and for bark symptoms in 1973–1976, 1980, 1993, and 1996–1998. Certain trees that remained uninfected were reexamined in subsequent years.

Conformance of families to the phenotypic segregation ratios predicted by the genetic hypotheses in Table 1 was tested by chi-squared analysis, both before and several years after the observed breakdown of resistance in sugar pine (∼1976) and western white pine (∼1983) that resulted from the appearance of blister rust races of wider virulence at the site (Kinloch and Dupper 2002).
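The conformance test is a standard goodness-of-fit comparison. The minimal sketch below (in Python with scipy, which is an assumption; the software used for the original analysis is not specified here) applies a chi-squared test to hypothetical counts for a family expected to segregate 1:1:

```python
# Minimal sketch, not the study's code: chi-squared goodness-of-fit test of an
# observed segregation against a Mendelian expectation from Table 1.
from scipy.stats import chisquare

def test_segregation(n_susceptible, n_resistant, expected_susceptible_fraction):
    """Return (chi2, p) for observed counts against an expected segregation ratio."""
    total = n_susceptible + n_resistant
    expected = [total * expected_susceptible_fraction,
                total * (1 - expected_susceptible_fraction)]
    return chisquare([n_susceptible, n_resistant], f_exp=expected)

# Hypothetical counts for a heterozygous x homozygous-recessive cross at Cr1,
# expected to segregate 1:1 (50% infection): 40 infected and 32 uninfected trees.
chi2, p = test_segregation(40, 32, expected_susceptible_fraction=0.5)
print(f"chi2 = {chi2:.2f}, p = {p:.2f}")  # p > 0.05: consistent with 1:1
```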

Results

The epidemic

The epidemic started soon after plantation establishment. Needle spots were observed on seedlings in the spring of 1973, signifying inoculation and infection in the preceding autumn, and possibly as early as the autumn of 1971. Average incidence of needle infection was 53% on sugar pine families (range 29–80%) and 29% on western white pine families (range 15–48%). Much of this difference between the two species may have been due to corresponding differences in size and physiological vigor; western white pine seedlings were much smaller than sugar pine at the time of planting and for several years thereafter. First bark symptoms appeared in 1973, and by 1974 a few highly susceptible sugar pine families and controls were 100% infected and producing aeciospores. Western white pine controls at this time were less than 60% infected. Sugar pines averaged 2.5 cankers per infected tree (range 1.7–3.3; controls 2.4) compared with western white pines at 1.3 (range 1.0–1.8; controls 1.8). A brief hiatus in the epidemic, indicated by a flattening of infection and mortality rate curves in all families and controls of both species, occurred between about 1974 and 1976 (Figs. 1 and 2). Major wave years were known to occur in 1972, 1976, 1983, 1989, 1993, and 1997, with relatively minor amounts of infection occurring in most of the intervening years. (Evidence of wave years usually becomes apparent 1 or 2 years after the event, in the form of stem symptoms.) The rapidity and intensity of infection on sugar pine controls and highly susceptible families indicate that the probability of disease escape was virtually nil, and that any trees still surviving by the 1980 year of record were expressing strong resistance.

Fig. 1

Infection rates (diamonds) and mortality rates (squares) of white pine blister rust for full-sib sugar pine (SP) families in a disease garden near Happy Camp, California, 1973–1993. Family identification, state of origin (OR, Oregon; CA, California), and genotype at the locus for major gene resistance (Cr1, abbreviated R) are given in the first line of each caption. Sample size is on the second line. Parents suspected of having additional genes for “low level” resistance are denoted by (//). Dashed lines represent the maximum level of infection expected under Mendelian segregation for the parental cross indicated (cf. Table 1). Arrow indicates year of appearance of vcr1, the putative gene with specific virulence to Cr1

Fig. 2

Infection rates (diamonds) and mortality rates (squares) of white pine blister rust for full-sib western white pine (WP) families in a disease garden near Happy Camp, California, 1973–1993 or 1998. Family identification, state of origin (OR, Oregon; ID, Idaho), and genotype at the locus for major gene resistance (Cr2, abbreviated R) are given in the first line of each caption. Sample size is on the second line. Parents suspected of having additional genes for “low level” resistance are denoted by (//). Dashed lines represent the maximum level of infection expected under Mendelian segregation for the parental cross indicated (cf. Table 1). Arrow indicates year of appearance of vcr2, the putative gene with specific virulence to Cr2

Major resistance genes in both pine species are vulnerable to blister rust races with specific virulence to them. Although Mendelian segregation has not been demonstrated in the pathogen, there is strong evidence that this specificity is based on a gene-for-gene relationship (Kinloch and Dupper 2002). The presumptive alleles have been designated vcr1 and vcr2, corresponding to the major resistance genes that they neutralize: Cr1 in sugar pine and Cr2 in western white pine (Kinloch et al. 1999). These races appeared on the site sequentially, greatly altering the course of the epidemic for sugar pine and, to a lesser extent, for western white pine. vcr1 first became established on sugar pine in 1976, an unusually intense wave year, and soon became predominant, to the near exclusion of the wild-type race (Kinloch and Comstock 1981; Kinloch and Dupper 1987). The first infections on western white pine from vcr2 were dated to 1983 (though not detected until 1993). Inoculum assayed from telia taken from infected Ribes on the site confirmed the presence of vcr2 (Kinloch et al. 1999, 2004).

Family performance: sugar pine

By 1976, controls were more than 90% infected. Mortality lagged by a few years, but by 1980 most controls, and by 1993 all of them, were dead from rust. Sugar pines with Cr1 performed as expected, segregating in Mendelian ratios up until the appearance of vcr1 in 1976. Up to that time, the one family with both parents heterozygous for Cr1 had segregated in a 1:3 ratio (susceptible/resistant; 25% infection; SP1, Fig. 1); seven families with one heterozygous and one homozygous recessive parent segregated in a 1:1 ratio (50% infection; SP5, 8, 9, 15, 17, 18, 21); and one family with one parent homozygous for Cr1 did not segregate (0% infection; SP16; Table 1 and Fig. 1). After the appearance of vcr1 (1976), Cr1 was neutralized, and all families with this gene showed highly significant chi-squared values (Table 1) and a sharp inflection in their infection and mortality rate curves (Fig. 1). Most went to 100% infection by 1993, but SP16 and SP17 were notable exceptions, with about 40 and 20%, respectively, of their offspring remaining healthy in 1993. Both families had K71, known to possess PR (Kinloch and Byler 1981; Kinloch and Davis 1996), as a common seed parent. In another family (SP20, Fig. 1), however, K71 seedlings had infection rates equivalent to controls. Families with K17 as a common parent also had a few trees either healthy or at least surviving (Fig. 1, SP15, 17, 18, 21). In 2000, we inspected these trees and surviving K71 trees for new infections from the 1993 or 1997 wave years, but found none.
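The expected infection levels cited above follow directly from the parental genotypes at the Cr1 locus. The short sketch below (illustrative genotype strings only, assuming a single dominant resistance allele 'R') derives them by enumerating gamete combinations:

```python
# Illustrative only: expected proportion of susceptible (infectable) offspring at
# a single dominant resistance locus such as Cr1, derived by enumerating gametes.
# 'R' = resistance allele, 'r' = susceptible allele.
from itertools import product

def expected_infection(parent1, parent2):
    """Fraction of offspring carrying no 'R' allele, i.e., fully susceptible."""
    offspring = [a + b for a, b in product(parent1, parent2)]
    susceptible = [g for g in offspring if 'R' not in g]
    return len(susceptible) / len(offspring)

print(expected_infection('Rr', 'Rr'))  # 0.25 -> 1:3 segregation, as in SP1
print(expected_infection('Rr', 'rr'))  # 0.50 -> 1:1 segregation, as in SP5 etc.
print(expected_infection('RR', 'rr'))  # 0.00 -> no segregation, as in SP16
```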

With the exception of K71, other parents selected for PR performed poorly, and could not be distinguished from controls.

Family performance: western white pine

Controls of western white pine did not become infected nearly as rapidly as their sugar pine counterparts. Nevertheless, by 1993 most had been killed by rust (Fig. 2, WPC A-D). By 1980, four Oregon families from known Cr2 parents (WP1–4) had the expected phenotypic ratios of a segregating dominant gene, except that families WP1 and WP2 had fewer than expected infected trees (Table 1, Fig. 2). Although suspected earlier, vcr2 was not confirmed on the site until 1996, in inoculum taken from the site and assayed on known Cr2 western white pine in greenhouse tests (Kinloch et al. 1999). Reinspection of western white pine in 1998 revealed 62 incipient and some older infections on Cr2 trees that had been rust-free through 1980.

Infections were dated and found to coincide with the major wave years 1983 (11%), 1989 (39%), and 1993 (34%), and with the minor wave year 1995 (16%). At least half of these would not have been visible in the spring of 1993, the last complete inspection of all families in the plantation. Part of the uncertainty at that time was caused by the presence on western white pines of many cankers caused by Atropellis sp., which mimic blister rust cankers in early stages of development. These were mostly confined to branches and did only minor damage. So far, damage and mortality caused by vcr2 on the four western white pine families with Cr2 have not been as severe as those caused by vcr1 on sugar pine with Cr1, and one of them, WP2, was not affected at all (Table 1, Fig. 2).

Families WP5 and WP6 from Oregon, each with two putative PR parents, had lower infection and mortality rates than controls. WP5 approached the performance of Cr2 families (and Idaho families; see below). However, two families, WP3 and WP4 (Fig. 2), each with a different Cr2 pollen parent but PR seed parent in common, had higher infection rates than family WP2, which had no parent with putative PR.

Rate curves for western white pine controls from Idaho were essentially the same as for Oregon controls. Families from Idaho showed relatively steady infection and mortality rate curves that culminated between 36 and 56% infection by 1993 and stabilized thereafter (Fig. 2). By 1998, none departed substantially from the segregation ratios expected under the hypothesis of two independently inherited recessive gene loci, except that WP10 had significantly fewer infected trees than expected, while WP11 had a marginally significant excess (Table 1). Surviving trees were searched for new, incipient infections from 1997 through 2000, but none were found, even though 1997 was a known wave year at this test site.

Tolerance, the ability to survive with infection, was very low in both species (Figs. 1 and 2).

Discussion

Several of the objectives anticipated in the original study plan were realized long before completion of this analysis, in spite of the lack of data from two of the regions involved. MGR was confirmed in both sugar pine (Kinloch and Littlefield 1977) and western white pine (Kinloch et al. 1999); virulence to both of these genes was documented (Kinloch and Comstock 1981; McDonald et al. 1984; Kinloch et al. 1999), with probable gene-for-gene specificity (Kinloch and Dupper 2002); PR was demonstrated, and found to be relatively durable (Hoff 1986; Kinloch and Byler 1981; Kinloch and Davis 1996; Sniezko et al. 2004); and the relatively greater susceptibility of sugar pine over western white pine was confirmed, with diverse sources of each species, over a long duration in a common garden setting (Sniezko et al. 2004). None of these reports, however, dealt with all of these factors interacting together in two different species over the course of nearly half a rotation age. In this paper we show that some PR is strongly inherited and stable; that MGR is labile, but can remain effective when protected by PR; and that the two white pine hosts react differently to the same inoculum and environment.

Families in both species with a dominant resistance gene performed as expected, segregating in Mendelian ratios up until the time the major genes were neutralized by their corresponding genes for virulence in the pathogen (Table 1; Figs. 1 and 2). Western white pine trees with Cr2 remained resistant for about 6 years beyond the time that sugar pines with Cr1 had succumbed to vcr1. After attack by vcr2, infection rate curves on Cr2 genotypes were not as steep as those on sugar pine attacked by vcr1, and in one family (WP2) did not increase at all. Reasons for these differences between the two virulences are not known, but they could be attributable to more numerous or more effective PR genes in the genetic background of these parents, or to differences in gene frequencies between the two virulences.

PR clearly exists in sugar pine, but predicting it from earlier tests proved erratic (cf. Table 1 and Fig. 1), emphasizing the need for long-term validation of complex traits in field trials. None of the Oregon families showed PR, including three originally thought to have it (SP8, 9, 10; Fig. 1). Two California sugar pine families did: both SP16 and SP17 had K71, known to have PR (Kinloch and Byler 1981; Kinloch and Davis 1996), as a common seed parent. In a third family (SP20, Fig. 1), however, K71 seedlings had infection rates equivalent to controls. Inconsistent transmission of PR from different sugar pine parents has been noted before (Kinloch and Byler 1981; Kinloch and Davis 1996), and may imply that PR is a function of specific, rather than general, combining ability.

All western white pine families, including controls, had lower infection and mortality rates than sugar pine families and controls (Figs. 1 and 2). Relative susceptibilities appear to be quantitatively inherited and intrinsic properties of the two species.

Most importantly, the data demonstrate that resistance in western white pine families lacking Cr2 is not specifically vulnerable to the virulence gene vcr2. The mode of inheritance of this resistance, however, remains uncertain.

Western white pine parents from Idaho were selected based on progeny performance in nursery inoculations (identified here from families WP10, 11, 12, and 14 in Table 1 and Fig. 2) and/or from resistant F1 individuals surviving these tests (WP9 and WP15; Fig. 2; Bingham et al. 1960; Bingham 1966). Parents came from a population highly selected for phenotypic resistance in stands that had been heavily and repeatedly challenged by blister rust. Bingham (1983) estimated that the incidence of trees free of rust in these stands was no more than one in 10,000. His original working hypothesis was that any resistance expressed would be inherited quantitatively (Bingham et al. 1960). Yet, there was evidence of recessive gene segregation ratios in progenies of several parents represented in early inoculation trials (Bingham et al. 1960; Bingham 1966), including parents 17, 19, 22, and 58, all in the pedigrees of five of the families in Table 1.

Although Bingham (1966) emphatically rejected this interpretation, subsequent artificial inoculation experiments by his group tentatively identified two independently inherited recessive genes responsible for most of the observed resistance in the Idaho populations (Hoff et al. 1973). One of these loci was thought to control premature shedding of infected needles before the pathogen reached bark tissues (McDonald and Hoff 1970), while the other caused necrosis in needle and subjacent shoot tissues that arrested further fungal growth (Hoff and McDonald 1971). This interpretation became the basis of the segregation ratios hypothesized for the Idaho parents (Table 1).

F2 progenies from this population, on average, were 66% rust-free after artificial inoculation (Hoff et al. 1973). A weakness of the recessive gene hypothesis was that no homozygous families with 100% resistance were observed, although such families would have been expected from matings between homozygous recessive F1 parents (Hoff et al. 1973), unless something like incomplete penetrance or variable expressivity is invoked. The hypothesis was further burdened by the implication that all of the parents involved were heterozygous at both loci, a requirement that raises the question of how these parents, phenotypically resistant but genotypically susceptible (by definition), would have been selected in the first place. Finally, there were consistent and significant deficiencies in the numbers of resistant seedlings in most families, as well as discrepancies in performance between selfed and outcrossed families of the same parents (Kinloch 1982).

Field tests of this material are few, and equally ambiguous. None includes individual pedigrees of putative parental genotypes (except the data reported in this paper). Evidence from bulked F1 progenies of selected parents with good general combining ability (GCA × GCA) in early nursery tests showed that only 21% became infected after 11–15 years of exposure in northern Idaho, versus 68% for controls and 80% for natural reproduction nearby (Steinhoff 1971). In a similar test in Bingham’s program that additionally included bulked F2 seed lots, F2s had only 12% infection after 2 years of field exposure on a high-hazard site, compared with 31% infection of bulked F1 lots and 76% infection of controls (Bingham et al. 1973). Between 6 and 12 years of exposure, controls reached 100% infection, but F2s only 50 to 72%. However, after 26 years in the field, F2s became almost completely infected (McDonald and Dekker-Robinson 1998). The latter authors interpreted this rate-reducing resistance of the F2 stock as an expression of horizontal resistance (sensu Vanderplank), based on additive genetic variance, a return to Bingham’s original hypothesis.

The data on the small Idaho population reported here are from six F1 and F2 full-sib families whose parents were among the elite of Bingham’s early selections, based on their progenies’ performance in earlier artificial inoculation trials (Bingham et al. 1960; Bingham 1966; see also Bingham 1983 for a comprehensive discussion of all early progeny tests involving these and other parent trees). Levels of infection observed, which ranged between 36 and 56%, are reasonably consistent with the allelic compositions hypothesized in Table 1 and Fig. 2, which were largely based on the needle-shed and tissue-necrosis mechanisms (McDonald and Hoff 1970; Hoff and McDonald 1971). Two of these families (WP9 and WP15; Table 1, Fig. 2) represent crosses between resistant F1 survivors of the elite parents, and had the lowest infection and mortality of the Idaho white pines: 39 and 36%, respectively, substantially below the 50% expected (Fig. 2). Oregon families WP2, 3, and 5 performed similarly. Considered as a group, these ratios (∼1:1, or 50% resistance) could fit several simple genetic models, including dominance, incomplete dominance, two recessive genes acting together (giving a 9:7 ratio, or 44% resistance, which in these data is statistically indistinguishable from 50% resistance), or a few quantitative trait loci (QTL). However, none of the data permits an unambiguous interpretation of inheritance: families are few, and there are internal inconsistencies among the several crosses. Because of the stability reached by the six Idaho families (Fig. 2), purely additive effects seem less likely; a few QTL seem more probable.
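The claim that a 9:7 ratio cannot be separated from 1:1 at these family sizes is easy to verify. A minimal sketch follows, assuming roughly 72 trees per family (six 12-tree row plots, with no losses to other causes); both the sample size and the exact 9:7 split are illustrative assumptions:

```python
# Illustrative check: a family of ~72 trees segregating exactly 9:7 (43.75%
# resistant) tested against the 1:1 expectation (50% resistant).
from scipy.stats import chi2 as chi2_dist

n = 72
observed = [n * 9 / 16, n * 7 / 16]   # susceptible, resistant under 9:7
expected = [n / 2, n / 2]             # susceptible, resistant under 1:1
stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
p = chi2_dist.sf(stat, df=1)
print(f"chi2 = {stat:.2f}, p = {p:.2f}")  # chi2 ~ 1.1, p ~ 0.29: not distinguishable
```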

Our data show that MGR is powerful but unstable, as expected. But they also indicate that strong, independent, and heritable PR exists to varying degrees in different genotypes of both species. Although proof is lacking, the PR we have observed over the decades in these species (e.g., SP16, 17, Fig. 1; WP1–4, Fig. 2) behaves like non-specific resistance, and is thus relatively invulnerable to further genetic change in the pathogen.

The genes controlling these two very different mechanisms and inheritance patterns present opportunities to exploit the best features of each. MGR confers virtual immunity to all extant variability in the rust except genotypes with the appropriate vcr gene for virulence. These genotypes occur at low to very low frequencies in wild populations (Kinloch et al. 2004), although they can increase exponentially when selected for by concentrations of MGR hosts. Such selection ordinarily starts in relatively infrequent seasons of unusually high inoculum production, the so-called wave years, when the absolute frequency of vcr mutants is greatest. Combining PR with MGR should reduce the effective inoculum potential, and thus also the probability of infection, particularly by vcr spore genotypes.
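One way to see this argument quantitatively is to treat each basidiospore landing on a tree as an independent trial. The toy model below (all parameter values are hypothetical, not measured in this study) shows how lowering per-spore infection success through PR lowers the chance that a rare vcr genotype establishes on a host carrying MGR:

```python
# Toy model with hypothetical parameters: probability that at least one spore of
# a rare virulent genotype (vcr) establishes on a tree carrying MGR, with and
# without PR reducing per-spore infection success.
def p_vcr_infection(n_spores, vcr_freq, per_spore_success):
    """P(at least one successful vcr infection), treating spores as independent."""
    return 1.0 - (1.0 - vcr_freq * per_spore_success) ** n_spores

n_spores = 100_000   # spore load reaching a tree in a wave year (assumed)
vcr_freq = 1e-4      # frequency of vcr in the spore population (assumed)

print(p_vcr_infection(n_spores, vcr_freq, per_spore_success=0.010))  # MGR alone: ~0.10
print(p_vcr_infection(n_spores, vcr_freq, per_spore_success=0.002))  # MGR + PR:  ~0.02
```

Under these assumed numbers, a five-fold reduction in per-spore success yields roughly a five-fold reduction in the chance that vcr establishes, which is the rationale for stacking PR with MGR.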