Introduction

Over 40% of the world’s population is currently micronutrient deficient (a condition referred to as ‘Hidden Hunger’), resulting in numerous health problems, inflated economic costs borne by society, and learning disabilities for children (Branca and Ferrari 2002; Grantham-McGregor and Ani 1999; Ramakrishnan et al. 1999; Sanchez and Swaminathan 2005). The dietary intake of iron (Fe) of more than two billion people worldwide is inadequate (Stoltzfus 2001), making Fe-deficiency the most common micronutrient deficiency in the world (Graham et al. 2001). Almost two-thirds of all childhood deaths are associated with nutritional deficiencies (Caballero 2002). Although a diversification of diet to include micronutrient-rich traditional foods is the preferred solution to these challenges (Frison et al. 2006), many of the world’s poorest people do not have access to a wide variety of nutritionally dense food crops (Graham et al. 2001).

The large increases in the percentage of people suffering from micronutrient malnutrition over the last four decades coincide with the global expansion of high-yielding, input-responsive cereal cultivars (Welch 2002; Welch and Graham 2002). While global cereal grain yields have increased dramatically since the Green Revolution (Abeledo et al. 2003; Borlaug 1983; Slafer and Peltonen-Sainio 2001), global food systems are not providing sufficient micronutrients, resulting in an increased prevalence of micronutrient deficiencies (Welch 2002).

For much of the world’s population without access to diverse food crops, staple cereal grains are the primary dietary sources of micronutrients (Graham et al. 2001). Cereal crops could provide a significant increase in the overall micronutrient availability in human populations whose diets are dominated by staple crop consumption. For example, nutritional screening of staple food crops has suggested that micronutrient content of cereals could be twice the level found in commonly grown cultivars (Welch and Graham 1999). Bouis (2003) estimated that a doubling of Fe density in a staple crop would result in a 50% increase in total Fe intake for a population relying heavily on a single food (e.g., rice in Bangladesh). Most micronutrients account for less than 0.1% of the dry weight of a food, which indicates that significant increases in micronutrient levels are theoretically possible (DellaPenna 1999).
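The arithmetic behind the Bouis (2003) estimate is worth spelling out: if a single staple supplies about half of a population's total Fe intake, then doubling the staple's Fe density raises total intake by about 50%. A minimal sketch of that calculation, using hypothetical intake figures rather than values from the study:

```python
# Illustrative arithmetic behind the Bouis (2003) estimate. The intake
# figures are hypothetical; the point is the proportional reasoning.
staple_fe = 5.0   # mg Fe/day supplied by the staple (assumed: half of total intake)
other_fe = 5.0    # mg Fe/day supplied by all other foods (assumed)

baseline = staple_fe + other_fe          # total intake before biofortification
biofortified = 2 * staple_fe + other_fe  # staple Fe density doubled

increase = (biofortified - baseline) / baseline
print(f"Total Fe intake rises by {increase:.0%}")  # prints "Total Fe intake rises by 50%"
```

The same reasoning scales with the staple's dietary share: a staple supplying only a quarter of total Fe would yield a 25% increase when its Fe density doubles.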

The objectives of this study were to compare the concentrations of eight minerals, namely calcium (Ca), copper (Cu), iron (Fe), magnesium (Mg), manganese (Mn), phosphorus (P), selenium (Se), and zinc (Zn), in historical and modern spring wheat cultivars, and to explore the possibility of any biological trade-offs between mineral concentration and yield. Finally, we sought to estimate the dietary significance of cultivar differences in mineral concentration.

Materials and methods

Experimental design

A randomized complete block design nursery containing 63 spring wheat cultivars (56 historical, 7 modern) was grown in Pullman, Washington, in 2004 and 2005. The historical cultivars were selected randomly from a larger group of spring wheat cultivars that were widely grown in the Pacific Northwest region of the USA from 1842 to 1965. The seven modern cultivars were among the most widely grown spring wheat cultivars in Washington State in 2003, representing approximately 69% of the total spring wheat area in the state (USDA 2005). Thirty-seven cultivars were in the soft white market class, 20 were in the hard red market class, four were in the hard white market class and two cultivars were in the soft red market class. There were three replicates of each cultivar in 2004 and four replicates of each in 2005.

The soil and climatic conditions found in Pullman are generally representative of the approximately 1.3 million hectare wheat-based farming region known as the Palouse, located in the higher rainfall zones (∼500 mm/year) of Eastern Washington and Northern Idaho. The nurseries were grown at the same location in both years on a Palouse silt loam soil. The nurseries were fertilized with PerfectBlend® fertilizer at the rate of 6.05 kg/ha each of N, P, and K, drilled with the seed at planting. No fungicidal or insecticidal seed treatments were used. This management practice was intended to reflect low-input wheat production in the Pacific Northwest. Plots consisted of seven rows at 18 cm spacing, 2.5 m long and 1.25 m wide. Plots were harvested with a Hege plot combine and cleaned with a Hege seed cleaner, both fitted with stainless steel sieves. The field replicates were bulked according to cultivar each year and then sent to the Grand Forks Human Nutrition Research Center for mineral analysis.

Mineral analysis

The experimental design for the mineral analysis was a randomized complete block with three replicates per sample. Wheat samples were individually ground to a powder (15 s) in a coffee grinder (Krups, Model 208) with a stainless steel chamber and blade. The chamber and blade were thoroughly cleaned between samples to prevent cross contamination. For the analysis of Ca, Cu, Fe, Mg, Mn, P, and Zn, ∼0.4 g of each wheat sample was weighed in triplicate into separate Pyrex beakers. Watchglasses were placed on the beakers and the samples were ashed in a muffle furnace at 200°C for 2 h and at 490°C for 12 h. The ash was dissolved in 10 ml of concentrated nitric acid (11.03 mol/l) (J.T. Baker Instra-Analyzed) and heated on a hotplate at 120°C for 2 h. Two milliliters of 30% hydrogen peroxide (J.T. Baker) were slowly added to each beaker and the mixture was refluxed for 12 h. Samples were allowed to dry, then ashed again in a muffle furnace following the procedure outlined above. The resulting white ash was dissolved in 2 ml of 6 N HCl (6 mol/l) (J.T. Baker Instra-Analyzed) with heating and subsequently diluted to 10 ml with deionized water. Samples were analyzed simultaneously for Ca, Cu, Fe, Mg, Mn, P, and Zn by Inductively Coupled Argon Plasma (ICAP) spectrometry using a Perkin Elmer 3300 instrument. Four NIST (National Institute of Standards and Technology, Gaithersburg, MD, USA) durum wheat standards and four acid blanks were run with each batch of samples.
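The grain concentrations follow from the dilution scheme above: an ICAP reading on the 10 ml final digest is scaled back to the ∼0.4 g sample mass. A sketch of that back-calculation (the function and the example reading are illustrative, not taken from the study):

```python
def mineral_conc_mg_per_kg(reading_mg_per_l, final_volume_ml=10.0,
                           sample_mass_g=0.4, blank_mg_per_l=0.0):
    """Convert a blank-corrected instrument reading on the diluted digest
    into a grain mineral concentration (mg mineral per kg dry sample)."""
    mg_in_digest = (reading_mg_per_l - blank_mg_per_l) * final_volume_ml / 1000.0
    return mg_in_digest / (sample_mass_g / 1000.0)

# Hypothetical example: a 1.2 mg/l Fe reading on the 10 ml digest of a
# 0.4 g sample corresponds to 30 mg Fe per kg of grain.
print(round(mineral_conc_mg_per_kg(1.2), 2))  # 30.0
```

Certified reference materials, such as the NIST durum wheat standards run with each batch, serve to check that this recovery calculation and the digestion itself are accurate.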

For the analysis of Se, approximately 2.0 g of sample were placed into 100 ml Pyrex beakers and mixed with 10 ml concentrated nitric acid (11.03 mol/l), 5 ml of 40% magnesium nitrate solution, and 2 ml of concentrated HCl (4.8 mol/l). The beakers were covered with watch glasses for 24 h. The samples were then heated at ∼80°C for 24 h. The covers were removed and the samples were allowed to dry. They were then placed in a muffle furnace and held at 490°C for 7 h. After the samples cooled, 5 ml of deionized water and 5 ml of concentrated nitric acid (11.03 mol/l) were added and heated for 2 h at ∼80°C on a hotplate. The covers were then removed and the samples were allowed to air dry. The samples were returned to the furnace and held at 490°C for 4 h. Four milliliters concentrated HCl (4.8 mol/l) were added to the cooled samples and they were heated until the ash was dissolved. The ash was diluted to a final volume of 10 ml and analyzed by atomic absorption spectrometry using hydride generation. Three NIST durum wheat standards and three acid blanks were run with each batch of samples.

Statistical analysis

Data were analyzed by analysis of variance using PROC GLM (SAS Institute, Cary, NC). Levene’s test was used to test for homogeneity of variance across years, and normality was checked using the Shapiro-Wilk test in PROC UNIVARIATE (SAS Institute). Pearson’s correlation coefficients were calculated from mean trait values and used to estimate phenotypic relationships between traits of interest. Significance was assessed at the 5% probability level, unless otherwise stated.
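The Pearson coefficients reported in the Results can be reproduced from cultivar mean values with the standard formula. A standard-library sketch, with invented cultivar means chosen to mimic the kind of negative yield-mineral association found in this study:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented cultivar means: grain yield (kg/ha) against a mineral
# concentration (mg/kg) that declines as yield rises.
yield_means = [1100, 1300, 1500, 1700, 1900]
mineral_means = [42, 40, 37, 35, 31]
print(round(pearson_r(yield_means, mineral_means), 3))  # -0.993
```

In practice the coefficients would be computed over all 63 cultivar means, with significance judged at the 5% level as stated above.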

Bioavailability, Recommended Dietary Allowance (RDA) and Adequate Intake (AI) estimates

Bioavailability/absorption estimates were obtained with wheat as the target crop, for each mineral, using multiple sources (Table 1). RDA and AI levels were adapted from Dietary Reference Intake (DRI) reports from the National Academies Press (Table 2). Reports included the DRI for Calcium, Phosphorus, Magnesium, Vitamin D, and Fluoride (1997); DRI for Vitamin C, Vitamin E, Selenium, and Carotenoids (2000); and DRI for Vitamin A, Vitamin K, Arsenic, Boron, Chromium, Copper, Iodine, Iron, Manganese, Molybdenum, Nickel, Silicon, Vanadium, and Zinc (2001).

Table 1 Percent bioavailability/absorption estimates obtained using wheat as the target crop for each mineral
Table 2 Recommended Dietary Allowance (RDA) or Adequate Intake (AI) levels of Ca, Cu, Fe, Mg, Mn, P, Se, and Zn

Results

Mineral concentrations and yield

Highly significant differences among the 63 wheat cultivars were found for yield and for mineral concentrations of all eight nutrients (P < 0.0001). Modern cultivars had higher yields than historical cultivars (P < 0.0001). Mean yield was 1,090 ± 79 kg/ha for historical cultivars and 1,915 ± 242 kg/ha for modern cultivars. For seven of the eight nutrients, the historical cultivars had significantly higher grain mineral concentrations than the modern cultivars (Table 3). Only Ca showed no significant difference between the historical and modern eras (P = 0.07). Highly significant variation existed among wheat cultivars for concentration of each mineral, indicating the potential for genetic improvement (Fig. 1, Table 4). A significant genotype × year interaction was found for each mineral. These interactions were scalar in nature and did not represent significant changes in cultivar ranks between years. Additionally, the sums of squares for years were very small compared to those for genotype, indicating that most variation was due to genotype rather than year.

Table 3 Mineral concentration in historical and modern wheat cultivars
Fig. 1

Regressions of wheat seed mineral nutrient content on the date of variety release for soft white (SW) spring wheat (open triangle, light solid line) and hard red (HR) spring wheat (closed diamond, dark solid line). Regression lines represent the best-fit simple linear regression models

Table 4 Variety name, market class (MC), year of release or introduction to the Pacific Northwest (year), grain yield (kg/ha), Ca, Cu, Fe, Mg, Mn, P, Se and Zn concentrations, for each spring wheat cultivar used in the study

Yield—mineral concentration trade-off

For Ca, Cu, Mg, Mn, P and Se, yield was negatively correlated with mineral concentration (Table 3). Fe and Zn showed no significant correlation with yield (Table 3). A trade-off between yield and Ca, Cu, Mg, Mn, P, and Se concentrations is indicated; however, this trade-off could either be genetically based or the result of inadvertent negative selection pressure on mineral concentration by wheat breeders.

Thousand kernel weight (TKW) was positively correlated with grain yield (r = 0.60, P < 0.001), and concentrations of Cu, Mn, Se and Zn (Table 5). There was no correlation between TKW and concentrations of Ca or Fe, and a negative correlation existed (P = 0.028) between TKW and the concentrations of Mg and P. The mean TKW of the modern wheat cultivars was 34.6 g, whereas the historical cultivars had a lower mean TKW of 30.0 g (P = 0.004).

Table 5 Correlations between minerals in spring wheat cultivars grown in Pullman, WA in 2004 and 2005 using Pearson correlation coefficients

Regressions of mineral concentrations on year of release were computed separately for the hard red and soft white market classes (Fig. 1). Significant decreases over time were observed among soft white cultivars for all minerals except Ca and Mg. Among hard red cultivars, only Zn decreased over time, whereas Mg increased over time (Fig. 1). All other mineral nutrients remained stable among HR cultivars over the past 120 years.
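Each trend in Fig. 1 is a simple linear regression of mineral concentration on year of release, so the slope measures the rate of change in mg/kg per year. A standard-library least-squares sketch, with invented soft white Zn values illustrating a declining trend:

```python
def linreg_slope(x, y):
    """Least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / sxx

# Invented example: Zn concentration (mg/kg) versus year of release for a
# declining soft white trend of the kind shown in Fig. 1.
years = [1880, 1910, 1940, 1970, 2000]
zn = [33.0, 31.5, 30.0, 28.5, 27.0]
print(f"{linreg_slope(years, zn):.3f} mg/kg per year")  # -0.050 mg/kg per year
```

A slope statistically indistinguishable from zero corresponds to the "stable" trends described for the hard red class.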

Ca concentration was positively correlated with that of all the other minerals (r = 0.19–0.59, Table 5), indicating that selection for increased grain concentration of Ca may increase grain contents of other minerals if the correlations are genetically based. Eight cultivars (Lemhi, Lemhi 66, Mackey, Idaed 59, Little Club, Big Club, New Zealand and Hybrid 143) were ranked among the top 12 (top 20%) for concentration of four or more minerals and four of these cultivars (Lemhi, Lemhi 66, Mackey and Idaed 59) were ranked among the top 12 for concentration of six or more minerals (Table 4). Correlations were especially strong (r ≥ 0.90) between the concentrations of Cu and Mn, Cu and Zn, and Zn and Mn.

Discussion

Comparisons of historical and modern data have shown overall decreases in the mineral content of vegetables and horticultural crops over a 50-year period (Davis et al. 2004; White and Broadley 2005), and a trade-off theory was proposed for the interaction between yield and mineral content (Davis et al. 2004). However, differences in cultivars, environments, sampling and laboratory analytical methods could be responsible for the reported reduction in mineral content in any one crop (Davis et al. 2004).

The dilution effect, due to a change in the ratio of bran to endosperm, may play a role in the negative relationship between yield and mineral content in cereal crops. Previous studies showed negative associations between grain yield and grain nitrogen concentration (Cox et al. 1985; Heitholt et al. 1990; Pepe and Heiner 1975), indicating that the dilution of nitrogen compounds in high-yielding grains was a consequence of the breeding process during the twentieth century (Calderini et al. 1995). Genetic gains in grain yield of US hard red winter wheat cultivars may be related to reduced Fe, Zn, and Se concentrations (Garvin et al. 2006). Other studies, however, indicated that micronutrient-rich cultivars can also be higher yielding than less micronutrient-rich cultivars, especially when grown on soils with low micronutrient availability (Graham et al. 2001; Rengel 2001).

While yield and mineral concentration were usually negatively associated, the correlations were weak and exceptions existed for high yielding cultivars with moderately high levels of certain minerals. Of the top 12 yielding cultivars, two were in the top 12 for P concentration, one was in the top 12 for each of Fe, Mg, Mn, and Se, and none was in the top 12 for Ca, Cu or Zn (Table 4). Of the lowest yielding 12 cultivars, however, six were among the top 12 in Ca, Cu, and P concentration, five were in the top 12 for Mg and Zn, three for Fe and Se, and two for Mn. Though most of the highest-ranking cultivars for mineral concentration were relatively low yielding, several high yielding cultivars contained high concentrations of certain minerals, indicating the possibility that simultaneous genetic gains could be made for both yield and mineral concentration (Table 4). For example, the high yielding cultivars Spinkota, Zak and White Marquis had high concentrations of Fe, Mn and Mg, respectively.

Eight cultivars were ranked among the top 12 for concentrations of four or more minerals and four were ranked among the top 12 for six or more minerals (Table 4). This suggests that it is possible to select cultivars with enhanced levels of multiple nutrients. Correlations were especially strong (r ≥ 0.90) for Cu and Mn, Cu and Zn, and Zn and Mn, indicating that selection for cultivars with high concentrations of Cu, Zn, and Mn would be particularly effective.

Fe and Zn are the two most important mineral nutrients contributing to micronutrient deficiency (Welch and Graham 1999). Fe and Zn were the only minerals studied whose contents were not negatively correlated with yield (Table 3), indicating that in the presence of positive selection pressure, Fe and Zn would be the minerals most likely to increase simultaneously with yield. This corresponds with recent findings by Uauy et al. (2006) that show both Fe and Zn content are influenced by the high protein gene Gpc-B1.

Mineral concentration was regressed on the date of variety release for both the soft white and hard red spring wheat market class groups. Over a 120-year time span, hard red cultivars showed a neutral trend for most minerals, with only a slightly negative trend for Zn and a slightly positive trend for Mg (Fig. 1). The soft white wheat group, however, showed either a nutritional decline or, in the cases of Ca and Mg, a neutral trend for mineral concentration over time. This indicates that plant breeders may be responsible for the decline in nutritional concentration in soft white wheat cultivars, where selection often occurs for low ash content. High ash content has deleterious effects on the end-use quality of products made from soft white wheat. Ash consists largely of minerals, and high-ash flour darkens the color of finished products. Products that require white flour therefore need low ash content, whereas other products, such as whole wheat flour, tolerate higher ash contents. For hard red wheats grown in the Pacific Northwest region, where ash content is not strongly selected against, mineral content does not show a negative trend.

Dietary significance of mineral nutrient decline

How do these results translate into current levels of the Recommended Dietary Allowance (RDA) and/or Adequate Intake (AI)? To understand the dietary consequences of the decrease in mineral concentration in modern wheat cultivars, the number of slices of whole wheat bread necessary to achieve either the RDA or AI levels was estimated for five age/gender groups: (1) children aged 4 to 8; (2) males aged 19 to 30; (3) females aged 19 to 30; (4) males aged 31 to 50; and (5) females aged 50 to 70. These age/gender groups are representative, though not exhaustive, samples of the population. A whole wheat loaf of bread was estimated at 1,000 g with 20 slices per loaf; therefore, each slice was estimated to weigh 50 g. Although bread is typically made with hard red wheat, all market classes of wheat were included in this estimate, because breeders will likely use genes for high mineral content from any market class and introgress these genes into locally adapted, high yielding hard red cultivars. In addition, whole wheat bread is simply used as an example that can be extended to other end-use products, including steamed bread, cookies, sponge cakes, and pasta, all of which are usually made from white wheat.
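Under these assumptions the slice counts reduce to a single formula: slices needed = RDA / (grain mineral concentration × 0.05 kg of flour per slice). A sketch of that calculation; the Zn RDA of 8 mg/day for females aged 19 to 30 comes from the DRI reports, while the grain Zn concentrations are hypothetical values chosen to be near the group means implied by Fig. 2:

```python
def slices_to_meet_rda(rda_mg, conc_mg_per_kg, slice_mass_g=50.0):
    """Number of whole wheat slices needed to supply the RDA, assuming all
    of the mineral in the flour carries through to the bread."""
    mg_per_slice = conc_mg_per_kg * slice_mass_g / 1000.0
    return rda_mg / mg_per_slice

# Zn RDA for females aged 19 to 30: 8 mg/day (DRI 2001). The grain Zn
# concentrations (mg/kg) are hypothetical illustrations.
print(round(slices_to_meet_rda(8.0, 15.0), 1))  # 10.7 (historical-type cultivar)
print(round(slices_to_meet_rda(8.0, 10.5), 1))  # 15.2 (modern-type cultivar)
```

The same function applies to any mineral and age/gender group by substituting the appropriate RDA or AI value and grain concentration.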

RDAs are set to meet the needs of 97–98% of individuals in a group. The AI is believed to cover the needs of all individuals in a group, but insufficient data prevent specifying with confidence the percentage of individuals covered by this intake. RDA and AI estimates are shown in Table 2. The estimates of dietary significance herein are based on the assumption that all cultivars have the same level of bioavailability.

Although the increased yield of modern cultivars could potentially increase the mineral content per acre of grain production, the mineral concentration per seed, and therefore per loaf of bread, is reduced. As a result, more bread made from modern wheat cultivars must be consumed to obtain the mineral content supplied by bread made from historical wheat cultivars with high mineral nutrient content. Not all historical cultivars are mineral-rich, however; many have low levels of certain minerals.

The seven modern cultivars were compared with seven historical cultivars chosen because they contained high levels of each mineral. Figures 2 and 3 show that for all eight minerals more bread is required to meet the RDA or AI in modern cultivars than in the nutritionally dense historical cultivars for each age/gender group. For example, females aged 19 to 30 would have to eat 10.6 slices of whole wheat bread made with flour from historical cultivars high in Zn to reach the RDA, but would require 15.2 slices of bread made with flour from modern cultivars to achieve the same RDA (Fig. 2). Males aged 31 to 50 would need 13 slices of bread made from historical cultivars high in Cu and 19.1 slices of bread made from modern cultivars to reach the RDA (Fig. 2).

Fig. 2

Estimated numbers of slices of bread required to meet the Recommended Dietary Allowance (RDA) levels for Zn, Cu, Mg, and P, with flour from modern (denoted ‘Top 7 Modern’) and historical cultivars with high mineral concentrations (denoted ‘Top 7 Historical’). Each slice is equivalent to 50 g whole wheat flour

Fig. 3

Estimated numbers of slices of bread required to meet the Recommended Dietary Allowance (RDA) or Adequate Intake (AI) levels with flour from modern (denoted ‘Top 7 Modern’) and historical cultivars with high mineral concentrations (denoted ‘Top 7 Historical’). RDA was used for Fe and Se; AI was used for Ca and Mn. Each slice is equivalent to 50 g whole wheat flour

It is unreasonable to expect to obtain a significant percentage of the RDA or AI of certain minerals from eating whole wheat bread (Fig. 3). For example, reaching the RDA of Se would require consumption of 55 slices of bread made from historical cultivars high in Se, or 123.5 slices of bread made from modern cultivars, for males and females aged 19 to 30 (Fig. 3). Additionally, reaching the AI of Ca would require consumption of 39 slices of bread made from historical cultivars high in Ca, or 51 slices of bread made from modern cultivars, for children aged 4 to 8 (Fig. 3). Greater potential dietary impact may be realized from increased consumption of bread made with cultivars containing high levels of Zn, Cu, Mg and P (Fig. 2). For example, children aged 4 to 8 would need only seven slices of bread made from cultivars high in Zn and six slices of bread made from cultivars high in Cu to achieve the RDA for these minerals (Fig. 2). These estimates should be regarded with caution, because they assume equal bioavailability among cultivars, which was not tested in this study.

Here we report significant statistical and dietary differences between modern and historical cultivars of wheat and argue for the need to reverse this trend. Increases in the consumption of mineral nutrients can be accomplished partially through plant breeding but will require other approaches as well. These additional approaches emphasize the necessity of improving dietary food diversity, linking agricultural biodiversity to nutrition, and mobilizing indigenous and traditional food systems in a multi-faceted effort to reduce worldwide nutritional deficiencies (Frison et al. 2006).