Introduction

Controversies are common and unavoidable in scientific research, and they represent a major impetus for additional research. However, there are probably few areas as controversial as the health effects of exposure to selenium, a metalloid of toxicological and nutritional interest for many living organisms [1, 2, 3•, 4•, 5]. Exposure to this trace element mainly occurs through diet, particularly through intakes of fish, seafood, and meat [6], and is generally limited to a few tens of micrograms a day. Additional sources of selenium exposure are cigarette smoking [7], traffic-related air pollution [8, 9], coal combustion [10,11,12], and occupational exposures [13], as well as dietary supplements [14,15,16].

The possibility that this trace element can improve or harm human health depending on the dose has been suggested by a large number of studies [17,18,19,20,21]. However, despite hundreds of epidemiologic investigations on this topic, evidence about the amounts of exposure and the specific health outcomes affected by selenium is limited [5, 22•]. At present, most selenium experts would agree on the need to avoid both too low and too high intakes of this element. However, there is a lack of consensus regarding the safe range of exposure and disagreement as to the veracity of some of the purported associations between selenium exposure and health outcomes such as cancer [3•, 5, 23,24,25,26]. The availability of well-conducted environmental epidemiologic studies and experimental studies (such as randomized controlled trials), together with more in-depth insights from laboratory studies, has improved our knowledge of the health effects of environmental selenium, the tools for monitoring its exposure, and the need to regulate human exposure to this element [1, 17, 20, 27,28,29,30,31]. While the first period of investigation concerned the potential for harm of this element, and the second its possible beneficial effects, accumulating evidence from recent studies has again highlighted the potential toxicity of selenium overexposure [5, 26]. These recent studies have suggested that overexposure may occur at much lower levels than previously believed. They have also implicated new diseases associated with excess selenium intake, including diseases once thought to arise only with selenium deficiency. These diseases include diabetes; hypertension; neurodegenerative diseases such as amyotrophic lateral sclerosis, Parkinson’s disease, and Alzheimer’s dementia; and cancer [3•, 22•, 32, 33, 34•, 35, 36]. In this review, we summarize the most recent lines of evidence concerning the human health effects of environmental selenium, highlighting key issues currently at the forefront of this research.

Environmental Studies

A PubMed search on the human health effects of environmental selenium exposure shows that the relevant observational studies can be split into two subgroups. One subgroup comprises investigations carried out in environmental contexts characterized by markedly deficient or excessive selenium exposure. The other comprises studies carried out in non-seleniferous geographic areas, generally Western countries, where investigators have examined the association between environmental selenium and health endpoints. These investigators typically address hypotheses of either beneficial or adverse effects of selenium at the roughly “intermediate” exposure levels that characterize these study populations. For a detailed assessment of these studies, we refer to previous reviews [3•, 5, 26, 37].

Concerning the environmental studies, pioneering studies on naturally occurring selenium overexposure were performed in North and South America and were later followed by studies in China and other parts of the world [26, 38] (Supplemental Fig. S1). The key details of these studies are reported in Table 1. More recently, new studies have been carried out in other seleniferous areas, such as the Brazilian Amazon [50], the Inuit population of Canada [47, 66], and a seleniferous area in Punjab [64], with Chinese investigations continuing to provide relevant information on this issue [67, 68]. Environmental exposure to selenium in its inorganic hexavalent form has also occurred in a Northern Italian community, and this has been the only investigation of chronic disease risk using a longitudinal study design [53, 54•]. Overall, these studies have identified toxicity of selenium to a large number of body organs and systems, such as the liver, the skin, the endocrine system, and the nervous system. Despite the nonexperimental nature of most studies, the lack of replication of results for some endpoints, and other methodological limitations, the overall results provide evidence of toxicity of naturally occurring selenium at high levels of exposure worldwide (Table 1). However, these studies have been of limited use in clarifying the exact amounts of exposure that are harmful, generally owing to inadequate data on biomarkers of selenium intake [38]. An exception to this pattern has been the observation of an inverse association between selenium exposure and triiodothyronine levels in children living in a seleniferous area of Venezuela, starting at around 350 μg/day of selenium intake [48]. This finding is of interest in light of the recent results of a Danish trial, in which selenium supplementation of 100 to 300 μg/day adversely affected thyroid function in a dose-dependent manner, decreasing serum TSH and FT4 concentrations [69]. In that study, mean participant age was 66.1 years, baseline plasma selenium levels were 87.3 μg/L, and the supplemental selenium, administered as selenium-enriched yeast, is expected to have been almost entirely organic. In contrast, no effect of supplementation was reported in a UK trial among the elderly [70] following administration of 100, 200, or 300 μg/day of the element through selenium-enriched yeast, despite similar background selenium status (plasma levels of 91.3 μg/L).

Table 1 Overview of studies on the health effects of environmental selenium (Se)

In addition to selenium overexposure, selenium deficiency has been suggested to play a major role in the etiology of Keshan disease, a rare but severe cardiomyopathy described in Chinese populations [59, 71]. The major pieces of evidence linking the etiology of Keshan disease to selenium deficiency have been its higher frequency in regions low in selenium, and the beneficial effect of selenium in its inorganic tetravalent form on the incidence of Keshan disease in community trials [59, 71,72,73]. However, these community-based trials may have been prone to bias, as they did not include a double-blind design, randomization, or allocation concealment on an individual basis [59, 71, 74]. In addition, some epidemiologic features are not consistent with a causal association with selenium, such as the seasonal occurrence of Keshan disease, which is more compatible with an infectious etiology [75, 76]; the decline of the disease over time independent of major changes in selenium supply; and the lack of eradication of Keshan disease despite increased selenium intake [72, 77, 78]. Furthermore, the well-known antiviral effects of selenium in its inorganic form may explain the decreasing incidence of Keshan disease after selenite administration [79]. Even though an association between selenium deficiency and Keshan disease has not been firmly established, recent studies linking this disease to selenium deficiency have prompted recommendations by WHO/FAO to avoid selenium intakes below 13–19 μg/day in adults (i.e., the thresholds above which Keshan disease has not occurred) [80].

Another selenium-responsive disease is Kashin-Beck disease, a chronic degenerative disorder of the peripheral joints and spine [80]. This osteoarthropathy has been described in some Chinese areas in both children and adults, and is also considered a disease of multifactorial etiology, mainly due to inadequate nutritional status [81]. Selenium deficiency possibly plays a role, considering the ecologic data showing low selenium intake in disease-affected areas and the effectiveness of selenium supplementation in reducing Kashin-Beck disease incidence [80, 82, 83].

In addition to the limitations of exposure assessment in environmental observational studies investigating selenium deficiency and excess, it must be noted that little if any emphasis has been given to the specific chemical forms of selenium involved in such settings. In fact, the selenium found in foods, drinking water, and other environmental matrices such as soil and ambient air may exist in several inorganic and organic chemical forms, comprising different selenium compounds [3•, 84,85,86]. The toxicity of selenium species and compounds may differ markedly, being generally much higher for the inorganic species (such as selenate and selenite) and some organic forms (such as selenomethionine) [3•, 36, 87, 88]. Unfortunately, little is known about selenium speciation in most environmental matrices, and this is also true for human tissues and compartments. In addition, the various chemical forms of selenium may differ in their excretion rates (faster for the inorganic forms), as well as in their metabolism and distribution in body tissues. Unfortunately, selenium speciation in both environmental and biological matrices is analytically complex and resource-consuming, which may explain the paucity of data in this field [3•, 89, 90].

A second key limitation of environmental studies has been the little attention given to neurological diseases and disorders [91], with the exceptions of amyotrophic lateral sclerosis and Parkinson’s disease after low-dose overexposure to inorganic hexavalent selenium [53, 92, 93] and of neurological abnormalities in a high-selenium environment [58]. This contrasts with the growing evidence from both clinical and laboratory studies that selenium exposure, and particularly overexposure, may induce neurotoxic effects [4•, 5, 33, 36, 53, 88, 91, 93,94,95,96,97,98,99,100]. Selenium exposure might also affect cognitive function in both adults and children, though positive, null, and inverse associations have all been reported [101,102,103,104,105,106].

The possible occurrence of adverse health effects at selenium exposures in the “average” or intermediate range typically found in Western countries is also an issue of strong interest [25]. This is especially true considering scientific claims that several Western populations might suffer from selenium deficiency and low selenium status [23, 107, 108], or claims of a beneficial effect of selenium in cancer prevention issued in the early 2000s [24, 109]. Conversely, recent observational studies have suggested that the exposure levels found in countries not definable as “selenium deficient” or “seleniferous” might be associated with adverse effects attributable to selenium overexposure, such as excess risk of childhood leukemia [8] and cardiovascular disease [9], esophageal dysphagia [110], Alzheimer’s dementia [36], hypertension [32, 111, 112], and type 2 diabetes [113•]. Despite the potential weaknesses of these studies due to their nonexperimental design (apart from those on diabetes risk), their results suggest that previous research driven by claims of beneficial health effects of selenium has obscured the detection of adverse effects due to overexposure, and that these adverse effects might occur at much lower exposure levels than previously believed.

Randomized Controlled Trials

Unlike all other toxic elements and most trace elements of nutritional relevance, selenium has been investigated in experimental studies, generally in the form of randomized, double-blind, controlled trials (RCTs) [5, 22•]. The key advantage of this study design is better control of both measured and unmeasured confounding compared with nonexperimental studies. Unfortunately, RCTs have rarely been implemented in areas known or suspected to have low selenium exposure, with the exception of those aimed at preventing the incidence of Keshan disease or Kashin-Beck disease [5]. Most RCTs have been conducted in Western populations (mainly in North America), where increased nutritional availability of selenium, even in the absence of overt nutritional deficiencies, was envisaged to protect against chronic diseases, particularly cancer, and more rarely against other health disturbances such as metabolic abnormalities or thyroid diseases [5]. The key details and locations of the RCTs designed to test the ability of selenium to prevent cancer are shown in Table 2 and Supplemental Fig. S2.

Table 2 List of randomized placebo-controlled trials using selenium supplementation (abbreviations: Se, selenium; HGPIN, high-grade prostatic intraepithelial neoplasia; NMSC, non-melanoma skin cancer; RR, rate ratio; HR, hazard ratio)

Compared with nonexperimental studies, experimental studies are better able to control for confounding and to reduce the potential for exposure misclassification [129]. However, experimental studies may still be hampered by limitations related to the variability and range of selenium exposure, the specific population considered (in some cases affected by a disease), and the selenium species being administered [22•, 26]. In addition, some difficulties arise when attempting to compare the results of experimental studies with those generated by the nonexperimental “environmental” studies, owing to differences in both the specific outcomes investigated and the amounts of exposure.

While experimental studies in regions affected by selenium deficiency sought to assess the beneficial effects of the element on Keshan and Kashin-Beck diseases, studies in Western populations typically sought to assess the risk of cancer, particularly prostate cancer, as reviewed elsewhere [22•, 130]. Other endpoints, such as cardiovascular diseases [131, 132], have also frequently been included. Unfortunately, high-quality trials on cancer and cardiovascular disease have not been conducted in geographic areas characterized by very low selenium intake, or, in the few instances in which low exposure was studied, the methodological quality of the trial was low [22•]. Less frequently, RCTs in Western populations based on selective administration of selenium have been designed to assess additional health outcomes such as acute illness and septic shock [133, 134], dementia [135•], blood cholesterol levels [136], thyroid function [69, 137], immunity [138], and HIV infection [139,140,141]. Almost all RCTs assessing the risk of cancer and, occasionally, cardiovascular disease have been carried out in the USA [22•, 130], despite the observation that US selenium levels tend to be much higher than those in other Western countries, particularly European ones. In fact, NHANES data showed median serum selenium levels in the US population on the order of 134 μg/L and 193 μg/L in the 2003–2004 and 2011–2012 surveys, respectively [142, 143], while average serum/plasma selenium levels in European populations were generally lower than 100 μg/L, in the 50–120 μg/L range [18, 127, 144]. The greater interest in performing randomized trials with selenium in the USA, despite the higher average selenium exposure, has likely been due to the enthusiasm generated by the promising interim results of the NPC trial [118, 145]. That trial has likely influenced the marked increase in selenium and multivitamin supplementation in the USA [146], despite the lack of clear evidence of a beneficial effect [147, 148].

Overall, an evaluation of the RCT results shows that in almost all of them there was no beneficial effect of selenium supplementation on cancer or cardiovascular disease, particularly when looking at the high-quality studies [22•]. One additional small RCT [108] was published after a recent Cochrane review on the relation between selenium supplementation and subsequent cancer incidence [22•], but its results did not change the previously published summary rate ratio (RR). In Fig. 1, summary RRs for cancer mortality and incidence are reported, including the newly published trial [108], based on all RCTs and on RCTs at low risk of bias.

Fig. 1

Overall rate ratios (RR) with their 95% confidence intervals (CI) of any cancer mortality (a) and any cancer incidence (b) in randomized controlled trials encompassing selective selenium administration, overall and among studies with low risk of bias (RoB)
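
For readers interested in how such summary estimates are obtained, the following is a minimal sketch of fixed-effect inverse-variance pooling of rate ratios on the log scale, the standard approach behind figures of this kind. The study-level RRs and confidence intervals below are hypothetical placeholders for illustration only, not the actual trial data summarized in Fig. 1.

```python
import math

# Hypothetical study-level results (RR with 95% CI); placeholders only,
# NOT the actual trial data pooled in Fig. 1.
studies = [
    {"rr": 0.95, "ci": (0.80, 1.13)},
    {"rr": 1.10, "ci": (0.92, 1.31)},
    {"rr": 1.02, "ci": (0.85, 1.22)},
]

# Fixed-effect inverse-variance pooling on the log scale: each study is
# weighted by 1/SE^2, with SE recovered from its 95% CI as
# (ln(upper) - ln(lower)) / (2 * 1.96).
weights, weighted_logs = [], []
for s in studies:
    log_rr = math.log(s["rr"])
    se = (math.log(s["ci"][1]) - math.log(s["ci"][0])) / (2 * 1.96)
    w = 1.0 / se**2
    weights.append(w)
    weighted_logs.append(w * log_rr)

pooled_log = sum(weighted_logs) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

rr = math.exp(pooled_log)
ci_low = math.exp(pooled_log - 1.96 * pooled_se)
ci_high = math.exp(pooled_log + 1.96 * pooled_se)
print(f"Pooled RR = {rr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```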

No clear dose-response association between selenium intake and cancer risk has emerged from these trials [22•]. This has led not only to a dismissal of claims about any cancer-preventive effect of selenium, but also to concern following the detection of unexpected and, in some cases, serious adverse health effects among selenium-supplemented individuals. These adverse effects ranged from dermatological side effects to diabetes and cancer, namely high-grade prostate cancer [34•, 130, 149]. They occurred at exposures much lower than expected, suggesting the inadequacy of the selenium standards and upper limits established to date [22•, 26]. The excess risk of high-grade prostate cancer in selenium-supplemented US individuals with the highest background selenium exposure is of particular concern, given growing evidence that some selenium species and selenoproteins have been associated with increased cancer risk in laboratory models [150,151,152,153,154,155, 156•, 157, 158] and in some recent cohort studies [159, 160]. The excess risk of diabetes is also of considerable interest, having been consistently associated with selenium in both experimental and nonexperimental studies, and at low levels of exposure. In SELECT, the most informative RCT on selenium and cancer, the excess diabetes incidence in the selenium arm influenced the trial’s early termination [125, 161].

The adverse health effects of selenium observed in RCTs have raised safety issues for the remaining ongoing RCTs and, more generally, for the safety of selenium exposure and supplementation in humans, leading to warnings about the side effects associated with supplementation in the absence of selenium deficiency [35, 53, 162,163,164]. It should be noted that the selenium levels associated with adverse effects in these RCTs are in a range of exposure relevant to the general population of several Western countries [22•].

Biomarkers of Exposure

The search for biomarkers of selenium exposure has long attracted investigators seeking indicators of short-term and, particularly, long-term intake, as well as indicators covering a range of exposures that permit the evaluation of both harmful and beneficial effects of this element [18, 161, 165, 166].

The most commonly used biomarker of selenium exposure has been the serum or plasma selenium level, and far less frequently the whole blood or erythrocyte selenium content [18, 22•, 161]. Whole blood selenium levels, however, can be more difficult to interpret, as they comprise both cellular and non-cellular constituents with specific and non-specific components, and few studies of whole blood selenium have investigated inter-individual heterogeneity [166]. In addition, cellular (erythrocyte-bound) selenium levels appear to be less responsive to changes in dietary intake of the element than plasma/serum selenium, making it more difficult to compare whole blood selenium levels across individuals [18].

Plasma/serum selenium tends to reflect exposure over the previous few days to weeks, and it also has the advantage of allowing speciation analysis, an approach that is becoming much more common and relevant. As previously mentioned, this follows the growing awareness of the distinct biological properties of the various selenium species [36, 79, 90, 100, 167]. A recent study indicates that total selenium content in serum correlates with the levels of only three selenium species: serum albumin-bound selenium, selenocysteine, and glutathione-peroxidase-bound selenium; conversely, for the other chemical forms of the element, no such correlation exists [90]. Therefore, the most commonly used biomarker of selenium exposure in epidemiologic studies, total plasma/serum selenium content, may be inadequate to assess the circulating levels of some species of the metalloid. In addition, serum selenium species vary according to diet composition [6], owing either to differences in the selenium chemical forms present in different foodstuffs or to metabolic reasons [168, 169].

Several other indicators have been proposed and adopted in both nonexperimental and experimental studies, including in particular nail and hair selenium levels [18, 38]. These biomarkers have the substantial advantage of reflecting longer-term exposure than plasma/serum selenium levels, and they are considerably more suitable for epidemiologic research and clinical screening, being less invasive and better tolerated by study participants. However, the ability of hair and nail measures to reflect actual exposure to selenium has been challenged, on the basis of the low correlation with both blood selenium levels and dietary selenium intake seen in some studies, despite their indicating substantially stable selenium exposure over time [22•, 56, 170]. This might also be due to a tendency of some tissues to preferentially accumulate certain selenium species, generally the organic ones, relative to other chemical forms and compartments, depending also on exposure to other factors such as methionine and heavy metals [3•, 161, 171,172,173,174]. However, even if there is some evidence for differential storage of selenium species in the nails and other body tissues and compartments (such as hair and urine), still limited data exist on these relevant issues [90, 161, 170, 172, 175]. Nails and hair also appear to be unsuitable for speciation analysis because of difficulties in the extraction procedures, and also because some selenium species (such as the inorganic ones) may be less likely to be incorporated into these tissues than other selenium forms [170].

Urinary selenium levels have also been proposed as a suitable marker of selenium exposure, but their reliability as a biomarker has not been well studied [165]. In addition, urinary selenium levels appear to be an indicator of recent intake of the metalloid rather than of long-term exposure [165, 176]. Overall, these findings confirm that misclassification of exposure is a major issue in human selenium research, regardless of the biomarker adopted to assess selenium status [161, 177], and this is particularly true when speciation analysis is not included in the assessment. This greatly hampers exposure assessment in the living organism and represents a source of bias in epidemiologic studies.

More recently, a growing number of studies have used an additional, highly specific biomarker of selenium exposure, the cerebrospinal fluid (CSF) selenium level, though this indicator is clearly unsuitable for population-based studies [36, 98, 100, 167]. This indicator, in fact, is unique in allowing in vivo biomonitoring of selenium levels in the central nervous system, which may have relevance given the potential involvement of selenium in neurological disease [4•, 36, 97]. In addition, it allows the implementation of speciation analysis [36•, 89, 96,97,98, 100, 167, 178]. However, blood and CSF levels of some selenium species are uncorrelated; therefore, relying on peripheral indicators of selenium exposure, such as blood levels, is not ideal for assessing the corresponding exposure in the central nervous system compartment [89, 100, 167, 179].

Proteomic analysis based on measuring the induction of selenoprotein synthesis is another widely used approach to assess selenium exposure [3•, 18, 144, 165, 168]. Selenoproteins are proteins that contain at least one residue of the amino acid selenocysteine; they generally serve oxidoreductase functions, though their exact physiopathological roles are still partially obscure and debated [27, 180,181,182,183]. The maximal expression of selenoproteins, such as the plasma levels of the selenium-dependent glutathione peroxidase GPX1 and of selenoprotein P, has generally been considered an indicator of adequate selenium intake through diet and other sources [3•, 18]. This rests on the hypothesis that lower levels of selenoproteins derive from an insufficient bioavailability of selenium associated with its inadequate intake. Accordingly, most agencies have based their selenium dietary reference values on the intake needed to upregulate selenoprotein expression [3•, 26], with values ranging from 55 to 70 μg/day (Fig. 2) [25, 26]. However, this approach to assessing selenium dietary requirements has been challenged [3•, 26], since it has been suggested that the selenium-induced maximization of antioxidant enzyme synthesis, including but not limited to selenoproteins, may derive from the pro-oxidant properties of selenium species [99, 188,189,190,191,192], as long recognized [193]. Accordingly, even in the absence of any change in selenium supply, the induction of oxidative stress by various environmental stressors may increase selenoprotein synthesis. These observations suggest that basal selenoprotein levels are not a direct sign of inadequate availability of selenium, since their levels are inducible as part of the physiological response to stress. Therefore, “low” levels of these selenoproteins should not be confused with selenium deficiency per se [194], being potentially attributable also to the pro-oxidant properties of the element [3•]. The phylogenetic analysis of selenium utilization in mammals and lower animals also raises questions regarding the need to maximize selenoprotein expression [195]. In addition, environmental studies have shown that changes in selenium exposure are unrelated to changes in selenium-containing glutathione peroxidase levels [43, 46]. Finally, according to the available epidemiologic evidence, little if any demonstration of adverse health effects is attributable to inadequate selenoprotein synthesis [26, 80]. Therefore, the approach taken by WHO/FAO in assessing the dietary reference values for selenium, i.e., 26 μg/day for females and 34 μg/day for males, may be reasonable, since it is not aimed at maximizing selenoprotein expression.

Fig. 2

Selenium standards issued by different countries and authorities worldwide, including the acceptable daily intakes/recommended dietary allowances and upper limits, and the lowest-observed-effects levels from environmental and experimental human studies to which uncertainty factors of 3 and 10 are applied. Abbreviations: D-A-CH, German, Austrian and Swiss Nutrition Societies; EFSA, European Food Safety Authority; JAPAN, Japanese National Institute of Health and Nutrition; NORDIC, Nordic Nutrition Recommendations; SINU, Italian Nutrition Society; IOM, Institute of Medicine; WHO, World Health Organization. Adapted from references [25, 34•, 48, 108, 114, 115, 125, 126, 184,185,186,187]

Overall, proteomic indicators such as selenoprotein expression may be inappropriate for assessing the adequacy of selenium exposure [26, 80]. This is also true for the use of selenoproteins to assess selenium overexposure, since the highest levels of these proteins may already reflect overexposure to selenium yet cannot reflect any further increase in exposure. These proteins reach a plateau in serum or plasma at selenium intakes of around 70 μg/day, depending on the specific selenoproteins (and selenium species) involved. Selenoprotein expression, therefore, appears to be an inadequate tool for assessing and monitoring selenium exposure, in cases of both deficient and excess exposure, and for determining to which chemical forms of this element a person has been exposed. It should also be noted that proteins other than selenoproteins have been shown to be affected (i.e., upregulated or downregulated) by selenium exposure [196,197,198,199,200]. However, the physiopathological mechanisms underlying this relation and the suitability of these proteins for monitoring exposure to selenium compounds, including its specificity, have not been elucidated.
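
To illustrate why a marker that saturates cannot flag overexposure, the following sketch models selenoprotein expression with a simple saturating curve. The functional form and the parameter are hypothetical choices made only so that expression plateaus near the ~70 μg/day intake mentioned above; this is not a fitted model of any real selenoprotein.

```python
import math

def relative_expression(intake_ug_day, k=25.0):
    """Illustrative saturating dose-response curve for selenoprotein
    expression; k is a hypothetical scale parameter chosen so that
    expression is near-maximal around 70 ug/day, as described in the text."""
    return 1.0 - math.exp(-intake_ug_day / k)

for intake in (20, 55, 70, 150, 300):
    print(f"{intake:>3} ug/day -> relative expression {relative_expression(intake):.3f}")
# Below the plateau the marker tracks intake (0.551 at 20 ug/day,
# 0.889 at 55 ug/day), but 70, 150, and 300 ug/day all yield ~0.94-1.00:
# deficient intakes are distinguishable, while excess intakes are not.
```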

To overcome some of the inherent limitations of biomarkers in assessing selenium intake, the assessment of the selenium content of the usual diet (e.g., via a semi-quantitative food frequency questionnaire) and of other relevant sources of exposure, such as ambient air, has been proposed. However, the validity and reliability of dietary assessment methods have also long been debated and challenged, with most studies supporting the validity of this approach but others questioning it [18, 22•, 161, 165]. The main advantage of assessing dietary content of the metalloid is the possibility of assessing the intake of selenium species independently from their subsequent metabolism and excretion in the body, which are known to be influenced by individual characteristics and other factors. Conversely, this approach is limited by the variability of selenium in foodstuffs over space and time [18, 22•, 144, 169], by the difficulties in assessing dietary habits, and by the limited knowledge of selenium species and their bioavailability in foodstuffs [6, 168, 201]. The assessment of selenium exposure due to ambient air pollution would also be an attractive approach, but evidence for its feasibility and reliability is still very limited, though tobacco smoking and outdoor air pollution due to motorized traffic or coal combustion appear to be sources of selenium exposure, the latter being potentially linked to adverse health effects [7,8,9].

Risk Assessment of Selenium: Facts, Uncertainties, and Challenges

The aforementioned uncertainties about the health effects of selenium and about suitable biomarkers of its exposure explain why the standards for selenium exposure, with reference to both adequate daily intakes and upper limits of intake, are inconsistent across countries and agencies. A summary of recommendations from various authorities is given in Figs. 2 and 3. Figure 2 reports a comparative analysis of the environmental standards and the nutritional recommendations, both in terms of recommended dietary intakes and of the lowest observed effect levels (LOELs). In addition, it shows the levels at which human studies, both nonexperimental (environmental) and experimental, have found adverse effects, and applies to them an uncertainty factor of either 3 or 10 to derive safe upper limits of selenium intake.
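
As a concrete illustration of this derivation, the short sketch below applies uncertainty factors of 3 and 10 to a LOEL. The ~350 μg/day value is the thyroid-effect level from the Venezuelan study cited earlier [48]; the resulting figures are purely illustrative candidates, not recommendations.

```python
# Worked illustration: deriving candidate upper intake limits by applying
# an uncertainty factor (UF) to a lowest observed effect level (LOEL).
# The 350 ug/day LOEL is the thyroid-effect level reported in the
# Venezuelan study cited in the text [48]; outputs are illustrative only.
loel_ug_day = 350

for uf in (3, 10):
    upper_limit = loel_ug_day / uf
    print(f"UF = {uf:>2}: candidate upper limit ~ {upper_limit:.0f} ug/day")
# UF = 3 gives ~117 ug/day and UF = 10 gives 35 ug/day, both lower than
# the upper limits currently issued by most authorities shown in Fig. 2,
# consistent with the inconsistencies discussed below.
```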

Fig. 3

Selenium drinking water standards for human consumption issued worldwide [26, 57]

The overall picture emerging from the comparison of these figures is the variability of current standards for recommended dietary intakes, with consequent implications for assessing the safety of selenium exposure in a substantial part of the world population, with respect to both deficiency and excess of this element. In addition, the comparison between the safe upper limits of selenium exposure suggested by the most recent epidemiologic studies (applying an uncertainty factor to the LOELs) and the upper limits identified in even the most recent assessments shows their inconsistency and calls for their reassessment. This further highlights the potential pitfalls of using a proteomic approach based on selenium-driven selenoprotein upregulation when assessing selenium adequacy.

In addition to the conflicting results and uncertainties arising from the aforementioned patterns, Fig. 3 shows the different drinking water standards adopted worldwide [26, 57]. Variations in the standards for water for human consumption are large, with a factor of 50 between the lowest (1 μg/L, applied in Russia) and the highest (50 μg/L, issued by the US EPA). The European Union and the French ANSES standards (10 μg/L) are at the lower end of the distribution. However, most of these standards have been based on a clearly inadequate assessment of the human data, and there is concern about the health effects of selenium exposure at around 10 μg/L and above [37, 57, 202, 203]. Though unusually high levels of selenium in groundwater and drinking water may occur throughout the world and are being increasingly detected [57, 68, 203,204,205,206,207,208,209], the number of individuals exposed to high selenium levels through drinking water is unknown, even in the USA and European countries. This is mainly due to the still limited information about the distribution of selenium levels in groundwater and tap water, particularly since this distribution may be uneven across different wells and locations even within small areas [57, 93]. Such situations deserve further investigation, both to gain insight into the disease risks posed by this element through drinking water and to protect individuals at risk of selenium overexposure through this source.

More generally, as far as selenium exposure limits and recommendations for both diet and drinking water are concerned, the available evidence suggests that more conservative standards should be considered [26, 57]. Finally, we believe that an in-depth assessment of the underlying scientific evidence is required, one that also takes into account the different selenium species and their potential effects on human health.

Conclusions

Based on epidemiologic studies, and particularly on the high-quality human data recently generated by the trials, we recommend a comprehensive and updated assessment of the safety of both deficient and excess exposure to selenium species, supported by an in-depth review of the biochemical and toxicological literature. Such an assessment should be done in light of the recent literature emphasizing the toxic and pro-oxidant properties of the various chemical forms of selenium [31, 190, 191], which raises questions about using selenoprotein upregulation to assess the adequacy of selenium intake [3•, 26]. Particular attention should be given to the recent epidemiologic evidence indicating adverse effects of low-dose selenium overexposure [26, 34•, 35, 54•]. A comprehensive assessment of the health effects of deficient and excess selenium exposure should also focus on neurological disease, in addition to other diseases, taking into account the most recent epidemiologic and laboratory studies and the potential involvement of genetic factors [4•, 33, 36, 53, 54•, 58, 91, 97, 98, 101, 167•, 210,211,212,213,214].

Overall, such a health risk assessment may lead to an advancement of our knowledge of the human health effects of selenium, to a more adequate risk assessment of selenium exposure, and to an improvement and harmonization of the conflicting standards and reference values recommended worldwide. This may allow scientists and public health professionals to identify even subtle conditions of deficient and excess exposure, thereby ensuring the safety of human exposure to selenium compounds globally.