Introduction

Today there is no doubt that ionising radiation at absorbed doses of around 0.2 Sv or more, given on a single occasion, is carcinogenic. There is, however, an ongoing debate about the effects of lower doses on human health, especially if they are also protracted. Some investigators propose that hormesis, or beneficial effects, exists at low doses whereas others cite evidence for harmful effects. Small protracted doses are commonly received by those who are occupationally exposed to ionising radiation, including the staff at nuclear medicine departments. For the nuclear medicine patient, the absorbed dose is usually of the order of 0.01 Gy or less and is delivered over a period of some hours up to a few days [1, 2]. Radiological protection regulations and practice are based on the recommendations of the International Commission on Radiological Protection (ICRP) [3], which assumes that at low doses there is a linear relationship between dose and risk, all the way down to zero, the so-called linear no-threshold (LNT) hypothesis. Risk in this case means more specifically the probability of cancer induction. The LNT hypothesis has become the dogma from which a radiation protection policy has been developed [4, 5, 6].

A competing alternative to the LNT model is offered by threshold models [7, 8, 9], whereby it is assumed that no increase in risk occurs at doses below a certain level. A variant of such a model is the hormesis model, according to which low exposures to radiation are beneficial to health, even if larger doses may be harmful. The hormesis model is valid for some chemical substances and trace elements which the body needs in small amounts to stay healthy, but which may become deleterious when received in larger doses. Proponents of radiation hormesis models base their argument on evidence from two different areas [10, 11, 12]. The first is epidemiology, where a lack of proof for the LNT model is combined with a number of studies indicating beneficial effects of radiation. The second major argument for the hormesis model is that cells irradiated in vitro with low absorbed doses, a few tens of mGy, show less damage as a result of a subsequent exposure within hours than do unirradiated cells [13, 14]. This is the "adaptive response to ionising radiation", a phenomenon that is sometimes considered to indicate hormesis, but which is in fact neither synonymous with hormesis nor to be confused with it.

Numerous studies have shown adaptive responses of specific biological mechanisms to low doses of ionising radiation. Adaptive responses have been observed with different types of endpoint, such as cell survival or chromosomal aberrations. It has also been demonstrated that a low dose of gamma rays may increase resistance to other DNA-damaging agents [15], and that it can reduce neoplastic transformation in vitro to a level below the spontaneous rate [16, 17].

The hormesis theory has long been banished to the back-yard of science, and the unexpected results reported have been either ignored or regarded as artefacts of the experiment. During the last 5–10 years, however, the theory has experienced a renaissance, as reflected in the scientific debate. In 1995 the question of hormesis was described as the "issue of the decade" by an editorial in this journal [18]. Another indication of the renewal of the debate was the inclusion of a chapter devoted to the adaptive response to radiation in the 1994 UNSCEAR report [13]. However, evidence for an adaptive response in terms of human health is still controversial, and even if the debate has been brought out from the shadows, hormesis has never been embraced by the radiation protection community (see the analysis by Mossman [19] for further discussion of the reasons for this).

The aim of this paper is to critically review the arguments expressed in favour of radiation hormesis against the background of the prevailing risk philosophy on which the current regulatory systems are based. To provide a full picture, the implications of various recent radiobiological findings for low-dose radiation effects are considered.

The linear no-threshold hypothesis: its foundations, use and weaknesses

The LNT model is a paradigm that has been found useful and practical for establishing dose limits and regulations for radiation protection purposes. For a presentation of the origin and development of this paradigm, see the review by Kathren [5]. Together with the assumption that the risk for radiation-induced cancer is proportional to the natural cancer incidence, the LNT hypothesis constitutes the basis from which the ICRP has derived its present estimate of the risk for fatal cancer or hereditary disease, namely 0.05 Sv−1 [3]. The uncertainty in the derived risk figure is substantial, and the risk depends strongly on the age of the exposed individual and on a number of other, less well known factors. With this risk coefficient, small doses can be translated into hard numbers relating, for example, a specific radiodiagnostic procedure to a predicted number of cancer deaths in a population.
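As a minimal illustration of this arithmetic, the short Python sketch below applies the nominal risk coefficient quoted above to a hypothetical collective exposure; the population size, the per-examination dose and the function name are invented for the example and are not taken from any cited study.

# Illustrative sketch of the collective-dose arithmetic implied by strict
# application of the LNT model. The risk coefficient is the ICRP nominal
# value quoted above (0.05 per Sv); the population size and per-examination
# dose are hypothetical example values.

RISK_PER_SV = 0.05  # fatal cancer or hereditary disease, per Sv (ICRP [3])

def expected_excess_cases(n_people, dose_per_person_sv, risk_per_sv=RISK_PER_SV):
    """Expected number of excess cases under LNT: N x D x r."""
    collective_dose_person_sv = n_people * dose_per_person_sv
    return collective_dose_person_sv * risk_per_sv

# Example: one million examinations at an effective dose of 5 mSv each
print(expected_excess_cases(1_000_000, 0.005))  # -> about 250 predicted fatal cases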

Epidemiological studies on human populations play an outstanding role in the assessment of the risk from radiation [20], and current risk estimates and hence also the radiation protection regulation systems are based on such quantitative studies. Of special importance is the population of A-bomb survivors, i.e. the population exposed in Hiroshima and Nagasaki [20, 21, 22, 23, 24, 25, 26]. This cohort, which has now been followed for half a century, has been subjected to exhaustive and detailed statistical studies in order to obtain a quantitative estimate of the risk from exposure, and at present provides the main source of information on radiation risks for humans. It derives much of its strength from the fact that it covers a wide dose interval and contains a very large number of individuals exposed at low doses. However, at such dose levels the risk for cancer induction is very small compared with the natural cancer incidence, and a very large cohort is thus needed for a significant estimate of the risk. The additional number of cancers due to radiation may be smaller than that due to other risk factors which may correlate with the exposure. For a review on the epidemiological evaluation and statistical power required, see, for example, the UNSCEAR 2000 report [27]. The A-bomb survivor cohort amounts to around 90,000 individuals with an absorbed dose varying from zero to a few Sv. Applying the "official ICRP risk" for fatal cancer of 4% Sv−1, an absorbed dose of 0.2 Sv to 10,000 individuals will result in 80 extra cases [3]; however, since the expected normal rate in a group of this size is about 2,000, very high accuracy is required in the analysis, and especially in the choice of control group, in order to demonstrate a statistically significant increase [27].
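The statistical power problem described above can be made concrete with a rough Poisson approximation. The Python sketch below uses only the figures quoted in this paragraph and assumes simple counting statistics; it is not a substitute for the formal analyses reviewed in [27].

import math

# Rough significance estimate for the example above: 10,000 people exposed to
# 0.2 Sv, a nominal fatal-cancer risk of 4% per Sv, and an expected baseline
# of about 2,000 cancer deaths in a comparable unexposed group.
baseline_deaths = 2000
excess_deaths = 10_000 * 0.2 * 0.04      # LNT prediction: 80 extra deaths

poisson_sd = math.sqrt(baseline_deaths)  # ~45 deaths of purely random variation
print(excess_deaths / poisson_sd)        # ~1.8, below a conventional 2-sigma criterion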

The A-bomb survivor study has been supplemented by observations of medically exposed groups of varying size [28, 29, 30, 31, 32, 33, 34]. These populations have been exposed to comparatively high doses; in medically exposed individuals a specific site is often irradiated, resulting in a rather high absorbed dose to that site [24]. Results from these studies usually conform with those from studies on the A-bomb survivors. In addition, people have been exposed to ionising radiation as a consequence of the accidents at Chernobyl and Mayak in the former Soviet Union [35]; studies of these populations are not considered further in the present review.

No effort has been spared in using the A-bomb survivor data to assess dose-response models at low doses and thereby to resolve the uncertainty about the quantitative effect of ionising radiation doses below 200 mSv. Pierce and Preston have recently used unconventional methods to re-evaluate the cohort of A-bomb survivors [24] with the aim of estimating cancer risks from exposures below 500 or even below 200 mSv. About 75% of the survivors received doses of less than 200 mSv. This study is important since it may be considered the first successful attempt to obtain a more accurate estimate of the risk from exposure to low doses. Even if the resulting dose-effect curve shows some minor variation from linearity, it still supports the LNT model. An upper confidence limit on any possible threshold is about 100 mSv or somewhat lower, depending on how the neutron dose is calculated, and the smallest dose showing a statistically significant risk is just above this level. Another important study designed to establish the shape of the dose-response curve was carried out by Chomentowski et al. [26], who performed a model-free visualisation of the data. Their analysis showed linearity for solid cancers, but for leukaemia it revealed an upward curvature and almost a threshold, which is remarkable since the response at larger doses is higher for leukaemia. Similar observations have been reported by other investigators [21, 36, 37]; however, they interpreted them as being more or less due to an artefact, and none drew the conclusion that there is evidence for a threshold for leukaemia induction. In addition to human epidemiological studies, a study of mice irradiated in the laboratory has been used to investigate carcinogenesis at low doses [38], but it added little knowledge since the cancer frequency was indistinguishable from the background frequency below a certain dose level.

Even if present epidemiological studies fail to verify the LNT hypothesis, the ICRP has judged that a linear-quadratic model is the most plausible and reasonable approximation of the dose–risk relation at low doses. This presumes a linear relation at low doses, while at higher absorbed doses (>1 Sv) and higher dose rates the quadratic term becomes more significant. This has been accounted for by the ICRP by assigning a so-called dose and dose rate effectiveness factor (DDREF). The ICRP assumes that the risk per unit absorbed dose at high doses is a factor of 2 larger than at small dose values, that is, DDREF equals 2 [3]. It is important to bear this in mind when discussing observed differences in radiation sensitivity between data representing smaller or larger exposures.
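The relation between a linear-quadratic dose response and a DDREF of 2 can be written out explicitly. In the Python sketch below the coefficients alpha and beta are purely illustrative placeholders (they are not values given by the ICRP or in the text); the point is only to show how the risk per unit dose inferred from a large acute dose exceeds the low-dose slope.

# Linear-quadratic sketch behind the DDREF; alpha and beta are illustrative only.
def lq_excess_risk(dose_sv, alpha=0.05, beta=0.05):
    """Excess risk under a linear-quadratic model: alpha*D + beta*D**2."""
    return alpha * dose_sv + beta * dose_sv ** 2

def risk_per_sv_inferred_from(dose_sv, alpha=0.05, beta=0.05):
    """Risk per unit dose that would be inferred from a single acute dose D."""
    return lq_excess_risk(dose_sv, alpha, beta) / dose_sv

# With alpha == beta, the slope inferred from an acute 1-Sv exposure is twice
# the true low-dose slope, i.e. a DDREF of 2 as assumed by the ICRP [3].
print(risk_per_sv_inferred_from(1.0) / risk_per_sv_inferred_from(1e-6))  # ~2.0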

The major advantage of assuming a linear relationship without a threshold is that proportionality between the risk and the exposure from different sources is achieved, i.e. the risk from each source or practice can be assessed without the need to take other exposures into account. On the other hand, if the hormesis theory could be justified, it might offer an attractive solution to many specific radiation protection problems, as seen from the individual worker's perspective.

There are several reasons for criticising the LNT model. One is that it claims that "no dose, however small, is safe"; consequently it also indirectly implies that mammals have no defence against effects that injure DNA, or that any such defence mechanism is not active at very small doses. As is known from radiobiology, this is not true. Furthermore, for very small doses to a large population, e.g. doses of the order of 1% of the natural background level, which for an individual would be considered trivial, strict application of the LNT model could have economic consequences that some people might judge unreasonable [39]. In view of this, and of the high uncertainties in the estimated risk for small protracted doses, the ICRP has recently started to review its policy [39, 40, 41], and new recommendations are to be expected in a few years. These recommendations will not alter the basic model, merely its application in some situations. Another reason for criticism is that today more and more knowledge is being gained on the biological mechanisms underlying the carcinogenic process. One motive for assuming a more complicated dose-effect relation at low doses is that no single factor alone causes cancer. The carcinogenic process is understood to consist of multiple steps, each one of which can include multiple mechanisms [42, 43, 44]. There is a high degree of complexity in the mechanism of cancer induction [45], and factors other than the initiating event also affect the cancer risk.

In radiation protection it has long been a dogma that the carcinogenic effects of ionising radiation are induced by direct or indirect damage to the DNA in the cell nucleus. However, it has been suggested that the principal effect of radiation lies in modifying the biological defence mechanisms rather than in providing initiating events [46]. The motivation for this view is that a very large number of DNA-damaging events occur spontaneously every day in each cell of our body; the resultant damage is mostly repaired but, as suggested by Pollycove and Feinendegen [47], approximately one alteration per cell per day persists. It is estimated that the number of radiation-induced mutations from the normal background is 10⁷ times lower than the spontaneous rate due to the metabolism [47]. Together with recent radiobiological findings indicating that not only nuclear DNA but also other constituents of the cell may have a role to play in this context, this implies that the LNT model might represent an unrealistically simple relationship. On the other hand, a simple dose-response relationship may disguise competing processes that have different dose dependencies, and the LNT model is, after all, based on epidemiology rather than radiobiology. The scientific validity of the LNT model has been challenged from the radiobiological point of view by many scientists over the past decade [6, 17, 48, 49, 50]. Some scientists claim that there are now sufficient data to indicate that the LNT theory is overly restrictive and not correct for all cancers, or at least for some types of cancer, in the low-dose region, especially for protracted exposures; its appropriateness as a basis for radiation protection in general has also been questioned [7, 51].

The hormesis model

Early studies performed about 100 years ago into the response of different plants to radiation often showed that radiation, usually X-rays, had a stimulating effect on plant growth [52, 53]. The reader is referred to Calabrese and Baldwin [53] for a comprehensive review of the large number of experiments on the effects of irradiation of different biological materials (mainly plants, but also fungi, algae, protozoans, insects and larger animals including humans) that were performed in the early 1900s.

There is currently a lack of consensus in the scientific community on the definition of hormesis. However, in a general sense it is usually understood as the induction of beneficial effects by low doses of an otherwise harmful physical (e.g. ionising radiation) or chemical agent. The term "beneficial" may be used with reference to various effects, and even increased resistance to a subsequent high exposure alone may be designated a hormetic effect. Of special interest in this context is the theory which predicts that small doses of ionising radiation lead to a reduction in the natural incidence of cancer in a population. To obtain the full picture, however, the beneficial effects may be considered to include not only reduced cancer incidence but also (a) lower mortality due to non-cancerous diseases and (b) stimulation of growth and fertility. In an attempt to introduce a scientifically based definition, it has recently been suggested that the term "hormesis" should be applicable to those adaptive responses that are characterised by biphasic dose-response relationships, without reference to any associated beneficial or harmful effects [54].

In parallel with the epidemiological observations interpreted as beneficial effects of irradiation, support for the hormesis theory may be based on ecological and evolutionary considerations [55]. Ionising radiation has always been a part of man's natural environment. Exposure to such radiation, and to different natural chemical agents present in our environment, results in the continuous production of a large number of free radicals in our bodies. In addition, free radicals are produced in the cells as a consequence of the metabolism. The presence in cells of anti-oxidants and other intermediate reactions [47, 56, 57] reduces the number of radicals, and thus the harm to the DNA is considerably less than it would otherwise be. These defence mechanisms are an evolutionary adaptation of the organism to its habitat, and as a consequence the body's response to damaging agents in our natural environment should be optimal at normal background levels [55, 58]. A central tenet in the hormesis theory is that increasing amounts of such agents above normal background levels will stimulate the defence mechanisms by increasing the production of free radical scavengers and DNA-repairing enzymes. For doses within certain limits the net result will be less damage to the cell and consequently to the complete organism. Another expression of the hormesis effect is that otherwise deleterious agents, including radiation, may in small amounts stimulate the immune system; overall, this will increase the likelihood of a longer and healthier life.

Many chemical substances that are essential for the body or have a stimulating effect in trace amounts become toxic at larger concentrations, and may have detrimental or even lethal effects [59]. This fact is sometimes presented as an argument for radiation hormesis as well, but it should be observed that the way in which such substances interact with the organism is usually much more complex. The picture in respect of radiation hormesis may also be more complex than first appears. For instance, in an experiment on the incorporation of an alpha emitter in mice, it was observed that a significant number of bone tumours were induced while the lifetime of the surviving mice increased [60]. Thus hormesis may be demonstrated in one tissue simultaneously with an increased cancer risk in another.

Epidemiological studies suggesting a hormesis model

Numerous published epidemiological studies are not consistent with the LNT hypothesis. A limited number of these advocate a hormesis model; they encompass different types of cohort and their conclusions rest on differing statistical power. A large number of reviews found in the literature present such studies [10, 11, 12, 46, 47, 61, 62]. The cohorts referred to may have been subjected to elevated background levels, or may comprise workers in the nuclear industry or in medicine who have been exposed to radiation [12]. Furthermore, there are reports claiming that data from atomic bomb survivors indicate hormesis at low doses [63, 64]. However, such reports are rare, and there are opposing studies such as that by Cologne and Preston [65] on the longevity of atomic bomb survivors, who reported shortening of life span as a result of irradiation. In general, the epidemiological data indicating hormetic effects are weak and inconsistent and are subject to large statistical uncertainties [19]. In addition, some of the studies are based on re-evaluation of earlier published epidemiological data.

Cohorts of occupationally exposed individuals

Studies of cancer risk among workers in the nuclear industry should be of great value when estimating risk from low and protracted doses [66]. They constitute a stable population group and offer well-documented exposure data, obtained with the aid of personal dosimeters and monitoring of internal contamination. The number of nuclear workers for whom a sufficiently long period of follow-up is available is, however, small. In most of the reports the majority of workers were still relatively young at the end of follow-up. Furthermore, studies of worker populations are confounded by a number of factors, one of which is the so-called healthy worker effect, which describes a selection process in which those who gain employment and remain employed are healthier than those who do not work [67]. This effect may be dealt with properly by choosing a reference group of similar workers who are not exposed to radiation.

Two occupational cohorts that have often been cited as justifying or supporting the hormesis theory are U.S. nuclear shipyard workers and British radiologists. The largest study involved a total cohort of almost 71,000 nuclear shipyard workers from seven shipyards who were employed between 1952 and 1977 [68]. The aim of this study was to evaluate thoroughly the possible risks from occupational exposure among these workers. The results of the study were only published as a report [68]. Two other studies of 24,545 of these workers at one of the naval shipyards, in Portsmouth, New Hampshire [69, 70], refuted a suggestion that there is excess risk of leukaemia among these workers as a result of occupational exposure to ionising radiation. For the subgroup of the original report that received the highest absorbed dose, which comprised around 28,000 persons, the dose was approximately 5–10 times their cumulative dose from background excluding radon. Employees at the same workplaces who were less exposed served as a control group; therefore the "healthy worker effect" should not have applied. Even if this is the largest cohort of nuclear workers studied, it was estimated that the probability of detecting an excess of leukaemia at the level which could be estimated from "official risk figures" was less than 20%.

According to Cameron [12], an important finding not included in the Matanoski report [68] was that the group of shipyard workers who received the highest cumulative radiation dose had a cancer death rate that was more than four standard deviations lower than that in the control group. Another important neglected finding, again according to Cameron, was that the same high-dose group showed a death rate from all causes that was 24% lower (corresponding to 16 standard deviations) than that of the control group. Cameron offers no explanations for these findings other than that they support the theory that radiation doses at this level (5–10 mSv/year) are beneficial for health. What might seem strange in this context is that no other follow-up of this study is found in the published literature, and the result pointed out by Cameron remains to be confirmed or refuted.

Numerous other studies of occupationally exposed nuclear workers have been performed. In general they have observed a slight or no significant increase in cancer risk, and often the exposed groups show better health in general, which is presumably attributable to the healthy worker effect. For a review of these studies the reader is referred to publications by Cardis et al. [66, 71].

The second study to which reference is commonly made is that on British radiologists practising between 1900 and 1980 [72]. Radiologists employed before 1921 were assumed to have received larger radiation doses, and in this group a significantly increased cancer death rate was demonstrated. However, the total death rate was not found to deviate significantly from that in the control groups. This finding has been interpreted by other authors as showing that stimulation of the immune system "cancelled the radiation induced cancer deaths" [12]. The cohort has since been followed up, and a recent publication encompassing 100 years of observation [73] shows the same result. Another study, of breast cancer mortality among radiological technologists in the USA, demonstrated a significantly increased risk in women employed prior to 1940 [74]; this paper may also be referred to for a review of other similar studies.

Among a group of radium dial painters who were occupationally exposed to internally deposited alpha-emitters, the best fit to the data on bone tumour induction was obtained with a threshold model [44]. This indicates that other mechanisms are involved for this tumour type.

Studies of populations exposed for medical reasons

Numerous epidemiological surveys of mortality and risk after diagnostic or therapeutic exposure of patients have been performed. Often the results of these studies are in concordance with findings in the atomic bomb survivors [28, 75]. However, an important exception seems to be lung cancer after exposure to X- or gamma rays. Rossi and Zaider [76] have reviewed the published studies in this field, the most relevant of which are two performed in patients who underwent fluoroscopy [32, 77, 78]. Their conclusion is that the risk for lung cancer after exposure to photons is in all cases significantly lower than that predicted by the ICRP LNT model. For doses around 1 Gy to the lung there even seems to be a beneficial effect.

Indications for beneficial effects of higher natural background radiation levels in comparative epidemiological studies

The normal absorbed dose from natural background radiation is approximately 1 mSv/year, excluding inhaled radon daughters. A number of areas world-wide show an elevated background radiation level of up to several tens of mSv per year, or even considerably more if radon is included. This presents an opportunity for epidemiological studies on the effect of small doses, often of the same order of magnitude as the radiation protection limits recommended by the ICRP.

None of the epidemiological surveys of populations living in areas with a high background radiation level has observed increased cancer mortality compared with a control population in an area with a normal background radiation level. Extensive epidemiological surveys on populations living in Kerala, India [79, 80, 81], Iran [82], China [83] and the USA [84, 85] have been presented in the literature. In one of the most studied areas, Kerala, no detectable increases have been identified in total death rates or in the frequency of chromosomal aberrations or congenital malformations. In Iran, Ghiassi-nejad et al. [82] have studied how the high background influences radiation sensitivity, as reflected by chromosomal aberrations (see the section on radiobiology that follows). In the USA, three Gulf Coast states have been compared with three Rocky Mountain states that have three times higher natural background radiation levels (with radon included, the difference in background is even larger). It was found that, after age adjustment, the cancer death rate in each of the three Gulf Coast states was higher by an average factor of 1.26 than that in the three Rocky Mountain states [84]. The negative correlation is even more pronounced if only cancer of the airways is considered, for which radon plays the major role. These findings may be attributable to confounding factors and large statistical variations, but the results seem too distinct for this to be a plausible explanation.

Radon

A large fraction, and in many countries the largest fraction, of the radiation exposure from natural sources that is received by man comes from inhalation of radon daughters. Radon is an inert radioactive gas produced by the decay of uranium minerals in rocks and soil. Since it is a gas, it is released from the mineral matrix, can migrate towards the surface from its site of creation and will accumulate in enclosed areas, e.g. dwellings or mines. The decay products of radon, the "radon daughters", are also radioactive and most of them emit alpha particles. Inhalation of these will result in exposure of the airways and lungs to alpha particles, causing damage to the epithelial cells. The mechanism by which alpha radiation is carcinogenic is probably different from that of gamma radiation (which, like the X-rays used in diagnostic radiology, is sparsely ionising). The risk for lung cancer is, however, well documented in approximately 20 studies of radon-exposed underground miners [86, 87, 88, 89]. Whether a carcinogenic effect is also observed from radon in homes is less clear, and findings have even been contradictory. There has been a great deal of concern about the public health consequences of such radon exposure, and there are several published epidemiological studies of large population groups who have been living in houses with elevated radon concentrations [86, 90, 91, 92, 93]. The exposure level from radon in dwellings is normally much lower than that in mines, and epidemiological studies have not yet convincingly demonstrated an excess risk of lung cancer in those so exposed. Furthermore, the interpretation of the epidemiological data for miners is complicated by confounding factors, especially smoking, but also other environmental factors such as diesel exhaust and asbestos [94, 95].

In addition to the studies on miners, numerous ecological epidemiological studies have been performed in attempts to examine the association between residential radon exposure and lung cancer [85, 90, 96]. In ecological studies, geographically based lung cancer rates are compared with the mean radon concentrations in the areas under consideration. The ecological study design relies on summary measures, and has major limitations since an individual's current or retrospective radon exposure cannot be assessed [96]. Among the limitations are confounding factors such as smoking habits and occupational factors. Synergistic effects between residential radon and smoking, including passive smoking, have been observed [95, 97], and studies designed to estimate risk for non- or never-smokers have failed to find a positive association [97, 98].

Numerous ecological studies have been published in the literature, 15 of which have been reviewed by Stidley and Samet [96]. In about half of them there was a positive association between radon and lung cancer. One ecological study indicating a highly significant negative dose–risk relationship was published by Cohen [85, 99, 100]. His study was designed to test the LNT theory and was the first to suggest that radon in dwellings has beneficial effects. The result has been called "Cohen's paradox", and it has been the subject of an extensive debate on the effects of residential radon [89, 101, 102, 103] and on the validity of the LNT model in general. The main criticism raised in the debate concerned the underlying epidemiological methods, and specifically whether all forms of confounding factors, and especially the overwhelming factor of smoking, had been taken into account in the analysis of the data [104]. Because of this, many epidemiologists regard the result as anomalous. A conclusion has still not been reached regarding this conundrum, yet the residential radon data are often cited by proponents of hormesis as a strong indication that low-level irradiation of the lungs has a hormetic effect.

Radiobiological mechanisms and the dose-response relationship

Adaptive response

The continuous production of free radicals from radiation and other sources has stimulated cells to evolve a repair system for chromosome breaks. An alteration of the DNA molecule triggers the repair system, and frequent activation may increase the general repair capacity, irrespective of the cause of the damage. Such a radiation-induced "adaptive response" has been convincingly demonstrated in a variety of cultured cells, and the 1994 UNSCEAR report devoted an annex to this subject [13]. In short, the conclusion of this annex was that an adaptive response at the cellular level is a fact, at least for some specified cell lines. It has been demonstrated that by exposing cells to an absorbed dose of the order of 1–50 mGy delivered in a short time, the number of radiation-induced chromosomal aberrations caused by a subsequent acute dose in the range of 1–3 Gy is reduced [13, 105, 106]. Many such laboratory studies have been performed, and a large number of different cells of different origin, including human, have been utilised [107, 108, 109, 110, 111, 112, 113]. In these experiments the radiation sensitivity has usually been expressed in terms of the frequency of double strand breaks. The results of these experiments are in accord with the recent observation by Ghiassi-nejad et al. [82] that lymphocytes taken from people living in a high background area in Iran (up to 260 mSv/year), and irradiated with X-rays to a dose of 1.5 Gy, had only 55% of the chromosomal aberrations observed after the same dose to lymphocytes from residents in normal background areas.

The adaptive response in human lymphocytes is characterised by a large individual variability, as observed in various experiments [114, 115, 116]. A plausible explanation for this could be variations in the ability to adapt during the cell cycle [115, 117], which would parallel variations in radiation sensitivity with the phase of the cell cycle.

The biological mechanisms underlying the adaptive response are rather unclear. It seems that a certain number of specific types of DNA lesion need to occur within a fixed time. Robson et al. [118] have isolated a novel gene that may play a role in induced radioresistance. For a more general review of the molecular mechanisms underlying the radio-adaptive response, the reader is referred to Sasaki et al. [119]. The picture is further complicated by another effect, hypersensitivity to very low radiation doses [120, 121]: low acute radiation exposure, or exposure at very low dose rates, was found to be more effective per unit dose in causing DNA damage. It has been suggested that the adaptive response and the hypersensitivity are different manifestations of the same underlying mechanisms.

UNSCEAR [13] reported that evidence for an adaptive response in human populations has so far been neither clearly demonstrated nor refuted. The question remains as to how to extrapolate the observed adaptation effects obtained from in vitro irradiation of cells to the probability of induction of a cancer cell, and consequently to the risk for cancer induction in an organ. Other open questions concern how long the adaptive response lasts and whether it is in reality efficacious only during chronic exposure. Feinendegen and Pollycove [50] have suggested that the experimental data indicate a dual cellular response to low-dose irradiation: one part causes DNA damage while the other part signals to stimulate the mechanisms that control DNA damage (e.g. repair or apoptosis). Below a certain dose level (≈200 mSv), the number of DNA alterations caused by other factors exceeds the number caused by radiation, and thus there is a possibility that stimulating the control mechanisms by irradiation will result in less DNA damage than would occur without radiation. This would explain the mechanisms behind the hormesis model. Sasaki et al. [119] state that for mammalian cells the optimal dose for a radio-adaptive response is below 100 mSv.

Effects on growth and life span

It was demonstrated very early that ionising radiation might have a stimulating effect on plant growth [11, 52, 53]. Stimulated growth or proliferation has been observed in vitro [122], even if, according to UNSCEAR [13], this phenomenon has not yet been convincingly demonstrated under chronic exposure. Also, irradiation of mice in vivo produced similar results in haematopoietic cells [123].

Employing a different approach to study the importance of normal levels of background radiation, Planel et al. [124] found that background shielding significantly reduced the proliferation of a protozoan and a cyanobacterium. The cell cultures were shielded from the normal background using lead, and it was also observed that introduction of a weak radiation source inside the lead cave resulted in restoration of the growth rate. A similar experiment with mammalian V79 cells has been reported by Satta et al. [125]. They did not observe the same significant decrease in growth rate for the shielded cells, but they noted differences in the expression of anti-oxidant enzymes between the two cell cultures, in accordance with the hypothesis that environmental radiation may modulate the cellular metabolism. It has been estimated that background radiation of 1 mSv/year produces 0.005 DNA alterations per cell per day, which, after enzymatic repair and further reduction by apoptosis and immune system removal, results in about 10⁻⁷ radiation-induced mutations per cell per day [47].
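Put alongside the spontaneous rates quoted earlier from Pollycove and Feinendegen [47], this bookkeeping can be summarised in a short back-of-the-envelope Python sketch; the figures below are only those already quoted in the text, and the arithmetic is illustrative rather than a mechanistic model.

# Per-cell bookkeeping using only figures quoted in the text ([47]).
radiation_alterations_per_cell_day = 0.005   # initial DNA alterations from ~1 mSv/year background
radiation_mutations_per_cell_day = 1e-7      # what persists after repair, apoptosis and immune removal
spontaneous_mutations_per_cell_day = 1.0     # roughly one persisting spontaneous alteration per cell per day

# Fraction of radiation-induced alterations that escape the defence mechanisms,
# and the ratio of spontaneous to radiation-induced persisting alterations.
print(radiation_mutations_per_cell_day / radiation_alterations_per_cell_day)   # ≈ 2e-05
print(spontaneous_mutations_per_cell_day / radiation_mutations_per_cell_day)   # ≈ 10^7, as stated in [47]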

Support for the hormesis theory, as far as the life span of organisms is concerned, has been provided by Caratero et al. [126]. They irradiated 600 mice continuously with gamma rays at 25–50 times the normal background level, and found that life span was significantly increased in these mice as compared with the non-irradiated control group. This finding supports the observation by Müller [60] that mice with incorporated alpha-emitters lived longer than a control group, provided they were tumour free. It also increases the credibility of some of the epidemiological observations concerning longevity. Data for humans, however, are contradictory in this respect. In addition to a few epidemiological studies reported above, a subcohort of A-bomb survivors from Nagasaki who had been exposed to 0.5–1.5 Sv and were still alive in 1970 has been investigated. A Japanese group [63, 127] reported a significantly lower mortality from non-cancerous diseases in male subjects in this group. This result was not supported by a later study of a larger group of survivors by Cologne and Preston [65], who reported an overall decrease in the survival rate as a result of the exposure. A comprehensive study of non-cancer mortality among the atomic bomb survivors, by Shimizu et al. [128], showed a significant increase in non-cancer mortality for the whole exposed group, but the data were statistically consistent with curvilinear dose-response functions positing essentially zero risk for doses below 500 mSv. These contradictory results can probably be explained by the choice of control group, which is crucial for the data analysis.

Stress-derived radiation hormesis

Given the universality of a stressful environment, all people are exposed to abiotic stress and must adapt to a variety of environmental agents [55, 58]. Hormesis in this context derives from the adaptation of metabolic reserves to environmental extremes over evolutionary time, and ionising radiation is one component of this environment. Hormetic effects of mild stress of varying nature have been reported in the literature [129]. These effects are expressed as longevity or as increased resistance to different types of stress agent. Provided that there is no fundamental distinction between stress from irradiation and stress from other agents, this may be taken as an indication for the radiation hormesis model; this is, however, an important qualification, and it remains to be confirmed that no such distinction exists. A complicating factor in interpreting data on radiation effects may also derive from stress in a general sense; thus, Boreham et al. [130] have demonstrated stress-induced radiation resistance in yeast. Another effect worth noting, even if it is hard to elucidate how it affects the risk at low doses, is that the presence of stress-induced proteins has been demonstrated during chronic exposure to 2 mGy/h [13] as well as at low acute doses (20–500 mGy) [118, 131].

Stimulation of the immune system and multifactorial diseases

Observations of changes in the immune system after exposure imply that radiation might play an essential role in the immunocompetence of the living organism [13, 132]. It has been demonstrated that the responsiveness of the immune system to infection by common pathogens is impaired in heavily exposed A-bomb survivors [132]. In the ongoing debate about the effects of low doses, the opposite claim has been made, i.e. that low doses stimulate the immune system, leading to beneficial effects with regard to diseases and causes of death other than cancer [11, 133]. The immunological consequences of chronic low-dose irradiation (on average 25 mSv/year for 8 years) of residents of radioactive buildings in Taiwan were studied by Chang et al. [134]. It was concluded that for this exposure situation there was a significant depression in the CD4+ lymphocyte count, while the other lymphocyte populations were not significantly affected. In reality, the degree of the effect of irradiation on the immune system may vary depending on which type of function is studied. The increased life spans demonstrated by some epidemiological studies and animal experiments (see above) might be explained by effects of radiation exposure on the immune system.

It has recently been recognised that naturally occurring multifactorial diseases might, theoretically, be induced by ionising radiation. The ICRP has therefore recently published a report reviewing relevant data on multifactorial diseases [135]. This question has not been raised specifically in the hormesis debate; however, any radiation-induced risk for these diseases would tend to shorten the life span.

Bystander effect and genomic instability

The hormesis effect, defined as a biphasic dose-effect relationship, arises from the interaction in the cells between molecules affected by exposure to different cancer-inducing and/or -promoting agents. During the last 10 years, evidence has been accumulating to challenge the dogma that the carcinogenic properties of radiation rely mainly on initiating damage to the DNA; this evidence pertains to the existence of two related phenomena, the "bystander effect" and "genomic instability". Recent technical developments have made it possible to selectively irradiate a single cell with a micro-beam of alpha particles, or some other densely ionising radiation [136, 137]. With this technique it is also possible to irradiate only a part of the cell. A number of experiments with such equipment have demonstrated that effects of relevance for cancer induction and development can occur when only the cytoplasm is irradiated [137, 138]: obviously, some targets must exist outside the nucleus, and there must be some mechanism by which damage to a specific molecule in the cytoplasm causes a DNA alteration. Furthermore, non-irradiated neighbouring cells have been observed to show similar changes to irradiated cells [139, 140, 141, 142, 143]; this is the so-called bystander effect. The phenomenon could also be observed when only the cytoplasm of a neighbouring cell was irradiated. How the bystander effect influences the probability of cancer induction is not obvious. It has been suggested that "the radiation risk to low fluences of alpha particles may be higher than we thought" [144, 145], but the opposite opinion has also been voiced [146]. Damage to "strategic targets" in the cytoplasm may be more dangerous than damage to the nuclear DNA, since it is accomplished with little or no killing of the target cells. Especially for an understanding of the mechanisms underlying the carcinogenic effects of high-LET radiation, the bystander effect and the genotoxic effects on cells originating from the traversal of alpha particles through the cytoplasm may be of great importance. The bystander effect has received a great deal of attention, particularly in connection with the risk from radon exposure [144, 147, 148, 149]. A better understanding of these phenomena may be crucial in explaining the unexpectedly low risk that has been reported from radon in homes (see the earlier section on radon).

A related effect, whereby chromosomal damage is manifested in cells that have not themselves been exposed to ionising radiation but are the progeny of cells irradiated several generations earlier, is known as radiation-induced genomic instability [150, 151, 152, 153, 154, 155]. Such damage may take different forms, including an increased mutation rate, cell death and an increased rate of chromosomal aberrations, all of which are characteristic of ionising radiation. There is a close correlation between genomic instability and carcinogenicity, and the effect has also been demonstrated following irradiation of a bystander [156, 157]. That the phenomenon is not an artefact of growth in vitro has also been demonstrated [156]. The key to the molecular mechanisms underlying genomic instability and the bystander effect remains elusive, but may be hidden in the recent findings that the cytoplasm seems to be an important target for genotoxic effects of ionising radiation [137]. There is evidence that irradiation might mediate its effects via extranuclear or even extracellular events [157]. Even if genomic instability has been reported to be a quite frequent result of exposure to ionising radiation [150], this will not influence current risk estimates, since the epidemiological data already include the contribution from cases in which genomic instability has played a role in cancer induction. It may, however, be of great importance to take this effect into account when extrapolating to lower doses. The frequency of genomic instability after very low doses is unclear, and whether this phenomenon plays a role in promoting a hormetic effect is doubtful. Future research may come up with an answer in either direction.

Conclusions

A number of selected epidemiological studies within the literature provide some evidence for hormesis. In general, however, current epidemiological data do not supply enough evidence to justify a belief that radiation hormesis is a common phenomenon for a wide spectrum of irradiation situations or for a population composed of persons of all ages. Some exceptions are worth noting: Special attention should be paid to studies of populations living in areas with high background radiation levels. Even if there are substantial difficulties in interpreting these environmental data due to uncertainties and confounding factors, they may in the future provide a basis for stronger evidence of hormesis. It is also remarkable that no published data have been found presenting harmful effects from elevated background radiation.

Often the statistical power of epidemiological studies is too weak to demonstrate a significant effect of exposure to ionising radiation. Epidemiological evidence for increased cancer mortality after exposure to low doses is lacking, but recent careful analyses have demonstrated that the LNT model retains some validity down to approximately 100 mSv. For absorbed doses below this level, it is judged by the radiation protection community that the LNT model is the most plausible and relevant. The LNT model is thus the current basis for estimating the risk for cancer induction from ionising radiation, and will probably remain so, at least for the near future.

There are some exceptions, however. Data for leukaemia, bone cancer and lung cancer offer weak evidence pointing to a threshold or even to a hormetic model of dose response.

Adaptive response to radiation is an effect that has been convincingly demonstrated in cultured cells. There is, however, still doubt over how this influences the risk for a multicellular organism, and also over the duration of the radio-adaptive effect. Recent research in radiobiology has yielded many interesting discoveries that may influence the understanding of the effect of ionising radiation at low doses. It is important to note the conclusion that biological effects of relevance to the initiation or development of cancer may be attributed not only to damage to the DNA in the nucleus but also to damage to other sensitive sites in the cell; indeed, the latter form of damage may even be the principal effect of radiation.

If a hormetic effect of radiation exists, it seems to be rather weak and inconsistent, and in such a situation the precautionary principle requires a pessimistic assumption for safety reasons. Overall, there is currently insufficient evidence for radiation hormesis to warrant any far-reaching change in the present radiation protection policy. Nevertheless, the picture is not simple; it should not be excluded that under special circumstances, such as under certain exposure conditions or within particular population groups, limited exposure to ionising radiation may be beneficial with regard not only to cancer induction but perhaps also to other health parameters.