Introduction

Radiation is energy emitted from a source and propagated through space or matter. It is classified as either ionizing radiation (IR) or nonionizing radiation (NIR). NIR has sufficient energy to move atoms around or cause them to vibrate, but not enough to remove tightly bound electrons from their orbits around atoms. Examples of NIR include microwaves and ultrasound waves, and NIR is also used in magnetic resonance imaging; these forms of NIR are present in our daily lives, and ultrasound and magnetic resonance imaging are often used in medical examinations. In contrast, IR has sufficient energy to ionize the atoms or molecules with which it interacts, removing tightly bound electrons from their orbits. IR can be categorized as either electromagnetic or particulate. Electromagnetic IR consists of γ-rays and X-rays, which can penetrate human tissues; exposure to γ-rays and X-rays can therefore cause serious damage to organs. Particulate IR includes alpha and beta particles, which can penetrate only a few millimeters of skin. This lack of penetrating power means that these particles cause little external damage to organisms, but they may act as carcinogens or have other adverse health effects when ingested or inhaled (Groen et al. 2012; McLean et al. 2017; Christensen et al. 2014a; Marazziti et al. 2012; Philchenkov and Balcer-Kubiczek 2016). Both alpha and beta particles are strongly ionizing and can disrupt atomic structure to produce chemical and biological changes: one alpha particle can ionize on the order of 10,000 atoms, and one beta particle about 100 atoms. However, because they expend all their energy on ionization, they are quickly stopped; hence, alpha and beta particles cannot penetrate far. Gamma rays differ from alpha and beta particles in that they are far less densely ionizing, producing far fewer ionizations per unit path length (Christensen et al. 2014b). IR is widely used in many fields, especially in medicine; indeed, one of the greatest sources of IR exposure is medical radiation, where IR serves as a diagnostic and treatment modality (Miousse et al. 2017b). IR is also a ubiquitous environmental stressor, present in our daily lives and originating from both natural and man-made sources (Reisz et al. 2014); everyone is exposed to low doses of natural and anthropogenic IR every day (Dartnell 2011; Sokolov and Neumann 2015). At the lowest doses, IR causes few acute health effects, whereas at higher doses it can cause acute radiation syndrome and death (Rezaeejam et al. 2015).

Hazards of IR

High-dose and high-dose-rate acute radiation damage

In recent years, there has been increased interest in the risks of IR, as it can cause significant changes to the components of cells and serious damage to organisms. Epidemiological studies, animal experiments, and in vitro studies have clearly characterized these risks (Christensen et al. 2014b; Jin et al. 2010), and substantial attention has been focused on the deleterious effects of IR on organisms (Dartnell 2011). Animal studies have clearly shown that high-dose radiation (HDR) can lead to cancer and a shortened life span, with radiation-induced cancer risks evident at doses above 50 mSv (Yoo et al. 2014); HDR may also carry an additional, ill-defined risk of noncancerous conditions, such as cardiovascular disease (CVD), atherosclerosis, neurodegenerative effects, and cataracts (Marazziti et al. 2012; UNSCEAR 2010).

Occupational or nuclear accident (NC) exposure can have serious health effects, as in the Chernobyl accident (CA) and the Fukushima accident, which had devastating effects on hundreds of thousands of people (Miousse et al. 2017a; Zeegers et al. 2017; Saenko et al. 2011). The International Atomic Energy Agency classified CA as “the greatest nuclear catastrophe in human history” (Philchenkov and Balcer-Kubiczek 2016). A number of early CA emergency workers developed acute radiation sickness, and 28 early deaths occurred (UNSCEAR 2005). In addition to acute illness, many survivors of Chernobyl, Nagasaki, and Hiroshima also suffered leukemia; thyroid, breast, and skin cancers; and cataracts (UNSCEAR 2011; Douple et al. 2011; Shore et al. 2010; Fujimichi and Hamada 2014). Those who survive the acute phase following intense exposure remain at risk of sepsis and gastrointestinal and metabolic complications (Christensen et al. 2014a). Long-term monitoring of the impact of Chernobyl’s radioactivity on fauna showed an increased occurrence of tumors and immunodeficiencies, decreased life expectancy, early aging, and changes in the blood and circulatory system (Tang et al. 2017; Domina 2016). In the aftermath of CA, congenital malformations increased across Europe (Wang et al. 2012; Wertelecki 2010; Sperling et al. 2012), and the disaster altered the human birth sex ratio at the national level across Europe (Scherb et al. 2015). Significant biological effects also occurred after the Fukushima Dai-ichi nuclear power plant accident. The biological effects of IR are related to the physical nature, duration, dose, and dose rate of exposure (Pernot et al. 2012). A study of atomic bomb (A-bomb) survivors in Japan reported that solid cancer mortality risk increased by 50% at a colon dose of 1 Gy, and that leukemia mortality risk quadrupled at a red bone marrow dose of 1 Gy (McLean et al. 2017). The risks of blood, breast, and other cancers were significantly increased in A-bomb survivors (Ozasa et al. 2012; Goto et al. 2012), and sarcomas can appear near an originally irradiated tumor (Kutanzi et al. 2016). The Life Span Study (LSS) has provided fundamental information on cancer risk in A-bomb survivors in Hiroshima and Nagasaki, and the dose dependence of cancer risk has been confirmed for survivors exposed to HDR (Sasaki et al. 2014). Moreover, moderate to high doses of IR are the only established environmental risk factor for brain and central nervous system tumors (Braganza et al. 2012). Another study showed that brain damage and cognitive impairment occurred in mice that received 100 mGy of gamma irradiation (Lowe and Wyrobek 2012). Owing to the potential risk of nuclear and/or radiological events, IR is a public health concern, not just a concern for cancer patients (Ryan 2012). The health effects of high-dose and high-dose-rate IR are shown in Fig. 1.
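
For illustration, these LSS mortality figures can be restated as an excess relative risk (ERR) per unit dose. Under a simple linear model (a deliberate simplification; the LSS leukemia dose response, for instance, is usually described as linear-quadratic), the relative risk at dose $D$ is

\[ \mathrm{RR}(D) = 1 + \beta D . \]

The reported 50% increase in solid cancer mortality at a colon dose of 1 Gy then corresponds to $\beta \approx 0.5\ \mathrm{Gy}^{-1}$, i.e., $\mathrm{RR}(1\ \mathrm{Gy}) = 1.5$, and a purely linear extrapolation would give $\mathrm{RR}(0.1\ \mathrm{Gy}) \approx 1.05$; likewise, the quadrupled leukemia mortality at a red bone marrow dose of 1 Gy corresponds to $\mathrm{RR}(1\ \mathrm{Gy}) = 4$.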

Fig. 1 Health effects of high-dose and high-dose-rate IR

Low-dose and low-dose-rate chronic radiation damage

Increasing attention has been focused on the deleterious effects of IR on organisms. HDR is known to be detrimental, but the health effects of low doses of radiation are not adequately understood. Humans are unavoidably exposed to low doses of natural and anthropogenic IR every day, and IR is widely used in healthcare, industry, research, and other fields, which further increases human exposure (Miousse et al. 2017b).

Nuclear-related radiation exposure: long-term radiation

Those who work in the nuclear industry are usually subject to both external and internal radiation exposure. Research has shown that the risks of solid cancer and leukemia among nuclear workers are consistent with LSS estimates, even though such workers accumulate their dose at low dose rates over many years. The International Nuclear Workers Study likewise showed that, at total accumulated doses below 100 mGy, workers’ solid cancer risk is consistent with the LSS estimate. Moreover, medical radiation workers such as radiologists and radiation technicians have increased risks of a variety of tumors, including leukemia, skin cancer, and, among women, breast cancer. The risk of cataracts among medical workers may also be increased, owing to their frequent use of X-ray imaging to guide interventions. In addition, the risk of lung cancer among underground hard-rock miners is elevated and has been related to exposure to radon gas and its radioactive progeny (McLean et al. 2017). Another study showed that mortality from Alzheimer’s disease among white female radiological technicians was increased compared with that of workers in other occupations (Marazziti et al. 2012), and a further study found increased mortality from mental disorders among female workers at US nuclear weapons plants (Sibley et al. 2003). Studies of nuclear industry employees have also shown increased risks of leukemia and solid cancer even when cumulative doses were below 100 mSv and dose rates were below 10 mGy per year. Moreover, an epidemiological study has highlighted the detrimental health effects of exposure to low-dose and low-dose-rate IR (Hall et al. 2017).

Medical radiation exposure: radiotherapy and medical diagnosis

Humans are exposed to IR in every walk of life owing to its diverse uses, from medical diagnostics to industrial applications; IR is both a component of our environment and an important tool in medicine (Miousse et al. 2017a). Human exposure to IR is unavoidable: in computed tomography (CT) scanning, for example, a commonly used diagnostic tool, patients are exposed to low-dose radiation, and concomitant medical and surgical conditions can further increase morbidity and mortality (Leng et al. 2015). For the patient, each interventional procedure involves relatively high radiation exposure, which implies that interventional cardiologists, working near both the patient and the radiation source, also undergo significant occupational exposure (Gerber et al. 2009; Vano 2003). Because each operator performs hundreds or thousands of procedures each year, the operator’s cumulative dose is not negligible. Many procedures thus entail radiation exposure to both patient and operator, and the high levels of IR exposure to patients and treating staff are a major social and scientific problem (Food and Drug Administration 2010). Patient exposure to IR has increased significantly as the use of medical imaging has expanded greatly, raising concerns about whether IR can cause cancer among patients (Thaker et al. 2015). Concern about the exposure of patients and operators has grown alongside the accumulated knowledge about the hazards of IR (Hadelsberg and Harel 2016).

There is also a large body of literature on the beneficial applications of IR exposure: IR is used not only in medical diagnosis but also in radiotherapy (RT). Historically, RT has mainly been used to control local disease; precisely delivered high doses increase local tumor control and reduce the metastatic burden (Mujoo et al. 2018). More than 60% of patients with malignant tumors receive RT (Sun et al. 2018). However, the IR to which humans are exposed during RT and medical radiodiagnosis can cause degenerative diseases and oxidative damage (Mulinacci et al. 2018). Research has shown that diagnostic radiation examinations contribute ~40% of the total annual worldwide exposure from all sources in developed countries, making them the largest man-made source of radiation exposure (Tang et al. 2017). For example, over 70 million CT scans are performed annually in the USA alone (Brenner 2010; Moding et al. 2013). Recently, the risk of IR exposure in CT has drawn attention, as IR from CT can cause direct damage to DNA strands (Hall 2009). Owing to the potential carcinogenic effect of IR, the health of patients undergoing CT may be harmed (Schmidt 2012). Research has shown that the incidence of cancers such as leukemia and brain cancer increases after exposure to CT scans, with a positive association between CT radiation dose and leukemia (Pearce et al. 2012; Mathews et al. 2013; Bharadwaj and Rocker 2016); these cancers are related to scans of the head, chest, abdomen, and pelvis (Berrington de González et al. 2009). It has also been shown that RT can induce heart disease, including accelerated atherosclerosis, adverse myocardial remodeling, conduction abnormalities, and injury to cardiac valves (Martinou and Gaya 2013; Boerma et al. 2016). Among Hodgkin’s lymphoma patients who received radiation, CVD is one of the most common causes of death; studies have shown that these patients have increased risks of coronary artery disease, valvular heart disease, congestive heart failure, pericardial disease, and sudden death (Baselet et al. 2016). Survivors of childhood cancer are at high risk of developing late side effects of RT (Akam-Venkata et al. 2016; Tukenova et al. 2010). The relationship between the site of medical radiation exposure and health effects is shown in Table 1.

Table 1 Relationship between the site of medical radiation exposure and health effects

Natural environmental and daily life radiation exposure is unavoidable

Humans are unavoidably exposed to low doses of natural IR; indeed, life on earth has always been exposed to IR from natural sources. For example, unavoidable exposure to natural radon in the home increases the risk of lung cancer, especially for smokers. Research has shown that the annual per capita radiation dose was 3.6 mSv in the early 1980s, of which medical sources contributed only 0.54 mSv, with the remainder attributable to cosmic rays, radon, soil, and construction materials. The risk of childhood leukemia was increased by low-level internal exposure from nuclear weapons testing fallout, consistent with the risks estimated in the LSS, and researchers consider that there is a statistically significant risk of leukemia related to exposure to natural sources of gamma radiation. The likelihood of misuse or accidents has also increased with the widespread use of IR in daily life (McLean et al. 2017; Zeegers et al. 2017; Kutanzi et al. 2016; Thaker et al. 2015). The health effects of low-dose and low-dose-rate IR are shown in Fig. 2; as shown in Fig. 3, IR can increase the risk of cancer and other conditions. The molecular and cellular mechanisms underlying the health effects of IR are summarized in Table 2.

Fig. 2 Health effects of low-dose and low-dose-rate IR

Fig. 3 IR can cause a range of human injuries, which can in turn increase the risk of cancer and other conditions

Table 2 Molecular and cellular mechanisms of health effects of IR

Radiation prevention and protection

Protection standards and radioprotective drugs

It is important to protect people from damage due to IR; to this end, the ALARA principle was introduced in 1973. ALARA stands for “as low as reasonably achievable” and calls for minimizing the duration of exposure to radiation, maximizing the distance from the radiation source, maximizing shielding between individuals and the source, and minimizing the amount of radioactive material handled. The aim of this principle is to minimize radiation exposure to patients (Mitchel 2015; Petersen et al. 2012; Christensen et al. 2014a). The linear nonthreshold model has been adopted in the development of radiation protection standards (US National Academy of Sciences 2006; ICRP 2007). Moreover, numerous radioprotective drugs have been developed. Domina (2016) proposed a new classification of radioprotective drugs: (i) radioprotectors, antiradiation drugs that act via physical and chemical protection; (ii) radiomitigators, which act at the systemic level; and (iii) radiomodulators, which increase the body’s resistance to adverse environmental factors (Weiss and Landauer 2009). Radiomitigators can accelerate post-radiation recovery of radiosensitive tissues by activating anti-inflammatory signaling pathways and increasing the secretion of hematopoietic growth factors (Vasin 2013); their activity is greatest, almost exclusively so, during radiation injury of the hematopoietic system. These drugs are natural compounds that exert antimutagenic, anti-inflammatory, and antioxidant effects (Izzi et al. 2012), and their mechanism of action involves increasing the general resistance of the organism and decreasing cancer risk (Epperly et al. 2011). At present, xanthine nucleosides, caffeine, and inosine are considered able to reduce radiation-related risks; research has shown that caffeine can activate mechanisms of DNA repair and post-radiation recovery (Popova et al. 2014). Moreover, potassium iodide can block thyroidal uptake of radioactive iodine, reducing the risk of thyroid cancer; the United States Food and Drug Administration has accordingly approved potassium iodide as a protective agent against radioiodine exposure (Kutanzi et al. 2016).
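
Two of these principles can be made quantitative with elementary formulas (an idealized illustration, not a regulatory prescription). Under the linear nonthreshold model, the excess risk $E$ attributed to a dose $D$ is taken to be strictly proportional to dose, with no threshold below which it vanishes:

\[ E(D) = \alpha D, \qquad D \ge 0 , \]

so every avoided increment of dose is assumed to buy a proportional reduction in risk, which is what motivates ALARA. Likewise, for a point source in air (ignoring scatter and absorption), the dose rate falls with the inverse square of distance,

\[ \dot{D}(r) = \dot{D}(r_0)\,(r_0/r)^2 , \]

so doubling the distance from a source cuts the dose rate to one quarter.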

Protection of staff, patients, pregnant women, and the general public

Owing to the rapid development of medical techniques based on IR, many procedures entail radiation exposure to both patient and operator, and the high levels of IR exposure to patients and treating staff are a major social and scientific problem. Thyroid shields, overhead radiation shields, and lead aprons can reduce the radiation dose to the operator’s neck and head; these measures are used in most interventional radiology fluoroscopy rooms and cardiology imaging laboratories (Picano et al. 2012). All imaging staff should minimize both radiation dose and exposure according to the ALARA principle, for which shielding is one effective measure (Curtis 2010). Patient diagnosis should be performed at the lowest possible dose that maintains suitable image quality. Moreover, review of examination requests by medical staff can eliminate unnecessary and duplicate examinations and allow alternative imaging modalities, such as ultrasound, to replace those that use IR. For the radiation protection of patients, it is important that medical staff are aware of alternative examinations and dose-reduction measures (Darnell and Morrison 2016; Slovis 2002a, b). The United States Food and Drug Administration has issued recommendations for reducing the risks of CT in pediatric and small adult patients, including reducing multiple scans, eliminating inappropriate referrals for CT, and using tube current modulation (Food and Drug Administration 2002). In recent years, various organizations have also put forward measures for radiation prevention and protection, such as limiting medical radiation doses and requiring employees who may be exposed to radiation to monitor and report their exposure (Wood 1994). Medical personnel are likewise required to report the radiation dose when patients are exposed to IR; in 2012, a Californian law was introduced requiring that certain dose parameters of all diagnostic CT examinations be reported in radiological reports (Boone et al. 2012). For pregnant women, prevention and protection measures are essentially the same as those for the general public: keep a safe distance, do not ingest contaminated water or food, and shield the body from exposure (Mettler Jr and Voelz 2002). Pregnant women should cover their mouths and noses to minimize inhalation when potentially exposed to alpha-emitting particles in the environment (Harrison and Stather 1996), and, if exposed to radioactive contamination, should rinse the exposed parts with water immediately to mitigate long-term effects (Oak Ridge Institute for Science and Education (ORISE) 2011; IAEA 2005).
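
The effect of shielding can be illustrated with the standard narrow-beam attenuation law (an idealization; the linear attenuation coefficient $\mu$ depends strongly on photon energy and on the shielding material):

\[ I(x) = I_0\, e^{-\mu x}, \qquad \mathrm{HVL} = \frac{\ln 2}{\mu} , \]

where $I_0$ is the unshielded beam intensity, $x$ the shield thickness, and HVL the half-value layer. A shield $n$ half-value layers thick transmits $2^{-n}$ of the incident beam; three half-value layers, for example, transmit 12.5%.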

Education

The measures for radiation prevention and protection of patients and medical personnel fall into two broad categories: reducing the radiation dose and reducing unnecessary tests. Education, such as individual education and technologist training, is necessary to achieve both (Thaker et al. 2015). Patients are usually less aware than medical staff of the effects of radiation. The primary method for reducing the radiation exposure of medical staff and patients is to build awareness of radiation physics and the harm that radiation can cause: education can increase the awareness among personnel working with radiation of the damage it can cause and improve compliance with measures to limit exposure. Research has shown that annual use of educational software can increase awareness, strengthen knowledge about the use of protective shields, and reduce the doses to which staff and patients are exposed. Dose reduction can be achieved by using low-dose protocols in the operating room, using newer low-dose or nonradiation imaging devices, and using pulsed imaging modes, among other measures; the treating physician should advocate the reduction of radiation exposure (Hadelsberg and Harel 2016). Increasing operators’ awareness of radiation protection is very effective, reducing occupational exposure by up to 90% (Vaño et al. 2006). Medical staff are critical in this context, as they can provide appropriate education to pediatric patients and their families; indeed, one of the most valuable ways of minimizing the risk of IR is to educate medical staff, doctors, patients, and parents (Darnell and Morrison 2016). High-quality education and training programs should deepen the understanding of radiation and raise awareness of the importance of radiation protection.

Screening of ionizing radiation biomarkers

Nuclear terrorism and radiological events are potential threats to human life that could lead to large numbers of people being exposed to radiation. Because of these threats, we need biomarkers that can be used to rapidly assess individual radiation doses: biomarkers not only enable estimation of the radiation dose and detection of the effects of radiation on health (Pernot et al. 2012) but are also applicable to clinical and medical management. Early biomarkers of radiation injury are critical for the triage, treatment, and follow-up of large numbers of people exposed to IR after terrorist attacks or radiological accidents, and for the assessment of radiation toxicity before, during, and after RT (Guipaud 2013). Zeegers et al. (2017) showed that biomarkers can be used to estimate the absorbed radiation dose in certain accidents involving nonnuclear workers and the general public. However, the health effects of long-term exposure to radiation remain unclear, so suitable molecular markers are needed to clarify them (Shimura et al. 2013).

Proteomics is an active area in radiation biomarker research

A biomarker is defined as any measurement of the interaction between a biological system and an environmental factor; biomarkers may be physical, chemical, or biological (WHO 1993). They can be used in epidemiological surveys for a variety of purposes (Grandjean 1995), including surveys of individual susceptibility and early detection of the effects of radiation on health, as well as the estimation or validation of exposure dose. Research has shown that biomarkers of continuous effects can assess the effects of radiation after long-term exposure. Organisms respond to radiation by changing the expression levels of proteins and their post-translational modification states; thus, radiation-associated biomarkers have been identified by screening for differentially expressed proteins in biological fluids and tissues. The use of tissue samples in this context is mainly limited to radiation-induced cancer. The main advantage of using tissue is that protein expression is tissue-specific and some biomarkers will be expressed maximally in the target tissue; the main drawback is that such specimens are difficult to obtain, requiring a biopsy or autopsy. By contrast, protein samples from biological fluids (e.g., urine, serum, or saliva) can be collected noninvasively or semi-invasively, and high-throughput proteomics technology can be used to quantify radiation-induced changes in protein expression (Pernot et al. 2012; Tapio 2013). Research has indicated that emerging proteomics methods are promising and powerful tools for discovering new biomarkers of radiation exposure (Zeegers et al. 2017). Quantitative proteomics techniques are the most common strategy for identifying tumor markers (Rosell et al. 2013; Pernemalm et al. 2013). Because of the wide dynamic range of the proteome and the limited resolution of early instruments, finding new biomarkers in body fluids was once difficult. However, with the development of fractionation methods combined with high-precision, high-resolution mass spectrometry (Pernemalm and Lehtio 2014), along with progress in affinity array analysis (Ayoglu et al. 2011; Gold et al. 2012), researchers now routinely apply proteomics techniques to biomarker discovery. For example, Yi et al. (2017) used a proteomics approach to identify several differentially expressed proteins in liver tissues of C57BL/6J mice receiving low-dose 137Cs radiation for 180 days and found that the CRT protein is a potential early-warning biomarker of low-dose or low-dose-rate IR. Moreover, Byrum et al. (2017) used proteomics to analyze the plasma of nonhuman primates receiving Co-60 whole-body irradiation and found a panel of plasma proteins with characteristic time- and dose-dependent changes, which are potential biomarkers of radiation exposure. In another study, 2D gel electrophoresis-based proteomics was used to measure protein expression in 30 locally exposed clinical patients receiving fractionated radiation treatment, as well as in three radiological accident victims exposed in 1994; the techniques applied could not confirm proteome changes in the locally irradiated patients, but such changes were observed in the accident victims (Nylund et al. 2014).
When used in association with multivariate statistics, proteomics can measure the levels of hundreds or thousands of proteins simultaneously and identify the proteins that distinguish individuals in different groups (Guipaud 2013). Researchers have found many valuable biomarkers using proteomics methods (Bystrom et al. 2014; Hathout et al. 2014; Mehan et al. 2014; Quon et al. 2016); a schematic example of the underlying screening step is sketched below.
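
As a schematic illustration of this screening step (a minimal sketch, not the pipeline of any study cited above; the intensity matrix, group labels, and effect sizes are hypothetical placeholders), differentially expressed proteins can be ranked with a per-protein Welch's t-test followed by Benjamini-Hochberg control of the false discovery rate:

```python
import numpy as np
from scipy import stats

def screen_differential_proteins(intensities, exposed_mask, fdr=0.05):
    """Rank proteins by differential expression between exposed and control
    samples (rows = samples, columns = proteins); return the column indices
    that pass Benjamini-Hochberg FDR control."""
    exposed = intensities[exposed_mask]
    control = intensities[~exposed_mask]
    # Welch's t-test per protein (unequal variances between groups)
    _, pvals = stats.ttest_ind(exposed, control, axis=0, equal_var=False)
    # Benjamini-Hochberg step-up procedure on the sorted p values
    m = len(pvals)
    order = np.argsort(pvals)
    passed = pvals[order] <= fdr * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    return order[:k]

# Hypothetical example: 20 samples x 500 proteins of log-scale intensities,
# with the first 10 samples "irradiated" and 5 truly responsive proteins
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 500))
exposed = np.arange(20) < 10
X[exposed, :5] += 1.5
print(screen_differential_proteins(X, exposed))
```

In practice, the resulting shortlist would then feed the multivariate analysis described above, to test whether the selected proteins jointly separate exposed from unexposed individuals.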

Genomics: high-throughput technology for identifying biomarkers

Ideal biomarkers share some common characteristics, such as reproducibility, sensitivity, known variability, and specificity. To be usable in large-scale molecular epidemiological studies, ideal biomarkers must also be analyzable in biological samples collected by noninvasive procedures (Ryan et al. 2007). Two effective approaches to biomarker identification are microarray and proteomics strategies. Gene expression profiling is an informative approach: array-based gene expression analysis can provide valuable information about IR-specific genes (McLean et al. 2017), and previous studies have used microarray-based gene expression analysis to provide a comprehensive overview of the biological effects of low-dose IR (Rosenstierne et al. 2014). Because gene expression is sensitive to environmental factors, expression profiling has been used to assess the presence of certain diseases. Such discoveries would be impossible without high-throughput technologies such as microarray platforms, mass spectrometry (MS), ChIP-on-chip, and next-generation sequencing; to make the most of these technologies, sophisticated statistical methods are needed to accurately interpret the high volume of data they generate, and a combination of the aforementioned techniques often gives better results (Reisz et al. 2014). Real-time quantitative reverse-transcription polymerase chain reaction (qRT-PCR) can accurately monitor gene-expression changes (Rezaeejam et al. 2015); for example, Brengues et al. used a quantitative nuclease protection assay on in vitro irradiated blood samples to determine changes in gene expression levels. Radiation biologists have also used gene expression profiling to find biomarkers for evaluating individual exposure doses under different exposure conditions (Paul et al. 2011; Brengues et al. 2010; Turtoi et al. 2010). One group confirmed 29 differentially expressed cell cycle genes and used them to predict low doses of radiation exposure (Lu et al. 2014), and microarray analysis has long been used to study the radiation response (Amundson et al. 1999). Moreover, Zeegers et al. (2017) used an Illumina microarray chip to determine differentially expressed genes following γ-ray exposure in human lymphocytes; they confirmed that the expression of about 70 genes changed, indicated that upregulated genes are preferable to downregulated ones for use as biomarkers, and noted that specific circulating proteins or microRNAs may be more useful as potential emerging biomarkers of radiation exposure. However, gene expression analysis is expensive and time-consuming, and further research is needed to verify its applicability for radiation dose estimation (Zeegers et al. 2017).
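
The qRT-PCR readout mentioned above is conventionally converted into a fold change with the comparative 2^(-ΔΔCt) method. A minimal sketch, assuming a single reference gene unaffected by irradiation and hypothetical cycle-threshold (Ct) values:

```python
def fold_change_ddct(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Relative expression of a target gene by the comparative 2^-ddCt method.

    Ct values are qRT-PCR cycle thresholds; lower Ct means more transcript.
    The reference gene is assumed to be unaffected by irradiation."""
    dct_treated = ct_target_treated - ct_ref_treated  # normalize to reference
    dct_control = ct_target_control - ct_ref_control
    ddct = dct_treated - dct_control
    return 2.0 ** (-ddct)  # fold change, irradiated vs control

# Hypothetical Ct values: the target amplifies two cycles earlier after
# irradiation, i.e., a 4-fold upregulation
print(fold_change_ddct(22.0, 18.0, 24.0, 18.0))  # -> 4.0
```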

Metabolomics: a powerful approach for identifying and quantifying biomarkers of IR exposure

Metabolomics refers to the profiling of metabolites in biological fluids, cells, and tissues. Because of its intrinsic sensitivity, subtle changes in biological pathways can be detected, providing insights into various physiological and abnormal processes. Work in the field of metabolomics has contributed to the broader field of systems biology (e.g., incorporating genomics and proteomics) by providing a comprehensive view of the small molecules present in cells, tissues, or biofluids (Johnson et al. 2016). Metabolomics has been proposed as a tool for high-throughput biodosimetry and the rapid assessment of exposure dose and classification (Pannkuk et al. 2017b): it can measure hundreds of molecules simultaneously (Jelonek et al. 2017) and can rapidly determine an individual’s exposure level and metabolic phenotype (Pannkuk et al. 2017a). Owing to its rapid assays, high throughput, and minimally invasive sample collection, metabolomics has also been used in recent years as a tool for discovering biomarkers of radiation exposure (Zhao et al. 2017). Golla et al. (2017) used ultra-performance liquid chromatography (LC)-electrospray ionization-quadrupole time-of-flight MS to analyze the differences in metabolic signatures between sham- and γ-ray-irradiated groups; they found that 3-methylglutarylcarnitine, a novel metabolite in urine, the liver, and serum, could potentially be an early radiation biomarker. Researchers have used 1H nuclear magnetic resonance (NMR)-based metabolomics to analyze the effects of whole-body γ-ray or proton irradiation on liver metabolism in C57BL/6 mice; different radiation sources induced different metabolic changes, and the metabolites 4-hydroxyphenylacetic acid, betaine, glutamine, choline, and trimethylamine N-oxide may be biomarkers for the prediagnosis of radiation-induced liver injury (Xiao et al. 2017). Researchers have also used biphasic liquid–liquid extraction of serum lipids and metabolites with ultra-performance LC quadrupole time-of-flight MS to analyze changes in the global nonhuman primate serum lipidome and metabolome; radiation exposure caused significant perturbations in lipid metabolism, providing information for the development of metabolomic biomarker panels for human biodosimetry (Pannkuk et al. 2016). NMR- and MS-based metabolomics have thus emerged as powerful methods for identifying and quantifying biomarkers of IR exposure in both cell culture and in vivo animal studies, including in nonhuman primates. One- and two-dimensional NMR provide structural information, making NMR a key tool for characterizing new metabolites. However, the high complexity of the metabolome and the low sensitivity of NMR spectroscopy make it difficult to apply NMR-based metabolomics as a standalone method; although NMR spectroscopy requires larger samples than other analytical techniques, it provides valuable information about IR biomarkers. Metabolomics is most effective for identifying and quantifying biomarkers of IR exposure when used in conjunction with proteomics and genomics. Moreover, MS-based metabolomics methods not only offer increased sensitivity but can also be coupled online with gas chromatography, LC, or capillary electrophoresis for analytical separation of metabolites (Reisz et al. 2014; Chen et al. 2011; Khan et al. 2011).
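
The multivariate step typical of such workflows is often partial least squares discriminant analysis (PLS-DA). As a minimal sketch (not the analysis of any study cited above; the feature matrix and group labels are simulated placeholders), PLS-DA can be run as a PLS regression against a dummy-coded class label, with candidate metabolites ranked by the magnitude of their loading weights:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

# Simulated data: 30 samples x 200 metabolite features (e.g., LC-MS peak areas)
rng = np.random.default_rng(1)
X = rng.lognormal(sigma=0.5, size=(30, 200))
y = (np.arange(30) < 15).astype(float)  # 1 = irradiated, 0 = sham
X[y == 1, :10] *= 1.8                   # 10 simulated radiation-responsive features

# Log-transform and autoscale, then fit PLS-DA (PLS regression on the label)
Xs = StandardScaler().fit_transform(np.log(X))
pls = PLSRegression(n_components=2).fit(Xs, y)

# Rank candidate biomarkers by first-component loading-weight magnitude
# (a simple stand-in for VIP scores in exploratory analyses)
weights = np.abs(pls.x_weights_[:, 0])
print("Top candidate features:", np.argsort(weights)[::-1][:10])
```

A cross-validated classifier and permutation testing would normally follow, since PLS-DA overfits easily when features far outnumber samples.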

Lipidomics: a new frontier of omics research for discovering biomarkers

Lipids are components of cells and are the most abundant metabolites in the circulation. They play a central role in signaling and metabolism, and their alterations are relevant to disease (Carter et al. 2017). Lipidomics is an emerging science that studies changes in the lipidome using high-resolution MS; it can be used to identify potential biomarkers and is a strong tool for gaining insight into the molecular mechanisms of disease. MS-based lipidomics provides an opportunity for comprehensive analysis of lipids in biological samples: it can identify and quantify large numbers of lipids and reveal lipid changes in metabolic disorders (Afshinnia et al. 2018). Researchers have used gas chromatography and high-resolution MS to obtain lipidomic profiles and lipoperoxidation biomarkers in the plasma of patients with rectal adenocarcinoma (Fernandes Messias et al. 2018), and a study of an obese mouse model of osteoarthritis identified serum and synovial fluid fatty acids as predictive biomarkers of osteoarthritis in obesity (Wu et al. 2017). Research has also shown that IR can cause changes in lipid metabolism: lipidomic profiling indicated that polyunsaturated fatty acid levels were significantly increased in the serum of nonhuman primates exposed to IR (Pannkuk et al. 2016). Lipidomics further plays an important role in understanding sepsis mechanisms and has great potential for biomarker discovery; one study sought potential sepsis biomarkers by profiling changes in the erythrocyte, whole-blood, and plasma lipidomes (Mecatti et al. 2018). Lipidomic analysis has been extensively applied to the study of systemic diseases, and research has highlighted lipidomics as a promising tool for discovering new-generation biomarkers of hyperlipidemia and cardiovascular disease (Rai and Bhatnagar 2017). Although lipidomics has grown rapidly over the past decade, there are still significant obstacles to an accurate and comprehensive evaluation of the lipidome. In the context of systems biology, lipidomics can be integrated with other omics platforms (such as proteomics and metabolomics) to comprehensively analyze biological samples.

Conclusion and perspectives

In this paper, we have introduced and summarized current knowledge on IR, covering its definition, classification, hazards, and protective measures, and reviewing recent advances in IR biomarkers. IR is ubiquitous in daily life owing to its widespread use in the medical, industrial, and scientific research fields, among others, as well as because of accidental exposure, NC, and terrorist threats. It is thus important to be aware of the hazards posed by IR and of effective measures for protecting against it.

IR can cause serious damage to organisms: it can directly destroy cells and tissues and cause significant changes in cellular components, or damage them through reactive free radicals, leading to an increased risk of cancer. Because of the risks of nuclear terrorism, radioactive accidents, improper disposal of equipment containing radioactive materials, and medical errors, we need to identify biomarkers for rapidly assessing individual radiation doses and to implement the necessary protective measures. From the studies reviewed here and many others, we have summarized methods for identifying biomarkers, including proteomics, genomics, metabolomics, and lipidomics. “Omics” approaches are the most promising methods for discovering radiation biomarkers, and researchers have used them to find valuable biomarkers. Biomarkers can be used not only to estimate radiation dose and determine the effects of radiation on health but also in clinical and medical management. Ideal biomarkers share common characteristics such as reproducibility, sensitivity, known variability, and specificity.

At present, although many countries and organizations have formulated standards and measures for radiation protection, and radiation education has been widely implemented, the hazards that radiation poses to organisms cannot be neglected. Many studies have identified potential radiation biomarkers, but no universal radiation biomarker has been established. Owing to the complexity of radiobiological effects, most potential biomarkers are dose and time dependent; in practice, it is difficult to find a single biomarker that is both sensitive and specific for a given radiation exposure. A multiparameter radiation exposure assessment method is therefore more realistic. Future radiation research could involve formulating a comprehensive method for finding radiation biomarkers and identifying the precise mechanisms by which radiation damages organisms. Future studies could combine proteomics, genomics, metabolomics, and lipidomics with knowledge of the mechanisms of radiation injury and radiation pathophysiology to identify a common, single biomarker.