Introduction

The period between conception and the first 2 years of life, known as the “first 1000 days”, is fundamental in shaping long-term child health outcomes. In particular, this period constitutes a golden opportunity to program normal brain development [1]. Neurodevelopment is a highly complex process. Although it is largely genetically preprogrammed and experience-independent, environmental factors can profoundly influence early brain development. Some of these factors are beyond our control, but nutrition is not [2, 3]. In recent decades, many studies have shown that nutrition plays a critical role: first, by providing the substances required to build the early brain structure and, second, by supporting and preserving its healthy functioning [4]. In addition, accumulating evidence suggests that diet influences gene expression through epigenetic mechanisms. These epigenetic changes, especially when they occur during early development, may underlie later-life diseases [5,6,7].

Several preclinical and clinical studies have helped to elucidate the role and mechanism of individual macro- and micronutrients in brain development [4]. While almost all nutrients are required, a subset plays a particularly significant role in a variety of critical neurodevelopmental processes across brain regions, supporting the high rate of brain metabolism during early life. These include macronutrients (e.g., protein, long-chain polyunsaturated fatty acids, and glucose) and micronutrients (e.g., iron, iodine, zinc, and vitamins) [8••]. Over time, evidence has demonstrated that, regardless of the cause, impairments in early life neurodevelopment have long-term consequences for health, education, job potential, and adult mental health [9, 10]. Considering the worldwide prevalence of early life malnutrition in all its forms (undernutrition, micronutrient deficiencies, obesity, and diet-related noncommunicable diseases) and the unacceptably high long-term economic and societal repercussions, policy makers should focus their attention on planning appropriate assessments and interventions to optimize nutrient supply during the most vulnerable period of brain development.

This review focuses on the importance of early life nutrition for optimal brain development, first by describing the key milestones of neurodevelopment and the basic principles that regulate nutrient-brain interactions, and then by presenting recent findings on the most common micronutrient deficiencies (iron, iodine, and zinc) and the associated risk of adverse neuropsychiatric consequences.

Early Brain Development

Human brain development is a protracted process that begins in the third gestational week with the differentiation of neural progenitor cells and extends at least through late adolescence, arguably throughout the lifespan [11]. By the end of the embryonic period (eighth week post conception), the rudimentary structures of the brain are established and the major compartments of the central and peripheral nervous systems are defined [11]. After the external form is established, a complex series of processes begins. Between the second and fourth months of gestation, the major proliferative events occur: neural and glial progenitors are generated and proliferate. From approximately 5 months of gestation into early postnatal life, glial cells multiply [12]. As progenitor cells proliferate and differentiate, they migrate from their sites of origin to different brain areas, where they need to make connections with other neurons. The peak period for this migration is the third to fifth months of gestation [12]. Once the cells have reached their target region of the brain, in order to become integrated into neural networks, they must develop neural processes (dendritic and axonal ramifications) that allow them to communicate through synaptic contacts. These organizational events, which occur from approximately the fifth month of gestation to several years after birth, are crucial for establishing the intricate circuitry that characterizes the human brain. The efficiency of information transmission along these pathways is greatly enhanced by myelin, which ensheathes the axons. Myelination in humans is a long process: it is rapid and dramatic from the second trimester of pregnancy through the first 2 years of postnatal life and continues into adult life [11].

Principles of Nutrient Effects on Brain Development

In the first 1000 days of life, the brain undergoes extraordinary growth, increasing in size, gradually differentiating into a highly specialized organ, and slowly losing plasticity. The rate of growth in this period is the highest of the entire lifespan. In general, the higher the rate of an organ’s growth, the greater its risk of being damaged by an insufficient supply of nutrients. This makes the developing brain highly susceptible to damage [13]. The brain is a heterogeneous organ. It comprises distinct anatomical regions (e.g., the hippocampus, cortex, and striatum) and processes (e.g., myelination, neurotransmission), each with a unique developmental trajectory and set of nutrient requirements [13]. Many of these regions and processes have developmental trajectories that begin and accelerate in fetal life or shortly after birth. Furthermore, every region and every process has two crucial windows: the critical period and the sensitive period. The border between the two is blurred; however, they can conceptually be defined as follows: the former is an early life epoch in which insults lead to irreversible long-term consequences; the latter represents a broader epoch during which the brain is more susceptible to environmental factors, such as nutrient deficiencies, but the effects are not inevitably permanent [14, 15]. As such, there are a series of nutrient- and tissue-specific critical periods throughout development, and the impact of a nutrient deficiency on the developing brain is determined by two factors: the timing of the deficiency and the specific region’s requirement for that nutrient at that time [16••]. Failure to construct a brain region during its critical period can lead to permanent consequences, such as residual structural defects [17], persistent neurochemical and electrophysiological abnormalities, and altered gene expression [18,19,20, 21•]. These mechanisms potentially explain the biological basis of the long-term effects of early life nutritional perturbations (Fig. 1). Thus, ensuring adequate nutrient intake is necessary to allow time-coordinated brain development and to create an integrated, healthy, working brain.

Fig. 1

Interactions between genes and environment shape human development. Early life nutrition (prenatal and neonatal) represents a fundamental environmental factor that can profoundly affect brain development. Early life malnutrition during critical periods of neurodevelopment can create an altered brain structure and induce altered patterns of epigenetic markers, leading to short- and long-term adverse health consequences with respect to cognitive, social, emotional, neurological, and psychiatric functioning

Epigenetics

The first epidemiological evidence of a link between inadequate early nutrition and disease emerged from cohort studies examining the health status of offspring whose mothers were pregnant during periods of severe famine, such as the Dutch Hunger Winter. These studies demonstrated that individuals prenatally exposed to severe famine were more likely to have coronary heart disease, an atherogenic lipid profile, disturbed blood coagulation, increased stress responsiveness, obesity, and glucose intolerance during adulthood [7]. They also had an increased risk of neuropsychiatric diseases, such as schizophrenia and affective disorders [22,23,24]. In the last few years, growing attention has been paid to epigenetics. The emerging hypothesis is that part of these adverse health consequences could be mediated by epigenetic changes induced by early life environmental risk factors such as inadequate or inappropriate nutrition. There is convincing evidence that early malnutrition has an impact on the genome, with long-term consequences for health outcomes. Through DNA methylation, histone modifications, and noncoding microRNAs, epigenetic mechanisms modulate the intensity and timing of gene expression throughout the entire life course, potentially leading to later-life behavioral consequences and diseases [6]. Several animal studies have demonstrated that prenatal exposure to stressful events (including malnutrition) induces lasting epigenetic changes in the brain, which have been linked to changes in brain gene expression, stress reactivity, and behavior [25]. In humans, by contrast, it is very challenging to establish this epigenetic link, for several reasons: (i) humans are exposed to a mixture of environmental factors that can have confounding effects; (ii) brain tissue is inaccessible in living humans. Although some studies have shown epigenetic changes in neurodevelopment-related genes, it remains unclear whether these alterations are the actual cause of neurodevelopmental disorders [26, 27]. Many human diseases result from an interaction between genetic and environmental factors. Understanding the mechanisms through which diet affects the epigenome, and how these epigenetic changes are involved in determining neuropsychological phenotypes, could offer an opportunity for preventing or treating some mental illnesses [28]. Thus, more and better-designed studies are needed to deepen our knowledge of the relationship between nutrition, epigenetics, and neurodevelopment.

Further Research

All these considerations offer the opportunity not only to focus on providing children with adequate nutrition to promote normal brain development but also to consider nutrition as a powerful instrument for optimizing cognitive, social, emotional, and behavioral development and health outcomes. Yet the literature on this vast topic is frequently contradictory, for several reasons: critical and sensitive periods are challenging to define and are often used interchangeably; the timing-dose-duration principle of supplementation is not always clearly definable; and connections with neurobehavioral outcomes are not always assessed with the most appropriate tests at the correct assessment age. Studies often differ greatly from one another (e.g., in terms of the age and population examined, duration and doses of supplementation, assessment tests, general health status, and sociocultural habits), which explains why meta-analyses are not the best tool for assessing nutrient-brain relationships. In the attempt to increase statistical power by enlarging the sample size, nutrient-specific nuances are lost and the type II error increases, inevitably leading to inconclusive results [29•]. As a result, definitively establishing or excluding nutrient-brain connections remains problematic [30••]. Further studies should take these factors into account, aiming to reduce confounding variables in order to obtain more definitive indications.

Nutrients That Influence Neurodevelopment

During pregnancy, macro- and micronutrient requirements increase. An adequate diet is key to avoiding adverse health consequences for the developing fetus. Specifically, the 2014 Italian RDAs indicate an additional requirement of 69 kcal/day in the first trimester, 266 kcal/day in the second trimester, and 496 kcal/day in the third trimester of pregnancy (for a grand total of an additional 76,530 kcal). Very similar amounts have been established by EFSA (70 kcal/day in the first trimester, and 260 and 500 kcal/day in the second and third trimesters, respectively), with an increase of about 500 kcal/day during the first 6 months of exclusive breastfeeding [31]. A comprehensive review of macronutrient requirements is beyond the scope of this article. The following sections will focus on the most common micronutrient deficiencies. In Table 1, the recommended intakes for the most common micronutrients are listed [32, 33].
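As a rough arithmetic check of the quoted grand total (assuming, purely for illustration, three trimesters of about 92 days each; this duration is our assumption and is not stated in the Italian recommendations), the daily increments sum to approximately the stated figure:

$$(69 + 266 + 496)\ \text{kcal/day} \times 92\ \text{days per trimester} = 831 \times 92 \approx 76{,}450\ \text{kcal},$$

which is in line with the additional 76,530 kcal quoted for the whole pregnancy.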

Table 1 This table presents recommended dietary allowances (RDAs) in bold type and adequate intakes (AIs) in italic type. An RDA is the average daily dietary intake level sufficient to meet the nutrient requirements of nearly all (97–98%) healthy individuals in a group; it is calculated from an estimated average requirement (EAR). If sufficient scientific evidence is not available to establish an EAR, and thus to calculate an RDA, an AI is usually developed. For healthy breastfed infants, the AI is the mean intake. The AI for other life stages and gender groups is believed to cover the needs of all healthy individuals in those groups, but a lack of data, or uncertainty in the data, prevents specifying with confidence the percentage of individuals covered by this intake
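For readers unfamiliar with how an RDA is derived from an EAR, the standard dietary reference intake convention (a general convention, not a value taken from Table 1) sets the RDA two standard deviations above the average requirement and, when the variability of the requirement is unknown, assumes a coefficient of variation of 10%:

$$\mathrm{RDA} = \mathrm{EAR} + 2\,\mathrm{SD}_{\mathrm{EAR}} \approx 1.2 \times \mathrm{EAR}.$$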

Iron

Iron is an essential micronutrient involved in many biological processes, including neurodevelopment. It is necessary for normal anatomic brain development [34,35,36], myelination, and neurotransmission [37]. Iron deficiency is the most common micronutrient deficiency worldwide: approximately 2 billion people are estimated to be affected, including roughly half of all preschool-aged children and pregnant women [38]. While during adolescence this condition is frequently reversible without consequences, early life iron deficiency can have long-lasting and potentially permanent effects, resulting in later-life neurocognitive and behavioral disorders [39]. Many animal and human studies have demonstrated a crucial role of iron in brain development [37]. Overall, it is well accepted that prevention of iron deficiency is preferable to treatment [40]. A set of randomized clinical trials conducted in Nepal demonstrated the known positive effect of prenatal iron on hippocampal and striatal development: school-aged children whose mothers were supplemented with iron/folic acid during gestation and early neonatal life had better neurocognitive performance (working memory, inhibitory control, and fine motor functioning) than those born to unsupplemented mothers [41]. Furthermore, supplementation between 12 and 36 months of life neither conferred additional benefits on general neurointellectual functioning [42] nor produced neurocognitive catch-up in children who had not been supplemented during gestation [43]. A Vietnamese study demonstrated that preconception supplementation with iron and folic acid, compared with folic acid alone, improved linear growth and fine motor development at 2 years of age [44]. A Chinese study showed that infants who received iron supplementation in infancy (between 6 weeks and 9 months) exhibited better gross motor scores at 9 months than children who did not receive iron [45]. In addition, infants born iron-deficient exhibited slower recognition of their mothers’ voice at 2 months of age, as measured by event-related potentials, than children born iron-sufficient, in line with the effect of iron deficiency on hippocampal development [46]. Formerly iron-deficient infants have been shown to have significantly slower reaction times and poorer inhibitory control 8 to 9 years after supplementation [47], as well as different patterns of functional brain connectivity [48]. A 10-year follow-up Chilean study found that mistimed or excessive iron might lead to worse neurodevelopmental outcomes at 10 years [49]. In another trial conducted in Nepal, delayed cord clamping (≥ 180 s after delivery) reduced anemia at 8 and 12 months of age in a high-risk population, which may have major positive effects on infants’ health and development [50]. Longitudinal cohort studies showed that infants who experienced iron deficiency were more likely to have cognitive and socioemotional impairments throughout infancy, childhood, and adolescence, with slower perceptual speed, poorer understanding of quantitative concepts, poorer spatial memory [51], impaired language abilities [52], and worse recognition memory [53]. Furthermore, at 25 years, a higher proportion of the group with chronic iron deficiency had not completed secondary school, were single, and reported poorer mental health and more negative emotions [54].
Moreover, in retrospective population-based cohorts, maternal iron deficiency was associated with an increased risk of schizophrenia spectrum disorders among offspring [55], and low iron intake was associated with an increased risk of autism spectrum disorders [56]. Collectively, these studies confirm the timing-dose-duration principle explained above [30••]. In addition, they are consistent with long-term consequences of early iron deficiency for neurodevelopmental processes and for susceptibility to neuropsychiatric disease. Lastly, new insights have been provided into the mechanisms through which iron deficiency could affect neurocognitive performance; among these, epigenetics seems to play a role. A study in pigs, albeit with a relatively small sample size, demonstrated that neonatal iron deficiency led to altered hippocampal DNA methylation and gene regulation, with potential effects on neurodevelopment mediated by increased hypoxia-induced angiogenesis and increased blood-brain barrier permeability [57]. Another study, conducted in mice, showed that chronic iron deficiency during development alters the adult hippocampal transcriptome and that restoring iron status during a known critical period of hippocampal neurodevelopment only incompletely normalizes these changes [21•]. Further detailed studies need to be designed to better understand the implications of these biological processes.

Iodine

Gestation involves significant changes in maternal thyroid function. Even though the fetal thyroid starts to produce hormones around the 18th–20th week of gestation, the mother remains the major source of thyroxine. Since the earliest epidemiological studies, an association between severe iodine deficiency in pregnant women and fetal neurological damage has been demonstrated [58, 59]. Thyroid hormones are involved, directly and indirectly, in essentially all key neurodevelopmental processes [60, 61], and both neurons and glial cells (astrocytes and oligodendrocytes) are richly endowed with thyroid hormone receptors. A recent review by Velasco et al. comprehensively enumerates the trials, meta-analyses, and reviews that document which brain areas are affected by iodine deficiency and the associated cognitive and neurodevelopmental consequences [62••]. In the last two decades, increasing evidence has shown that adverse neurological outcomes are linked not only to maternal hypothyroidism but also to a condition known as hypothyroxinemia [63], defined as a free thyroxine value below the 2.5th percentile with a thyrotropin level within the reference range. Hypothyroxinemia is not restricted to rural areas with insufficient iodine intake; evidence has shown that it also occurs in iodine-sufficient regions [64,65,66]. The phenotypes related to iodine deficiency have thus evolved from goiter and severe mental disability to a new clinical spectrum of neuropsychological disorders associated with maternal hypothyroxinemia. In histological studies of experimental rodents, gestational thyroid hormone deficiency was found to limit dendritic and axonal growth and to alter neuronal location, synaptic function, histogenesis, and the cytoarchitecture of the cerebral cortex; as such, hypothyroxinemia causes disrupted neocortical layering [67, 68]. Cognitive deficits and poor psychomotor development have been shown in the progeny of mothers who were hypothyroxinemic during the first half of gestation [67]. In two observational studies conducted in the UK and Australia, children of mothers with a urinary iodine concentration (UIC) < 150 μg/g creatinine were more likely to score within the lowest quartile for verbal IQ, reading accuracy, and reading comprehension at age 8–9 years [69] and had lower educational outcomes at age 9 years [65]. These findings were confirmed in another population-based prospective cohort, in which a UIC below ~ 100 μg/L was associated with lower infant language skills up to 18 months of age [70]. A prospective study from the Netherlands reported that low maternal UIC during pregnancy (< 10th percentile) was associated with impaired executive function in children at age 4 years [71]. A large Norwegian population-based prospective observational study concluded that suboptimal maternal iodine intake (below the estimated average requirement of 160 μg/day) during pregnancy was associated with symptoms of child language delay, behavior problems, and reduced fine motor skills at 3 years of age; surprisingly, no beneficial effect was associated with iodine supplementation during pregnancy [66]. A recent RCT in India and Thailand found no benefit of maternal iodine supplementation on cognition at 5–6 years of age in children born to mothers with mild-to-moderate deficiency; however, it should be noted that the Indian women recruited were actually iodine-sufficient and that both countries adhere to iodized salt programs [72].
In recent years, some evidence has emerged, though not consistently, that offspring of mothers with abnormal serum thyroid hormone concentrations during early pregnancy may be at increased risk of attentional problems, such as attention-deficit/hyperactivity disorder (ADHD). A Norwegian study found that low maternal iodine intake (< 200 μg/day) was associated with increased child ADHD symptom scores, but not with an ADHD diagnosis, and iodine supplementation did not reduce the risk [73]. Additionally, Román et al. [74] found a consistent association between severe maternal hypothyroxinemia in early gestation and autistic symptoms in offspring. The possibility of preventive interventions must be further investigated.

Zinc

Zinc is an essential trace mineral for all forms of life because of its universal role in basic cellular functions. Inadequate zinc intake is common, especially in individuals and populations whose regular diets do not include readily bioavailable zinc sources or rely heavily on cereals (rich in inhibitors of zinc absorption) [75]. Gestation and late infancy are periods of increased risk of zinc deficiency. Many preclinical studies have demonstrated its key role in neurodevelopmental processes (such as neurogenesis, neuronal migration, synaptogenesis, and myelination) and in the modulation of intra- and intercellular signaling (e.g., in GABAergic neurons) [75, 76]. While it is well established that severe deficiencies cause serious structural brain malformations [77], less is known about the effects of mild-to-moderate deficiency on sensorimotor and cognitive development. An Indian double-blind randomized controlled trial (RCT) showed that zinc supplementation until 3 months’ corrected age in preterm breastfed infants significantly improved alertness and attention patterns and decreased signs of hyperexcitability and abnormal patellar and bicipital reflexes [78]. Among 8–18-month-old infant adoptees from three global regions (post-Soviet states, Ethiopia, and China), zinc deficiency was the second most common micronutrient deficiency and was associated with compromised memory functioning [79]. In 2002, a study investigated the effects of iron-folic acid and/or zinc supplementation on the results of the Fagan Test of Infant Intelligence and the A-not-B task of executive functioning among 367 Nepali infants living in Sarlahi District; neither the combined nor the individual micronutrient supplements improved performance on the five indicators of information processing [80]. A recent double-blind trial conducted in Tanzania randomized infants at 6 weeks of age to zinc, multivitamins, zinc plus multivitamins, or placebo. At approximately 15 months, a subsample of children underwent developmental assessment using the cognitive, language (receptive and expressive), and motor (fine and gross) scales of the Bayley Scales of Infant and Toddler Development, Third Edition; neither daily zinc nor multivitamin (B-complex vitamins, C, and E) supplementation led to improvements in any of the developmental domains assessed [81]. A double-blind RCT in Peruvian infants aged 6–18 months examined the effects of preventing zinc deficiency on cognitive and sensorimotor development during infancy; a set of assessments measuring cognitive development (attention, memory, and learning) and developmental status found zinc supplementation to support normative neurodevelopment in the first 2 years of life [82]. Intriguingly, the incidence of zinc deficiency at a very young age has been reported to be significantly higher among individuals with autism spectrum disorders than in age-matched healthy controls [83, 84]. In addition, two small case-control studies showed a correlation between low zinc levels and ADHD [85]. Further studies are needed to clarify which interventions are beneficial in terms of neurocognitive outcomes.

Conclusions

The studies reviewed here suggest that the early life environment, particularly the fetal and early postnatal environment, influences later-life health outcomes and disease risks across multiple organ systems. Among all environmental factors, nutrition plays a fundamental role. From conception to approximately 3 years of age, the basis for lifespan brain function is established. Adequately nourished children are more likely to reach their developmental potential in cognitive, motor, and socioemotional abilities, with positive societal repercussions. Restricted development of these skills during early life increases the risk of later neuropsychological problems, psychiatric illnesses, poor school achievement, early school dropout, low-skilled employment, and poor care of future children, thus contributing to the intergenerational transmission of poverty.

Epigenetics seems to play a leading role, explaining, at least in part, how early life stimuli can have such long-term consequences. Despite recent advances in technology, our knowledge of nutritional epigenetics is still limited. Further studies are needed to better understand how nutrition can be used to maintain health and prevent disease through modifiable epigenetic mechanisms.

In order to draw more evidence-based, relevant, and ready-to-use conclusions on nutrition, data collection and research must be conducted more regularly and more rigorously, with a greater focus on national and regional idiosyncrasies. Disaggregation of data by wealth, age, gender, disability, and geography could lead to the establishment of more specific standards for the diagnosis, treatment, and prevention of all forms of malnutrition. Greater awareness of how nutritional status may vary within households, even in the same region, could support the design of policies to tackle the long-term consequences of early life malnutrition.

Prevention strategies should focus on ensuring more high-quality food for preconceptional, pregnant, and lactating women and for children in early life, not only in areas where malnutrition is common but also in developed countries.

Lastly, although it is premature to propose diet as a therapy for mental illnesses and more studies are needed to deepen our knowledge, optimizing nutrition could potentially help prevent or attenuate some mental disorders.