During middle childhood, defined by psychologists as the period between 6 and 12 years of age (Collins 1984), children develop the emotional, cognitive, and social skills necessary to become educable members of society (Lancy and Grove 2011). The onset of middle childhood is associated with the social, emotional, and intellectual changes referred to as the “five to seven transition” by Piaget (1963), and its end by the dramatic physical and behavioral changes of puberty (Bogin 1999). In our own culture, with its emphasis on formal education, we link this transition to the onset of elementary school and formal learning. However, Sheldon White (1996) notes that in many other cultures, the onset of middle childhood is marked by a more informal process—the ability of children to successfully carry out tasks outside the home. White (1996) refers to the age of eight as the beginning of the “age of reason and responsibility” because of the social expectation that children can now operate semi-independently from their parents—or, as Lancy and Grove (2011) put it, because children are capable of “making sense.”

From a biological point of view, middle childhood appears to be closely related to the primate juvenile stage between infancy and adolescence in which individuals are no longer dependent on their mother for food but are not yet reproductively mature (see Thompson and Nelson 2011). The onset of juvenility in humans is defined by a set of somatic events that can occur anywhere between the ages of five and seven. These include the eruption of the first permanent molars (Blankenstein et al. 1990; Moslemi 2004), the development of adult locomotory efficiency (Kramer 1998), a rebound in adiposity (Hochberg 2008, 2009), the near completion of growth in brain volume (Caviness et al. 1996), the onset of cortical maturation in the brain (Gogtay et al. 2004; Shaw et al. 2008), and the development of axillary hair and odor (Kaplowitz et al. 1986; Leung and Robson 2008).

Around the same time, adrenal production of dehydroepiandrosterone (DHEA) and its sulfate (DHEAS) becomes apparent in circulation (Remer et al. 2005), making adrenarche at the very least an endocrinological marker of the juvenile transition. Moreover, both DHEA and DHEAS (hereafter DHEA/S as a general term, since the two hormones are interconvertible; DHEA or DHEAS is used for specific results based on one compound or the other) have been linked to aspects of glucose metabolism (Perrini et al. 2004; Yamashita et al. 2005), neural function and development (see Pérez-Neri et al. 2008 and Maninger et al. 2009 for recent reviews), and the development of hair and sebaceous glands (Irmak et al. 2004; Stewart et al. 1992), suggesting that DHEA/S may play a direct role in at least some of the biological processes of middle childhood.

The parallel timing of increases in DHEA/S (Orentreich et al. 1984; Remer et al. 2005), cortical brain maturation (Gogtay et al. 2004; Shaw et al. 2008), and the development of bodily control (Reilly et al. 2008), social competency (White 1996), and cognition from age six to twenty-five is too striking to dismiss as simple coincidence. The key role of cortisol, another adrenal steroid, in regulating the energy supply to the brain (Peters et al. 2004) suggests that increasing production of DHEA/S, with its known antiglucocorticoid properties (Muller et al. 2006; Pélissier et al. 2004, 2006; Yildirim et al. 2003), may provide neuroprotection to the developing brain throughout middle childhood, when glucose utilization is substantially elevated (Chugani 1998). Nonetheless, despite demonstrated behavioral effects of DHEA/S in rodents (e.g., Mizuno et al. 2006; Navar et al. 2006), clear evidence for effects of DHEA/S on behavior in children is almost entirely limited to clinical conditions (Dorn et al. 2008; Strous et al. 2001; Van Goozen et al. 1998, 2000). As a result, we know relatively little about the role of DHEA/S in normal human development.

Pioneering arguments about the role of human adrenarche in the maturation of sexual behavior were based on the timing of initial sexual attraction and included little physiological or neurological evidence (e.g., Herdt and McClintock 2000). More recently, Del Giudice and colleagues (Del Giudice 2009; Del Giudice and Belsky 2010; Del Giudice et al. 2009) have presented a substantially updated version of Herdt and McClintock’s argument, suggesting that adrenarche represents a developmental switch from infant attachment to reproductive attachment and as such marks a major change in children’s behavior. Furthermore, they argue that, as a precursor to sex steroids, DHEA/S plays an important role in activating genes associated with reproductive behavior, leading to sexual differentiation of behavior, including increased aggression in males.

Del Giudice and colleagues provide a compelling argument for middle childhood as a distinctive human developmental stage marked by the emergence of new behavior patterns, and for a role of DHEA/S in sex differences in these new behaviors. However, they do not consider the potential role of DHEA/S in somatic changes during middle childhood, nor do they provide specific physiological mechanisms by which DHEA/S leads to both behavioral and somatic changes beyond acting as a precursor to sex steroids. More important, they do not address why DHEA/S levels continue to increase after the onset of puberty and into young adulthood.

In this paper I extend my earlier work (Campbell 2006) to develop a complementary argument focusing on the physiological mechanisms by which DHEA/S may serve to integrate human somatic, brain, and behavioral development beginning with middle childhood and continuing to young adulthood. I suggest three different mechanisms by which DHEA/S may play a role in coordinating development during this period: (1) DHEA/S may play a role in the allocation of glucose away from the brain and toward the development of both muscle and adipose tissue that will support the onset of puberty; (2) within the brain, DHEA/S’s neuroprotective effects (Li et al. 2009) may help to maintain synaptic plasticity in late-developing parts of the cortex, even as other parts of the brain undergo maturation (Gogtay et al. 2004; Shaw et al. 2008) and lose plasticity; and (3) DHEA/S may promote the development of body odor (Kaplowitz et al. 1986) as a social signal of the emotional, cognitive, and social changes associated with middle childhood.

Before continuing, I would like to present three caveats. First and foremost, my argument is a series of speculations about the impact of DHEA/S on the brain and the body, often based on cellular and physiological processes. However, I have tried to make the links in the argument as concrete as possible so that each one is open to testing and potential falsification. Second, I have tried to limit myself as much as possible to endocrinological findings from human subjects because they are clearly the most relevant to the study of our own species. Cellular-level evidence, however, is thought to be less variable across species and hence more broadly relevant. Third, I consider the mechanisms presented here to offer a perspective on adrenarche that is separate from, and complementary to, that of Del Giudice and colleagues.

Somatic Changes Associated with Adrenarche

As previously mentioned, the onset of adrenarche around the age of seven appears to coincide with a suite of somatic developments, all of which can be related to increasing self-sufficiency. These include the eruption of the first permanent molar (Blankenstein et al. 1990; Moslemi 2004; Wedl et al. 2005), which allows for a wider diet; the development of an adultlike gait (Vaughn et al. 2003), which enables greater mobility for foraging; the adiposity rebound (Hochberg 2008, 2009), which signals greater energy storage; and the near completion of growth in brain volume (Caviness et al. 1996), which signals changes in brain development.

Closer inspection of each of these developmental processes suggests a cascade of events across the onset of middle childhood that varies among individuals. The earliest is the eruption of permanent teeth, which can begin as early as age five in some individuals (Blankenstein et al. 1990); the adiposity rebound occurs on average at five and a half years, well before age seven (Hochberg 2008, 2009); adultlike motor control develops around the age of seven (Reilly et al. 2008); and brain growth is complete at approximately eight years of age (Caviness et al. 1996). Thus, rather than a discrete event, the onset of the somatic changes defining middle childhood appears to be a process, similar to Piaget’s five to seven transition but distributed across several functional systems.

Similarly, even though adrenarche is sometimes discussed in the clinical literature as if it occurred at seven years of age for all individuals, the onset of the development of the adrenal zona reticularis, the source of almost all circulating DHEA/S, ranges from three to eight years of age (Dhom 1973). Recent results suggest that while DHEA/S is produced as early as age three, appreciable levels of DHEA/S do not reach general circulation until around the age of seven or eight (Remer et al. 2005). Given the degree of individual variability in adrenarche, its relationship to other markers of the juvenile transition is unclear, to say the least.

Some evidence is available for the relationship between adrenarche and changes in body composition during middle childhood. Based on longitudinal results from twenty children (ten girls, ten boys), Remer and Manz (1999) and Remer (2000) report that across individual children, the maximal one-year increase in DHEAS is associated with the maximal one-year increase in BMI. These results suggest that the expression of adrenarche is associated with a shift in metabolism, evident in terms of body composition. The specific mechanism(s) linking adrenarche and body composition remain to be clarified, but they may reflect an interaction of DHEA and leptin associated with adipose metabolism (Nawata et al. 2010), as well as the development of the adrenal cortex as part of a coordinated sympathoadrenal unit that, among other things, regulates glucose metabolism (Goldstein and Kopin 2008).

On the other hand, the mid-childhood growth spurt, a transient acceleration of growth in height between the ages of six and a half and eight and a half years (Molinari et al. 1980), does not appear to be associated with adrenal androgens as once thought (Mühl et al. 1992). In normal prepubertal children, DHEA is related to the development of bone diameter (Remer et al. 2003, 2009; but see Remer et al. 2004) but has little effect on skeletal growth (Reinehr et al. 2006). Furthermore, adrenarche occurs approximately a year after the mid-childhood spurt (Remer and Manz 2001), ruling out increases in DHEA/S as a cause of the spurt.

Neither of the two other important markers of middle childhood, dental eruption patterns and the timing of brain growth, has been related to adrenarche, though to the best of my knowledge an association has not been formally tested. It seems unlikely that dental eruption patterns would be affected by DHEA/S. On the other hand, DHEA/S could act directly on the brain itself or indirectly by biasing glucose metabolism in the rest of the body. But before I can present evidence for these potential neurological mechanisms, it is important to outline how the production of DHEA/S at adrenarche may play a broader role in the ontogeny of glucose metabolism.

Adrenarche and the Development of the Adrenal Response

As previously mentioned, adrenarche is defined endocrinologically as the onset of production of adrenal androgens, most prominently dehydroepiandrosterone (DHEA) and its sulfate (DHEAS), under the stimulation of adrenocorticotropic hormone (ACTH) from the pituitary—hence the name adrenarche (Nakamura et al. 2009). Though the term implies a specific event, there does not appear to be a specific trigger for adrenarche. ACTH, which stimulates the production of cortisol as well as DHEA/S, does not increase at adrenarche. Instead, increasing levels of DHEA/S are associated with the growth of the zona reticularis layer of the adrenal gland, the only area within the adrenal cortex capable of producing DHEA/S (Auchus and Rainey 2004). In contrast, cortisol, produced in the zona fasciculata of the adrenal cortex, actually appears to show a slight decrease from four to eight years of age (Wudy et al. 2007), about the same time that the zona reticularis begins to develop (Dhom 1973).

Anatomically, the development of the zona reticularis in humans has long been known to begin around the age of three years (Dhom 1973). Based on a cross-sectional sample of adrenal glands taken at autopsy, Dhom (1973) found that cells of the zona reticularis are present in some children from the age of three years on, but not before. By the age of eight, all of the adrenal glands examined showed some degree of zona reticularis development, though it was not until the age of fifteen that the zona reticularis was continuously present in all of those examined. Given that adrenal androgen production continues past the age of fifteen, it has been suggested that further increases may reflect growth in the depth of the zona reticularis (Remer et al. 2005).

It is important to keep in mind that Dhom’s results are cross-sectional; in other words, the presence or absence of zona reticularis cells reflects individual variation in the onset of zona reticularis development. Thus, in some children the onset of zona reticularis development occurs at the age of three, while in others it occurs around the age of eight. This remarkable degree of variation in such an important developmental process deserves further attention.

The timing of increases in circulating levels of DHEA/S is consistent with the timing of zona reticularis development. Adrenal androgens start to increase around three years of age in both clinical (Palmert et al. 2001) and normal samples of children (Remer and Manz 1999; Remer et al. 2005), coincident with the earliest appearance of the zona reticularis (Dhom 1973). The major urinary metabolites of DHEA/S all show clear increases between the ages of three and seven years, indicating DHEA/S production (Remer et al. 2005). However, DHEA/S itself is not clearly evident in urine until the age of seven or eight years, consistent with the emergence of the zona reticularis in all children at that time.

Both DHEA and DHEAS are converted to metabolically active forms within target tissues (Labrie et al. 1998). Thus the difference between urinary levels of DHEA and its major metabolites before the age of seven is important because it suggests that DHEA produced by the zona reticularis is being metabolized within target tissues. Furthermore, the low levels of DHEA in circulation prior to the age of seven suggest that the tissues metabolizing DHEA may lie relatively close to the site of its original production. In other words, during the very early stages of the development of the zona reticularis, any DHEA/S produced may be taken up by organs in the abdomen that are in close proximity to the adrenal cortex. These include the liver and kidney (Anzai et al. 2006), both of which express organic anion transporters (OATs) capable of transferring DHEA across the cell membrane.

It is the adrenal medulla, the core of the adrenal gland, that is in the closest proximity to the zona reticularis, the inner layer of the adrenal cortex. In fact, zona reticularis cells actually physically intermingle with adrenomedullary cells (Ehrhart-Bornstein et al. 1994). Developmentally, the emergence of the zona reticularis reflects the breakdown of the medullary capsule that separates the adrenal cortex from the medulla and its effective replacement with the zona reticularis (Dhom 1973).

Such intimate contact between the zona reticularis and the adrenal medulla allows DHEA/S to directly affect the production of catecholamines (adrenaline and noradrenaline) by chromaffin cells within the adrenal medulla (Charalampopoulos et al. 2004, 2005; Krug et al. 2009; Sicard et al. 2007). DHEAS favors the growth and development of neuroendocrine- (i.e., noradrenaline-) producing chromaffin cells (Krug et al. 2009; Ziegler et al. 2008). In addition, DHEA/S has been shown to increase the production of both dopamine and noradrenaline in an in vitro model using rat PC12 chromaffin cells (Charalampopoulos et al. 2004), with the peak occurring within ten minutes of DHEA/S administration.

The significance of DHEA/S action on catecholamine production in humans is supported by a recent meta-analysis of adrenocortical, adrenomedullary, and sympathoneural responses to a variety of stressors, indicating that the adrenaline response was more closely related to the ACTH response than to the noradrenaline response (Goldstein and Kopin 2008). The impact of increasing DHEA/S during childhood development on catecholamine production by chromaffin cells is reflected in a clear association between adrenarche and the differential production of adrenaline and noradrenaline. For instance, Utriainen et al. (2009) demonstrate higher levels of serum noradrenaline, but not adrenaline, in a sample of 73 Finnish children with signs of premature adrenarche compared with 98 age- and sex-matched controls. In addition, Weise et al. (2002) report an inverse relationship between levels of DHEAS and adrenaline in a sample of 80 boys and girls across the ages of five to seventeen. Unfortunately, they do not report on noradrenaline, which the previous results suggest would be positively related to DHEAS levels.

Whether changes in baseline noradrenaline/adrenaline ratios have implications for behavior during middle childhood is not clear. Both adrenaline and noradrenaline are released along with cortisol in response to stress, promoting glucose delivery to the brain (the sympathoadrenal response; Hitze et al. 2010). In addition, both noradrenaline and cortisol are required for the production of habitual behavior in response to stress (Schwabe et al. 2010). An increased capacity for adrenal noradrenaline production during middle childhood, resulting from increasing DHEA/S levels, may thus favor the expression of habitual behavior in the face of stress. Habitual behavior may be adaptive under stress if it acts to preserve prior successful responses.

Adrenarche and Somatic Glucose Metabolism

Moving beyond its effects on catecholamine production, DHEA/S may have more direct effects on glucose metabolism. It is well known that cortisol acts on the liver to mobilize glycogen stores, increasing circulating glucose, a crucial source of energy for the brain (Fehm et al. 2006; Peters et al. 2004). DHEA/S appears to have the opposite effect. Not only has DHEA/S been shown to reduce hyperglycemia in genetically obese db/db mice (Aoki et al. 2004), it also appears to suppress enzymes involved in the production of glucose within the liver (Aoki et al. 2000). In a human hepatic cell line, DHEA administration also reduced gluconeogenic enzymes (Yamashita et al. 2005), suggesting that the results in mice are applicable to humans. Taken together, these findings suggest that increasing levels of DHEA/S associated with adrenarche may increasingly inhibit the production of glucose by the liver in response to hypoglycemia.

In addition to its effects on glucose production by the liver, DHEA/S may have an important impact in increasing glucose uptake by abdominal adipose tissue. Administration of DHEA stimulates the uptake of glucose in human and murine adipocytes in vitro by increasing the presence of the glucose transporters GLUT1 and GLUT4 at the cell membrane (Perrini et al. 2004). On the basis of mRNA expression, Valle et al. (2006) demonstrate that human adipose tissue is capable of taking up DHEAS using specific OAT and OATP (organic anion-transporting polypeptide) transporters, and that it expresses the enzymes, including steroid sulfatase (STS), necessary to metabolize DHEAS within the cell into DHEA, thought to be the active form. DHEA has also been shown to inhibit the glucose-stimulated release of insulin from human beta cells at physiological doses (Liu et al. 2006), potentially decreasing the uptake of glucose by muscle and increasing its uptake by adipose cells.

Along with promoting glucose uptake in adipose tissue, DHEAS may play a role in regulating adipose tissue through an interaction with leptin. Leptin has been shown to promote DHEA production through its effects on steroidogenic enzymes in adrenal cells (Biason-Lauber et al. 2000). Furthermore, Machinal-Quélin et al. (2002) report a substantial increase in leptin production by subcutaneous adipose tissue as the result of DHEA administration in women, but not men.

Developmentally, the association of DHEA/S with glucose metabolism and leptin provides a potential mechanism underlying the relatively tight association between longitudinal changes in adrenarche and BMI (Remer 2000; Remer and Manz 1999), as discussed previously. Increasing levels of adipose tissue associated with the adiposity rebound would produce leptin, stimulating DHEA/S production by the adrenal gland. In turn, increased DHEA/S would promote the uptake of glucose by adipose tissue, leading to additional leptin production.

At the same time, the effects of increasing circulating DHEA/S on adipose tissue may be magnified by the role of DHEA/S in promoting noradrenaline production over that of adrenaline. Both adrenaline and noradrenaline promote glucose metabolism in the liver, but adrenaline is much more effective at raising blood glucose levels than noradrenaline (Pernet et al. 1984). In addition, during exercise, adrenaline acts to break down adipose tissue, whereas noradrenaline does not (de Glisezinski et al. 2009). Together, decreased blood glucose and decreased lipolysis would favor the storage of glucose in adipose tissue over glucose delivery to the brain.

In addition to providing a clear mechanism linking the adiposity rebound and the onset of adrenarche, an impact of DHEA/S on glucose production and adipose tissue would help to explain a more gradual change in the allocation of glucose to the brain. The adrenal gland, liver, and abdominal fat all lie relatively close to one another in the abdomen, whereas the brain is at some distance (albeit connected by the circulatory system). Thus, decreased hepatic glucose production combined with increased glucose uptake by adipose tissue would leave less glucose available to the brain. In addition, the production of noradrenaline over adrenaline by the adrenal medulla may decrease the probability that, once stored as adipose tissue, energy would be remobilized for short-term needs, including brain metabolism.

It is important to note that the positive association between adipose tissue and DHEA/S production outlined in this section appears to contradict the usual picture of DHEA/S administration as having anti-obesity and anti-diabetic properties in rodents (Cleary 1991; Sánchez et al. 2008) and, more important, human adults (Villareal and Holloszy 2004, but see Basu et al. 2007). However, the majority of findings on DHEAS in humans are based on adults, who are much more likely than children to exhibit central obesity, which has been linked to endogenous DHEAS titers (Derby et al. 2006).

More specifically, Valle et al. (2006) suggest that declining levels of DHEAS together with increasing adipose tissue during aging may lead to increased metabolism of DHEA/S by adipose tissue, making less DHEA/S available to other target tissues, including the brain. By the same logic, among children, who exhibit relatively little adipose tissue (Benfield et al. 2008), a higher proportion of the DHEA/S produced may be available for uptake by organs other than adipose tissue. In particular, as mentioned previously, the kidney and liver exhibit OAT transporters capable of taking up DHEAS and transporting it across the cell membrane. Similarly, although DHEAS has been shown to inhibit the release of leptin from omental fat (Pineiro et al. 1999), given the low level of omental fat exhibited by children, such action seems unlikely to have a demonstrable effect.

Energetics and Brain Development during Middle Childhood

Along with changes in the allocation of energy between brain and body, the energy demands of the brain change during middle childhood. Analysis of brain glucose utilization using positron emission tomography (PET) with 2-deoxy-2-[18F]fluoro-D-glucose indicates that glucose utilization rates in the cortex and subcortical structures rise from birth to reach a peak of twice the adult rate at about eight years of age and then begin to decline (Chugani 1998). From eight to eleven years of age, glucose utilization remains elevated at about one and a half times the adult rate, then declines more steeply toward the end of puberty, when it reaches adult levels (Chugani 1998). In contrast, glucose utilization in the brain stem does not appear to change over the same period.

This age-related pattern of cortical glucose utilization is thought to reflect a two-stage process of synaptogenesis and synaptic pruning. The initial rise in glucose utilization (Chugani 1998) is associated with a period of burgeoning synaptogenesis that has been directly related to glucose consumption (Rocher et al. 2003), regardless of whether the connections are subsequently maintained. By the same logic, the decline in cortical glucose utilization rates from the age of eight on (Chugani 1998) reflects the loss of synaptic connections, a process that accelerates starting around the age of eleven.

However, the decline in glucose utilization is not uniform across the cortex. Notably, starting at the age of six and extending into the twenties, the anterior cingulate cortex and insula, parts of the limbic cortex, along with the thalamus, a subcortical structure, show elevated glucose utilization relative to the rest of the cortex (Van Bogaert et al. 1998). Together these three structures form part of a network centered on the right anterior insula that integrates somatosensory, homeostatic, and emotional information to produce a sense of subjective bodily awareness (Craig 2002, 2009). Information from nerve fibers throughout the body signaling temperature, mechanical pressure, energy status, pain, and gut state is constantly integrated and updated within the right anterior insula to reflect the immediate physiological status of the body. In addition, inputs from the amygdala into the insula provide emotionally based information about potential threats to the body (Craig 2009).

Craig suggests that the global representation of the body in the right anterior insula reflects a continually updated physical status report of the body, portrayed as a sense of emotion. The absence of threats to bodily functioning, whether from hunger, illness, predators, or the actions of conspecifics, results in a sense of adequacy (well-enough being), whereas the need to respond to such perturbations is reflected in a sense of disquiet that informs the need for action to return the organism to a state of homeostatic (adequate) well-being. Importantly, the anterior cingulate cortex has a vital role in activating the peripheral autonomic nervous system responses associated with such processes (Critchley 2009).

Elevated metabolic rates in the insula, anterior cingulate cortex, and thalamus suggest a prolonged period of synaptogenesis in these regions. It is natural to speculate that such extended synaptic plasticity is associated with dynamic changes in the neural inputs to the insula, allowing the global representation of the body to be continually updated as the body itself continues to grow and develop.

Behavioral Development in Middle Childhood

Despite increased autonomy during middle childhood, it is interesting to note that among foragers, our best stand-in for the context of human evolution, middle childhood is not associated with adult levels of competence or self-sufficiency. Instead, much of the time is spent in play, idleness, and interaction with other children. Even among the Hadza, where children are the most self-sufficient in terms of subsistence (see Kramer and Greaves 2011), children by the age of ten provide only half of the calories they consume, and much of their hunting appears directed toward play rather than results. Thus, from an evolutionary perspective, not only have the increased energetic costs of developing the human brain during middle childhood long been subsidized by adults, but synaptic plasticity during this period may have been shaped more by self-directed play than by functional competence in subsistence activities.

Interestingly, activation of the insula and anterior cingulate cortex in adults is related not only to somatic status but also to the perception of fairness, especially with regard to the distribution of social resources and power (Chiao et al. 2009; Hsu et al. 2008; Tabibnia et al. 2008). Singer (2007) suggests that the involvement of the insula represents the logical abstraction of bodily awareness into the social sphere, or “social embodiment.” Given the level of food sharing evident among hunter-gatherers (e.g., Marlowe 2004), fairness in the distribution of resources can have a direct impact on bodily survival. Recently, Crowley et al. (2010) report the involvement of the insula and anterior cingulate cortex in rejection, but not turn-taking, among children playing Cyberball, a computer game designed to simulate group dynamics, including turn-taking and exclusion.

Thus, the increased independence and behavioral autonomy exhibited during middle childhood (Weisner 1984; White 1996) adds an additional level of complexity to the demands of human brain development. In addition to continually monitoring and updating the internal physiological condition of the body to ensure its viability, the insula and anterior cingulate cortex together integrate the immediate implications of social interactions through their impact on the same physiological processes. For instance, perception of heartbeats serves as an important marker of emotional response recorded in the insula and at the same time reflects the control of the autonomic nervous system by the anterior cingulate cortex (Critchley 2009).

The anterior cingulate cortex is at the center of conflict resolution and/or behavioral inhibition (Posner and Rothbart 2007). By the age of seven, children have reached adult levels of executive function on simple laboratory tasks (Rueda et al. 2004), suggesting that they are able to inhibit primary behavioral impulses (Posner and Rothbart 2007) in favor of more distant goals. Such inhibition is clearly important in promoting prosocial behavior. For instance, Boes et al. (2008) report that the variation in the volume of the right anterior cingulate cortex is related to the expression of aggression and defiance in a normal sample of 61 boys, seven to seventeen years of age.

The insula, on the other hand, is central to anxiety (Paulus and Stein 2006, 2010). Neuroimaging studies suggest that while the perception of threat reflects amygdalar activation, insular activation reflects anticipation of the bodily consequences of threat (Paulus and Stein 2006). Comparison of the signals arising from the amygdala and insula within the anterior cingulate cortex leads to an appraisal of whether or not the threat requires a response. If the signals from the amygdala and insula are congruent, the amygdalar signal can be disabled. However, if the two inputs are discrepant, the dorsolateral prefrontal cortex (DLPFC) is activated to provide a cognitive solution. This system recalls Nesse’s (2001) smoke-detector theory, with the amygdala as the smoke detector and the anterior cingulate cortex as the automatic override.

Finally, during middle childhood, development of the prefrontal cortex is of particular interest for its role in executive function, inhibitory control, and cognition (Diamond 2002), traits critical for the development of reason and responsibility and “making sense.” Given that the development of the prefrontal cortex reflects the emergence of large-scale integrative neural networks (Casey et al. 2005; Fair et al. 2009; Luciana and Nelson 1998), declining cortical glucose utilization rates during middle childhood may reflect the development of longer-range neural pathways (Fair et al. 2009).

Recent findings in adults suggest that psychosocial stress can disrupt prefrontal connectivity and impair cognition (Liston et al. 2009). However, in children the impact of stress may not so much disrupt brain function as shape it to the prevailing social conditions (Boyce and Ellis 2005; Flinn 2006). In particular, stress may bias prefrontal development toward increased anxiety and threat detection (see Hadwin et al. 2006). The ability to disable alarm and inhibit anxiety may be intimately connected to the development of connectivity between the prefrontal cortex and the amygdala (Pezawas et al. 2005). As suggested by White’s age of reason and responsibility, tasks of middle childhood such as making change or choices about which strangers to approach may be facilitated by a reduction in perceived threats and anxiety.

Taken together, the development of the anterior cingulate cortex and the insula appears to underlie the emergence of reason and responsibility and “making sense” during middle childhood (Lancy and Grove 2011). The insula provides the underpinning of bodily perception that can be abstracted into an understanding of social relationships with others, while the anterior cingulate cortex is crucial to expressing one’s bodily self-interest in socially appropriate ways. Importantly, the development of “making sense” continues through middle childhood into young adulthood, a fact reflected in the ongoing maturation of the prefrontal cortex (Gogtay et al. 2004; Shaw et al. 2008), as well as the insula and anterior cingulate cortex (Van Bogaert et al. 1998), into the early twenties.

I have attempted to demonstrate that adrenarche is potentially associated with changes in the allocation of energy between the body and the brain. In addition, the extended development of the cortex, particularly the insula, thalamus, and anterior cingulate cortex, from the age of about seven to the early twenties appears to parallel increases in DHEA/S that begin with adrenarche, suggesting a potential connection between the two. In the next section I address specific mechanisms by which DHEA/S might influence brain and/or behavioral development. I suggest that DHEA may protect energetically active (i.e., neuroplastic) neurons against reductions in energy availability, and that it may promote the development of apocrine glands associated with smells that serve as social markers of the juvenile period and may play a role in altering the response of parents to the behavior of their children during middle childhood.

DHEA, Energetics, and Brain Development

Individual variation in the onset of growth of the zona reticularis, from three to eight years of age (Dhom 1973), spans the period of maximal glucose utilization and synaptogenesis in the developing cortex, from four to eight years of age (Chugani 1998). However, the low levels of DHEA in circulation during this period (Remer et al. 2005) suggest that little of the DHEA/S produced by the adrenal gland reaches the brain. In fact, as mentioned earlier, the remarkable range of individual variation in the onset of zona reticularis development suggests that the onset of DHEA/S production is not sufficiently predictable to represent a tightly regulated developmental process. Thus, prior to the age of seven or eight, by which time all individuals have started producing substantial amounts, DHEA/S seems unlikely to be related to the development of the brain or other organs in any systematic way.

On the other hand, increases in circulating DHEAS levels after adrenarche and continuing until the early twenties (Orentreich et al. 1984; Sulcova et al. 1997) represent a process sufficiently predictable that it may indicate a relationship to the development of other organs, including the brain. The facts that DHEA does cross the blood-brain barrier (Guazzo et al. 1996) and that levels of DHEAS in cerebrospinal fluid (CSF) are correlated with those in the cortex (Naylor et al. 2008) suggest that, with adrenarche, increasing levels of DHEA/S in circulation translate to an ongoing increase in the brain’s exposure to DHEA/S.

The impact of DHEAS on human brain development has received remarkably little attention. However, evidence for neurological effects in rodents suggests mechanisms by which DHEA/S may alter neurodevelopment in humans as well. These include increased mitochondrial energy production (Patel and Katyare 2006a, b), changes in neurotransmitter turnover (Pérez-Neri et al. 2008), and neuroprotection (Li et al. 2009).

Of these actions, increased mitochondrial energy production in the brain of developing rats (Patel and Katyare 2006a, b) has the clearest implications for human brain development. Since mitochondria are involved in all aspects of neuronal metabolism (Kann and Kovács 2007), exposure to DHEA/S may lead to increases in the neuronal energy budget. Under the principle of allocation, an increase in total neuronal energy can be expected to promote all aspects of neural function, including neuronal survival, neurotransmission, and synaptic plasticity.

In fact, recent findings suggest that mitochondrial function is critical to neurotransmission. Mitochondrial trafficking ensures that mitochondria are positioned close to the synapse, where energy is required for synaptic transmission (see MacAskill and Kittler 2010 for a recent review). Furthermore, the release of neurotransmitters from reserve vesicles during neuronal firing is thought to be directly dependent on mitochondria (Vos et al. 2010). Because neuronal firing is essential for synaptogenesis, these findings suggest a direct mechanism by which DHEA/S may play a role in activity-dependent synaptogenesis during middle childhood.

Similarly, in its role as an antioxidant, DHEA/S has been shown to protect against the neurotoxic effects of diabetes in rats (Yorek et al. 2002). Cortisol plays a key role in the delivery of glucose to the brain (Peters et al. 2004) and promotes the formation of reactive oxygen species (ROS) (McIntosh and Sapolsky 1996; McIntosh et al. 1998), linking energy use with the presence of ROS. Thus the neuroprotective effects of DHEA/S may be most salient in metabolically active parts of the brain.

During middle childhood, metabolic activity in the brain above and beyond adult levels reflects synaptogenesis (Muzik et al. 1999). Given increases in cortisol starting around the age of eight (Wudy et al. 2007), about the time that glucose utilization begins to decline, concurrent increases in DHEA/S may protect ongoing synaptogenesis in active neurons against cortisol-induced neurotoxicity, allowing continued synaptogenesis in some cortical regions while others undergo synaptic pruning.

Finally, DHEA/S has also been linked to a variety of other aspects of neuronal function, including neuronal growth (Suzuki et al. 2004), modulation of neurotransmitters (Pérez-Neri et al. 2008; Zheng 2009), and neuronal survival (D’Astous et al. 2003; Li et al. 2009). Many of these results have been obtained with fetal neurons; whether they also apply to neurons later in life remains to be demonstrated. Furthermore, it is not clear whether the impact of DHEA/S on neuronal growth, neurotransmitter modulation, and survival reflects the action of DHEA/S on pathways specific to those functions or the indirect effect of increased mitochondrial energy production, as discussed above.

Behavioral Effects of DHEAS

Unfortunately, there is little evidence on the effects of DHEA/S on human neurodevelopment against which to consider potential behavioral manifestations of the neuronal mechanisms discussed above. Van Goozen et al. (1998, 2000) found elevated DHEAS in two small samples of boys with oppositional defiant disorder (ODD) relative to normal controls. In one study the intensity of aggression was related to DHEAS as well (Van Goozen et al. 1998). Strous et al. (2001) report an inverse relationship between DHEAS and symptom severity in a sample of 29 boys ages seven to fifteen diagnosed with ADHD. However, these studies do little to suggest specific mechanisms by which DHEAS might have an impact on behavior.

Dorn et al. (2008) have recently shown, based on a sample of 40 girls six to eight years of age with premature adrenarche (PA), that compared with normal controls, girls with PA are heavier and exhibit elevated levels of DHEA, but not cortisol. These findings are as expected, given the findings linking DHEA to body composition discussed earlier. At the same time, the girls with PA exhibited significantly higher levels of anxiety and oppositional disorder relative to controls, suggesting that DHEA may increase not only anxiety but also social disinhibition.

These studies suggest that elevated DHEA/S can promote anxiety, hyperactivity, and aggression in children, consistent with reports of a positive relationship between DHEAS and anxiety (Hsu 2006) and between DHEA and mania (Dean 2000; Markowitz et al. 1999) in adults. Whether similar relationships obtain among children with normal DHEA/S levels will require further investigation. For instance, the relationship between DHEA and increased anxiety and oppositional behavior in the PA girls might reflect currently elevated levels of DHEA or the longer-term effects of elevated DHEA acting on an immature brain.

Social Signals of Middle Childhood

Middle childhood is associated with a period of sex segregation in which individuals spend increasing time with same-sex peers (Maccoby 1998). It has been suggested that adrenarche is associated with the development of same-sex attraction and/or attachment (Del Giudice 2009; Del Giudice et al. 2009; McClintock and Herdt 1996). However, unlike puberty, in which the development of both secondary sexual characteristics and sexual motivation has been tied to circulating hormone levels (Campbell et al. 2005; Halpern et al. 1998), adrenarche does not produce the same clear phenotypic signals of sexual differentiation, meaning that any such increase in personal attraction would depend almost entirely on changes within the individual experiencing the attraction.

Adrenarche is not, however, without any external markers. DHEA/S is implicated in the development of human pubic hair (Binder et al. 2009), sweat and sebaceous glands (Stewart et al. 1992), and elements of the hair follicle, as well as the skin (Thiboutot et al. 2003). DHEAS is also found in axillary sweat (Labows et al. 1979). Furthermore, changes in the composition of sebaceous wax during middle childhood have been linked to DHEA (Stewart et al. 1992; Yamamoto and Ito 1992, 1994). Girls with premature adrenarche are reported as having a pronounced “axillary odor” (Kaplowitz et al. 1986), though this is not true of all children. Still, changes in hair, sweat, and skin may serve as phenotypic markers, and potential social signals, of the transition to and through middle childhood.

The most important of these markers may be sweat, because of its role in body odor. Human odor has been shown to reflect the breakdown of sebaceous gland compounds by the action of bacteria (Natsch et al. 2004, 2006). Analysis of the volatile compounds in axillary sweat indicates that, taken together, they provide a distinctive individual profile (Penn et al. 2007). In fact, analysis of volatile carboxylic acids, one such class of compounds, from monozygotic twins suggests that variation in odor related to these compounds is heritable (Kuhn and Natsch 2009), making human body odor not only an individual marker but potentially a reliable marker of kinship as well. Furthermore, the lack of personal hygiene during much of human evolution would presumably have accentuated odor as a social signal.

Several studies suggest that even under hygienic conditions, humans can distinguish kin from non-kin on the basis of body odor. Mothers and their infants have been shown to recognize each other’s body odor, whereas husbands and wives could not (Porter et al. 1985). Weisfeld et al. (2003) report that preadolescent subjects can distinguish full sibs from half sibs on the basis of smell, while mothers can identify their biological children but not their stepchildren. Olsson et al. (2006) report that 52% of 37 tested high school students were able to identify their own body odor, and 39% the odor of their friends, by smelling a T-shirt that had been worn for three consecutive nights. Though variable across individuals, awareness of familiar vs. unfamiliar body odors could act as a signal of social context during middle childhood as children begin developing greater social autonomy and interacting with a wider range of familiar and unfamiliar individuals.

Interestingly, Dubas et al. (2009) report that the quality of parents’ interactions with their children can be related to the child’s body odor. Based on a sample of 68 Dutch families, all with one child between the ages of 8 and 9 years, they found that mothers were more likely to punish sons with odors that they rated as more pleasant, while fathers spent more time with offspring whose odor they recognized as opposed to those whose odor they did not recognize.

The effects of individual body odor on human interactions may reflect the impact of axillary secretions on emotional responsivity and attention. Lundström et al. (2003) report that androstadienone, a reputed human pheromone found in axillary sweat, leads to increased emotional attention in women, even without conscious detection of any odor. Hummer and McClintock (2009) found increased attention to emotional information in both men and women in response to androstadienone. Other studies have demonstrated that axillary secretions released during anxiety can specifically lead to increased anxiety in others (Albrecht et al. 2011; Haegler et al. 2010).

Importantly, the neural processing of human body odors, and the associated kin recognition, appears to differ from that of common odors in incorporating the frontal temporal junction, the insula, and the dorsomedial prefrontal cortex (Lundström et al. 2008, 2009). Induction of anxiety in response to anxious sweat also involves the insula (Prehn-Kristensen et al. 2009), as would be expected given the role of the insula in empathy (Lamm and Singer 2010). Furthermore, given the role of the insula in integrating somatic sensations (Craig 2002, 2009), its involvement in the detection of body odor suggests that body odor (one’s own and others’) provides information salient to bodily awareness.

Given the role of DHEA/S in the development of sweat glands, the expression of body odors may play a role in shaping the response of adults to changes in children’s emotional states as they reach and pass through middle childhood. Unlike behavior, which requires conscious awareness on the part of the receiver, odor may act as a signal without any conscious awareness on the part of parents and other adults. As such, it would provide a more robust signal of a child’s developmental status.

Summary

Middle childhood, roughly 6 to 12 years of age, represents a stage of human social and psychological development characterized by increasing self-sufficiency and the capacity for “making sense.” Somatically, its onset is marked by the eruption of the permanent molars, the adiposity rebound, and the end of growth in brain volume, as well as by adrenarche, the emergence of adrenal production of DHEA and DHEAS around the age of seven. The timing of human brain maturation parallels increases in DHEA/S beginning at adrenarche, but the role of DHEA/S, if any, in brain and behavioral development during middle childhood remains unclear.

Adrenarche has been linked to changes in body composition close to the onset of middle childhood. The physiological basis of this association is not yet clear, but it may be related to the impact of DHEA/S on glucose metabolism, biasing energy toward adipose storage and away from circulation, where it can fuel the brain. DHEA/S may also act within the adrenal gland itself to coordinate the development of the adrenal medulla’s release of adrenaline and noradrenaline, magnifying the physiological impact of age-related increases in the steroid over the course of middle childhood.

In addition, as an antioxidant, DHEA/S may promote extended neuroplasticity during middle childhood. By providing neuroprotection against reactive oxygen species (ROS) associated with the elevated glucose metabolism necessary to support synaptogenesis, DHEA/S may allow for the extended development of specific cortical regions in the brain. Some of these regions, including the insula and the anterior cingulate cortex, provide a continuously updated global representation of the body as it grows and develops.

Finally, the role of DHEA/S in promoting the development of sweat glands may signal a change in chemical communication at adrenarche. Whether converted to axillary odor or carrying pheromones such as androstadienone, the resulting sweat may become a social signal of the child’s transition into and through middle childhood. Increased attention to changes in children’s anxiety during this phase of increasing social autonomy and competence may bias parental investment in their offspring’s long, slow development toward full maturation. Researchers, similarly, may want to pay more attention to this unique phase of our species’ development.