
All mammal infants are dependent and helpless, but to varying degrees. Newborns of precocial species such as horses, mountain goats, and gazelles have all of their senses working and can stand and walk on wobbly legs just hours after birth. Neonates of more altricial species such as cats, rats, and dogs are born effectively blind and deaf (although their senses of touch and olfaction are much better developed) and can locomote just enough to make their way to their mothers’ nipples. Primates, in general, have relatively well-developed senses and large brains at birth, but limited motor abilities. For example, chimpanzees (Pan troglodytes) and bonobos (Pan paniscus), humans’ closest genetic relatives, don’t sit up on their own until 3–4 months and don’t walk until 6 or 7 months (Bründl et al., 2020; Kuroda, 1989). Humans follow a similar developmental trajectory, with all senses functioning to some degree at birth (although visual acuity is poor and improves over the first year in both humans and chimpanzees, Bard et al., 1995), although they take even longer to reach milestones of gross motor development (Bründl et al., 2020). The great apes in general follow a slow life history course. Parents (especially mothers) invest heavily in few offspring, who reach maturity relatively late in life. This trend is exaggerated in Homo sapiens, whose females reach sexual maturity at 16.5 years of age (on average among traditional groups), compared to 6.5 years for gorillas (Gorilla gorilla), 7.0 years for orangutans (Pongo pygmaeus), and 9.8 years for chimpanzees (Harvey & Clutton-Brock, 1985). Taking so long to reach adulthood and sexual maturity has its obvious shortcomings, particularly death before reproducing, and must therefore have some adaptive benefit, or it would have been eliminated by natural selection. Most scholars argue that extended immaturity affords large-brained animals opportunities to develop their survival skills and learn the complexities of their physical and social environments (Alexander, 1989; Bjorklund, 2021; Dunbar, 2003; Whiten & Erdal, 2014). Humans represent the extreme of slow primate development, and this process begins in the earliest stages of life.

1 Heterochrony as an Engine of Evolutionary Change

Homo sapiens’ slow road to maturity begins early, in infancy, with human infants taking longer to achieve many developmental milestones than their simian relatives, and presumably than their hominin predecessors (hominins being the group consisting of modern humans and their bipedal ancestors). The retention of infantile or juvenile traits, including rates of development, into later life is referred to as neoteny. Neoteny is a form of heterochrony, genetically based differences in developmental timing, which is a central concept in the field of evolutionary developmental biology, or evo devo (Carroll, 2005, 2017; Raff, 1996). Evo devo explores how different developmental mechanisms affect evolutionary change. In general, aspects of development can be accelerated or retarded relative to those of an ancestor (McKinney & McNamara, 1991). Moreover, different parts, or modules, of an animal are relatively independent from one another, such that natural selection can operate independently on different modules at different times. This is often accomplished through the expression of regulatory genes, which, unlike structural genes, do not code for proteins but rather determine when and whether structural genes are activated and how much protein they produce. According to Carroll (2005), “It is the switches [regulatory genes] that encode instructions unique to individual species and that enable different animals to be made using essentially the same tool kit” (p. 211).

Although humans have often been described as a neotenous species (e.g., Bjorklund, 2021, in press; Bolk, 1926; Gould, 1977; Montagu, 1989), human evolution has clearly involved aspects of both acceleration and retardation of development relative to our ancestors. For example, although human infants are slower to develop gross motor abilities than chimpanzees (and presumably our last common ancestor with chimpanzees), they tend to be weaned earlier than chimps (between 2.5 and 3 years versus 4.5 and 5 years, Bogin, 2006), which results in a shortened inter-birth interval for human mothers. Moreover, as I’ll discuss in greater detail later in this chapter, brain development is faster (accelerated) during the prenatal period in human fetuses relative to other primates, with this rapid rate of growth being maintained postnatally in humans relative to the other great apes (a form of retardation). Thus, human evolution reflects a mosaic pattern of different traits arising at different times, often associated with heterochronic changes, some reflecting acceleration and others retardation of ancestral rates and characteristics.

2 Neoteny in Human Evolution

Although human evolution has clearly involved both acceleration and retardation of ancestral traits, my focus in this chapter will be on the latter, neoteny, especially its role in human infancy. It may sound counterintuitive that evolutionary advances can sometimes be achieved by retaining infantile or juvenile traits into later development. Yet, neoteny can be a source of evolutionary innovation. According to Gould (1977), “the early stages of ontogeny are a storehouse of potential adaptations, for they contain countless shapes and structures that are lost through later allometries. When development is retarded, a mechanism is provided (via retention of fetal growth rates and proportions) for bringing these features forward to later ontogenetic stages” (p. 375).

As an example of neoteny in action, consider the work of the Russian zoologists Dmitry Belyayev and Lyudmila Trut, who bred silver foxes (a melanistic form of the red fox, Vulpes vulpes) in an attempt to replicate the process that had converted wolves into domesticated dogs (Trut, 1999; Trut et al., 2009). The researchers selectively bred foxes for tameability, mating the most human-receptive females with the most human-receptive males. After 20 generations, 35% of the foxes were classified as “domesticated elites,” acting more like domesticated puppies than wild foxes. This domestication unexpectedly produced a suite of immature physical features, including floppy ears, wider and shorter heads, and shortened tails, snouts, legs, and upper jaws. Trut and her colleagues commented that, “The shifts in the timing of development brought about by selection of foxes for tameability have a neotenic-like tendency: the development of individual somatic traits is decelerated, while sexual maturation is accelerated” (Trut et al., 2009, p. 354). In a similar vein, some scholars have suggested that humans are a self-domesticated species, in which selection for decreased reactive aggression and increased prosociality and cooperativeness brought with it a suite of neotenous traits (Bjorklund, 2021, in press; Hare, 2017; Hood, 2014; Wrangham, 2019).

Let me provide an example more directly relevant to human evolution. In most adult mammals the backbone connects to the opening in the back of the skull (the foramen magnum) at an angle (called the cranial flexure) such that the animal faces forward when walking on four legs. The position is different in the fetal period, with the head sitting more or less on top of the spine. During prenatal development, the cranial flexure shifts in most mammals, producing the adult quadrupedal orientation. In humans, however, this shift is much less substantial, so that the head of Homo sapiens faces forward when standing on two legs. Thus, by retaining the embryonic relation between the skull and the spine, human heads sit atop their spines, permitting bipedal locomotion (see Gould, 1977; Montagu, 1989). Table 2.1 provides a list of some human neotenous traits.

Table 2.1 Some neotenous functional traits in humans

My focus in the remainder of this chapter will be on neotenous features of infancy and early childhood that promoted adaptation to local environments and thus survival, and that can be seen in present-day human babies and toddlers. I begin by looking at motor and physical features, followed by an examination of the role of neoteny in human brain evolution, and finally aspects of “cognitive neoteny,” immature features of infants’ neural and cognitive processing that help babies adjust to their current ecological niche or serve as the basis for more advanced cognition.

3 Neotenous Motor and Physical Features and Their Consequences for Infant Survival

Human infants’ extended period of physical dependency is a problem. Although all mammal infants are dependent on their mothers for nutrition, nurturing, and protection, this dependency is exaggerated in Homo sapiens. Human babies cannot cling to their mothers’ fur or grasp around their mothers’ necks, but must be carried and supported by their mothers for an extended period of time. Even once walking, human infants lack the physical dexterity and mental capacities to be left alone for any length of time. In fact, this dependency extends into childhood, a period between weaning and the eruption of permanent teeth (between about 3 and 7 years), which Bogin (2021) contends is a uniquely human stage inserted between infancy and the juvenile stage (between about 7 years and puberty, called middle childhood by psychologists and educators). Thus, ancestral infants and toddlers required near-constant and extended care, and this factor may have been responsible, in part, for male-female pair bonding (a mother cannot adequately care for a child alone, necessitating help from the father) and for humans’ adoption of cooperative breeding, with a host of mostly female relatives assisting in the care of infants and young children (Hrdy, 1999; see DeSilva, Chap. 4; Locke & Bogin, Chap. 6; Hrdy & Burkart, Chap. 8, this volume).

3.1 The Effects of Kindchenschema

Infants’ extended period of dependency and motor immaturity may necessitate that they receive care from adults, but other aspects of physical immaturity increase the likelihood that they will receive that care. John Bowlby (1969), the father of attachment theory, observed that infant-parent attachment, critical for the survival of a helpless, long-dependent baby, is fostered by a number of infant features, including cries, smiles, movements, and immature facial features. The Nobel Prize-winning ethologist Konrad Lorenz (1943) noted a suite of facial features possessed by infants that promote feelings of affection and caregiving in adults, particularly mothers. Lorenz referred to these features as kindchenschema, or baby schema. Compared to adult faces, infant faces have flat noses, fat cheeks, rounded heads that are large relative to body size, small chins, and large eyes relative to head size (e.g., Almanza-Sepúlveda et al., 2018). Everything else being equal, adults are more attentive to baby faces than to those of adults (e.g., Brosch et al., 2007; see Kringelbach et al., 2016 and Lucion et al., 2017 for reviews). One possible reason for adults’ preference for babyness is that these cues are relatively honest signals of an infant’s health and fitness.

The effects of the baby schema have been investigated in more than 100 studies over the last 70 years, and although there is some variability in the findings, most studies confirm Lorenz’s hypothesis that kindchenschema evolved to promote attachment to and nurturance from adults, all in the quest of surviving the perilous stage of infancy. Across a variety of measures, babies with cuter faces (i.e., faces with greater degrees of babyness) receive more interest and caring from adults than do less-cute infants. For example, adults evaluate infants with high levels of baby schema as friendlier, more sociable, more attractive, and easier to care for than less-baby-faced infants (e.g., Alley, 1981; Leibenluft et al., 2004; Senese et al., 2013; Sprengelmeyer et al., 2009). Adults display greater empathy for and have more affectionate interactions with cuter than less-cute babies (e.g., Langlois et al., 1995; Glocker et al., 2009a; Machluf & Bjorklund, 2016); they also express greater motivation to care for, and make more favorable hypothetical adoption decisions toward, cuter than less-cute infants (e.g., Aradhye et al., 2015; Volk et al., 2007; Waller et al., 2004; see Franklin & Volk, 2018 for a review). This preference for baby-faced infants extends to members of different races (e.g., Caucasian adults respond similarly to cues of cuteness in Caucasian, African, and Asian infants), and, as any puppy or kitten owner knows, we even respond to cuteness cues in other species (e.g., Esposito et al., 2014; Golle et al., 2013). Consistent with the idea that baby-faced features promote affection and caregiving, premature infants, who display fewer of the kindchenschema features, are more apt to be abused at some point during childhood than full-term infants (see Martin et al., 1974).

The effects of the baby schema have been documented in neuroimaging studies. For example, viewing infant faces typically produces faster neural responses, involving more brain regions, than does viewing adult faces (Glocker et al., 2009b; Hahn & Perrett, 2014; Leibenluft et al., 2004). Viewing infant faces also elicits stronger reactions in brain areas associated with processing emotion (Baeken et al., 2010; Glocker et al., 2009b; Nitschke et al., 2004), with Luo et al. (2015) concluding that “overall infant faces evoke [from adults] both stronger arousal and enhanced responses to both positive and negative cues from the infant” (p. 10). Research has also found that viewing baby faces is associated with greater activity in brain regions associated with empathy (e.g., Glocker et al., 2009b; Leibenluft et al., 2004), reward and attachment (e.g., Leibenluft et al., 2004; Nitschke et al., 2004), and motor behavior (Glocker et al., 2009b). The increased neural activation in motor areas when viewing baby faces may result in adults being extra careful when interacting with babies, as suggested by studies in which adults improved their fine-motor abilities, as well as expressions of tenderness and calmness, after viewing cute infant faces (or faces of puppies and kittens) (Sherman et al., 2009, 2013). Overall, looking at infant faces with high baby schema produces greater activation in brain areas associated with processing faces, emotion, and attention than viewing low-baby-schema faces (Luo et al., 2015).

One might think that the effects of babyness would be strongest at birth, when infants are most vulnerable and in need of care. This clearly makes good evolutionary sense from the infant’s perspective, but research has shown that babies are viewed as most cute between 3 and 6 months of age. Many newborns have misshapen heads from their trip through the birth canal, and babies born prematurely have more atypical features and lower body weight than full-term infants. In studies by Franklin and her colleagues (2018), adults were asked to rate photographs of newborns, 3-month-olds, and 6-month-olds in terms of cuteness, health, and happiness, and to make hypothetical adoption decisions about the babies. Although babies of all ages were rated relatively high on these dimensions (average ratings ranged from 3.06 for happiness to 4.08 for adoption decisions on a five-point scale), adults rated the 3- and 6-month-old infants higher than the newborns on each measure. One explanation for this finding is that, although it may be in the infant’s best interest to garner as much attention and caring from adults as soon as possible, it may not be in the best interest of the parents. Ancestral parents, especially mothers, often had to evaluate whether and how much to invest in a newborn. In situations where resources were scarce and parents had other offspring to care for, it may have been in the mother’s best interest to abandon a sickly child. So whereas an objective parental evaluation of a newborn’s health and fitness may not have benefited the neonate, it likely was to the parents’ advantage (see Salmon & Hehman, Chap. 9, this volume).

Although the benefits of babyness may be greatest in the first year of life, they persist into early childhood. For example, in one study adults recommended less-severe punishment for more baby-faced 4-year-olds than for children with more mature faces (Zebrowitz et al., 1991). However, the positive effects of baby schema extend only to 4 or 5 years of age, after which adults’ judgments of attractiveness and likeability are similar to their judgments of adult faces (Luo et al., 2011). Other research has shown that neurological responses to faces vary as a function of the age of the person shown in the photograph, with images of infants eliciting the greatest activation, followed by photos of prepubescent children, and photos of adults eliciting the least (Proverbio et al., 2011).

3.2 Sex Differences in the Effects of Kindchenschema

The effects of the baby schema are found in both men and women, but they are greater in women, who historically (and surely prehistorically) have done the bulk of the infant care (e.g., Cárdenas et al., 2013; Sprengelmeyer et al., 2009; Yamamoto et al., 2009; see Hart, Chap. 7; Simpson & Jaeger, Chap. 11, this volume). For instance, in a series of experiments, Sprengelmeyer et al. (2009) varied the kindchenschema features of infants’ faces and asked men and women to select which of two infant faces (one having more babyness features than the other) was cuter. Although women, overall, were more sensitive to subtle differences in cuteness than men, women’s selections varied with their age. Figure 2.1 shows the judgments of cuteness (selecting the photo with the higher cuteness value) for men and women of different ages. Women between 19 and 26 years of age were the most sensitive to cuteness cues, followed closely by women 45–51 years of age. In contrast, older women, 53–60 years of age, were less able to distinguish the high-baby-schema faces and were comparable in their performance to both younger and older men. That these results were linked to hormone levels was supported by subsequent findings that age-matched premenopausal women were more accurate at selecting the cuter faces than postmenopausal women, and that women using hormonal contraception were better able to distinguish the cuter baby than women not on hormonal birth control (Sprengelmeyer et al., 2009).

Fig. 2.1 Mean accuracy (±1 SE) in the cuteness-discrimination task as a function of the difference in cuteness between the faces, for men and women of different ages (Study 1). (From Sprengelmeyer et al., 2009)

Other research has found that sex differences in sensitivity to babyness are first seen in early adolescence, consistent with the hypothesis that such sensitivity is related to changes in hormone levels (e.g., Borgi et al., 2014; Fullard & Reiling, 1976; Goldberg et al., 1982; Gross, 1997). For example, Goldberg et al. (1982) showed 12- and 13-year-old girls and same-age boys slides of infant and adult faces and asked them to choose which faces they preferred. Half of the girls had had their first period (postmenarcheal) and half had not (premenarcheal). The postmenarcheal girls preferred the infant faces significantly more often than both the premenarcheal girls and the boys, who did not differ from one another. This pattern is consistent with the idea that a strong preference for baby-faced features, especially in girls, may reflect a preparation for parenthood, or at least did for our ancestors.

4 Neoteny and Brain Development

Infants’ immature, or neotenous, physical features helped keep ancestral babies alive by endearing them to adults and promoting attachment. This, of course, requires adaptations in adults (chiefly mothers) to be responsive to infants’ cues of dependency, which serve to benefit the inclusive fitness of both parents and offspring. As critical as these features are to survival, they are not unique to humans: other species have also evolved biparental families and cooperative breeding, and they, too, respond positively to immature features of their infants. What is special about humans is their cognition and social acumen. Homo sapiens possess symbolic representation, language, and tool-using and problem-solving abilities that, while possibly not unique in the animal kingdom, are far more advanced than those shown by any other creature. And although Homo sapiens surely evolved from other social primates, humans’ degree of social sophistication – abilities to learn from and cooperate and compete with social others – qualifies them as a hypersocial, or eusocial, species, analogous in some ways to the social insects (e.g., Tomasello, 2014, 2019; Wilson, 2013). One currently popular theory of human evolution is the social brain hypothesis, the idea that increased social cognition was a (perhaps the) driving force in human evolution and necessitated a large brain to handle the variety and complexity of human communities (e.g., Alexander, 1989; Bjorklund & Bering, 2003; Dunbar, 2003). The building of such a brain was achieved, in large part, by heterochronic changes in the timing and rate of development, beginning prenatally and continuing through infancy.

The adult human brain is about three times the size of chimpanzee and bonobo brains, as well as the brains of our early hominin ancestors, both in absolute size and in relation to body size (Jerison, 2000; see Wilder & Semendeferi, Chap. 3, this volume, for an excellent summary of infant brain development and evolution). The human brain got that large in part by producing more neurons and accelerating growth prenatally (forms of heterochronic acceleration) and by retaining the prenatal rate of brain growth postnatally (a form of heterochronic deceleration, or neoteny).

Although natural selection favored increased brain size in the hominin line eventuating in humans, there were morphological limits on how large a neonate’s skull (and thus brain) could be and still fit through the birth canal of a bipedal female. (Bipedality evolved prior to increased brain size in hominin evolution.) This is referred to as the obstetrical dilemma (Washburn, 1960). As a result, human infants’ brains at birth, while large in absolute terms compared to those of their simian relatives, are smaller relative to their eventual adult size than those of other primates. Most primate babies are born when their brains are on average 47% of their eventual adult weight (DeSilva, 2016; Trevathan & Rosenberg, 2016). In contrast, human babies are born when their brains are about 28% of their eventual weight (DeSilva, 2016; see DeSilva, Chap. 4, this volume).

Following birth, human infants retain the rapid, prenatal rate of brain growth in terms of the size of neurons, the formation of dendritic connections, and myelination (Liu et al., 2012; Marchetto et al., 2019; Miller et al., 2012). By 6 months of age, infants’ brains achieve about 50% of their adult weight. This increases to 75% at 2 years, 90% at 5 years, and 95% at 10 years, with the final 5% of growth not completed until late adolescence or early adulthood. Chimpanzees, in comparison, attain adult brain size by about 5 years of age (Bogin, 2006). Thus, even though humans develop “more brain” than other primates, they do so, in part, by retaining fetal growth rates long after birth.
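
To make the contrast between the two growth trajectories concrete, the short sketch below linearly interpolates between the milestone figures quoted above. It is an illustration only: the linear interpolation, and the use of the primate-typical 47%-at-birth figure for chimpanzees, are assumptions rather than data.

```python
# Illustrative sketch only. Human milestones are the values quoted in the
# text (~28% of adult brain weight at birth, ~50% at 6 months, 75% at
# 2 years, 90% at 5, 95% at 10, ~100% by late adolescence). For chimpanzees
# the text gives only two anchors: roughly the primate-typical 47% at birth
# (an assumption here) and adult size by about 5 years. Linear interpolation
# between milestones is also an assumption.
HUMAN = [(0.0, 28), (0.5, 50), (2, 75), (5, 90), (10, 95), (18, 100)]
CHIMP = [(0.0, 47), (5, 100)]

def pct_adult_brain(age, milestones):
    """Linearly interpolate percent of adult brain weight at a given age."""
    if age >= milestones[-1][0]:
        return 100.0
    for (a0, p0), (a1, p1) in zip(milestones, milestones[1:]):
        if a0 <= age <= a1:
            return p0 + (p1 - p0) * (age - a0) / (a1 - a0)

for age in (0, 1, 2, 5):
    print(f"age {age:>2}: human ~{pct_adult_brain(age, HUMAN):.0f}%, "
          f"chimp ~{pct_adult_brain(age, CHIMP):.0f}%")
```

Even on these rough assumptions, the human curve starts proportionally lower and keeps climbing for more than a decade after the chimpanzee curve has flattened, which is the neotenous pattern described in the text.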

Despite their “early” birth, human infants’ brains – still large in an absolute sense – surely posed a risk for our hominin ancestors. Death in childbirth, for both the mother and infant, was a real possibility for most of human history and prehistory, and likely because of this, women in most hunter-gatherer cultures give birth with the assistance of other women (e.g., Trevathan, 1987; see DeSilva, Chap. 4, this volume). Given the risk associated with giving birth to such a large-brained baby, there must have been substantial benefits, or it would have been eliminated by natural selection.

The benefits surely lay in what larger and more complex brains can achieve in terms of technical and social abilities. But brains not only evolve; they also develop, and the particular pattern of infant brain development set the stage for the many abilities that characterize human adult brains. By extending into postnatal life brain growth that, under the typical primate developmental pattern, would occur in the protective wombs of their mothers, human infants are exposed to a vastly different set of experiences. Their brains continue to develop rapidly while receiving visual, auditory, tactile, olfactory, vestibular, and social stimulation that a typical primate would not receive until its brain was substantially more developed. Scholars have long proposed that human infants’ gestation is essentially extended into postnatal life and have referred to this period by a number of terms, including extrauterine spring, exterior gestation, exterogestation, and the fourth trimester (e.g., Konner, 2010; Montagu, 1989; Portmann, 1944/1990; Trevathan & Rosenberg, 2016). As a result of experiencing the external world at a time when their brains are going through a rapid period of change, human infants develop very differently than they would if they remained in the protective warmth of their mothers’ wombs, causing some theorists to argue that this extended period of brain development is responsible for the extraordinary features of Homo sapiens’ technical and social skills. Imagine, German zoologist Adolf Portmann (1944/1990) asks us, “the developing human spending the important maturation period of its first year in the dark, moist, uniform warmth of its mother’s womb… It will gradually become clear that world-open behavior of the mature form is directly related to early contact with the richness of the world, an opportunity available only to humans” (p. 93).

Although 1-year-old infants are hardly mental giants, some early-acquired developmental milestones serve as the basis of later essential social-cognitive accomplishments. For example, starting around 9 months of age, infants begin to display shared attention (e.g., mother and infant sharing attention to a third object), begin to see others as intentional agents, and become able to take the perspective of other people. In his shared intentionality theory, Tomasello (2019) refers to these early-developing abilities as joint intentionality, defined as “the cognitive capacity to create a joint agent ‘we’ with other individuals, creating the possibility of taking the perspective of others” (p. 305). The abilities underlying joint intentionality, particularly viewing others as intentional agents, are the basis for all more sophisticated social-cognitive accomplishments, and they would likely not have had the opportunity to evolve had infants been born following the typical primate schedule.

Tomasello (2019) argues that, “if we wish to explain how uniquely human psychology is created, we must focus our attention on ontogeny, and especially on how great ape ontogeny in general has been transformed into human ontogeny in particular” (p. 6). One important way that great ape and human ontogeny differ is in the rate of brain development, with humans’ extension of the rapid prenatal rate of brain growth altering the very nature of infancy and the social-cognitive accomplishments of the species. Consistent with the social brain hypothesis, which posits that changes in social cognition may have been especially important in human psychological evolution, is evidence that 2-year-old chimpanzees and human children display comparable levels of physical cognition (e.g., on tasks dealing with space, quantities, and tools), whereas 2-year-old children outperform chimps on tasks of social cognition (e.g., imitation, nonverbal communication, and reading the intentions of others). Moreover, whereas children continue to improve on both physical- and social-cognitive tasks for the next few years, chimpanzees do not, essentially attaining adult levels by 2 years of age (Wobber et al., 2014).

Neoteny apparently not only played a role in extending the rate of brain development but also affected the development of individual neurons and neuronal plasticity. Synaptogenesis, the process of forming new synapses, is responsible in large part for human behavioral and cognitive plasticity, and such plasticity is greatest early in life. Neuronal metabolism and synaptic activity peak later in humans than in other primates (see, e.g., Bufill et al., 2011; Liu et al., 2012; Petanjek et al., 2011; Somel et al., 2009), as does the process of myelination (Miller et al., 2012), thus extending neural plasticity into adulthood. Humans and chimpanzees possess similar genes associated with synapse formation in the cerebral cortex, with the expression of these genes peaking earlier in chimpanzees (before 1 year) than in humans (about 5 years) (Liu et al., 2012). Also, levels of gene expression associated with cortical synaptogenesis in adolescent and adult humans are similar to those observed in juvenile chimpanzees (see Bufill et al., 2011; Somel et al., 2009), causing Bufill et al. (2011, p. 735) to state that “human neurons belonging to particular association areas retain juvenile characteristics throughout adulthood, which suggests that a neuronal neoteny has occurred in H. sapiens, which allows the human brain to function, to a certain degree, like a juvenile brain during adult life… Neuronal neoteny contributes to increasing information storage and processing capacity throughout life, which is why it was selected during primate evolution and, to a much greater extent, during the evolution of the genus Homo.”

Human brains are substantially larger than those of the great apes and those of their hominin antecedents. Although there is no single cause for this brain expansion, much of it was due to variation in growth rates relative to Homo sapiens’ ancestors, some reflecting acceleration and some the retention of rapid prenatal growth rates into infancy, as well as the retention of plasticity of individual neurons well into adulthood. Infants’ brains grow rapidly at a time when, had they followed a typical primate schedule, they would be tucked securely within their mothers’ wombs. Instead, because of their “early” birth, they experience a world of lights, sounds, smells, social others, and artifacts, which changes the nature of their cognition and, in many ways, was responsible for the evolution of the modern human mind.

5 Cognitive Neoteny: The Benefit of Neural and Cognitive Inefficiency

It is hard to overestimate the impact on the evolution of the human mind of having lots of neurons that retain their plasticity well past infancy. However, the brains of human infants are still immature and inefficient compared to those of older children and adults, with synaptogenesis not peaking until childhood and many areas of the brain, particularly the prefrontal cortex, having little or no myelin. Thus, infants and young children process information more slowly than older children and adults, and their processing is effortful in that it uses substantial portions of their limited mental resources (Hasher & Zacks, 1979). In contrast, the cognitive processing of older children and adults is more apt to be automatic, in that it requires little or none of one’s limited capacity. Despite the obvious disadvantages of infants’ inefficient neural processing, there are some benefits. According to Bjorklund and Green (1992, pp. 49–50):

Because little in the way of cognitive processing can be automatized early, presumably because of children’s incomplete myelination, they are better prepared to adapt, cognitively, to later environments. If experiences early in life yielded automatization, the child would lose the flexibility necessary for adult life. Processes automatized in response to the demands of early childhood may be useless and likely detrimental for coping with the very different cognitive demands faced by adults. Cognitive flexibility in the species is maintained by an immature nervous system that gradually permits the automatization of more mental operations, increasing the likelihood that lessons learned as a young child will not interfere with the qualitatively different tasks required of the adult.

5.1 The Adaptive Value of Neural Inefficiency

Perhaps the greatest benefit of neural inefficiency in early development is in terms of plasticity, the ability to change. As we saw in the previous section, human neurons retain their ability to change well into adulthood (neuronal neoteny). However, this does not mean that plasticity is infinite. Rather, experiences early in life result in strengthening connections among some sets of neurons and weakening or eliminating connections among others. The end result is a reduction of plasticity. However, because neuronal processing is relatively inefficient during the first 2 years of life, high levels of plasticity are retained. Because of this, the effects of a deleterious early environment can be reversed should circumstances change.

Such plasticity has been repeatedly demonstrated in nonhuman animals (e.g., Suomi & Harlow, 1972) and human children (e.g., Beckett et al., 2010; Nelson et al., 2007; Troller-Renfree et al., 2018). For example, children who spend their early months in understaffed, neglectful institutions display signs of neurological, social, and intellectual deficits that tend to persist into adolescence (e.g., Beckett et al., 2010; Mackes et al., 2020; Merz et al., 2016). Such psychological effects are mirrored by differences in brain functioning. According to developmental neuroscientist Charles Nelson (2007), “many forms of institutional rearing lack most elements of a mental-health-promoting environment. As a result, the young nervous system, which actively awaits and seeks out environmental input, is robbed of such input … institutionalization appears to lead to a reduction in cortical brain activity … and to dysregulation of neuroendocrine systems that mediate social behavior” (p. 16). However, if children are removed from such institutions before 18–24 months of age, they often display substantial recovery of social, emotional, and intellectual functioning. In contrast, children who remain in institutions much past their second birthdays are less apt to show recovery of typical psychological functioning (e.g., Beckett et al., 2010; Merz & McCall, 2010; Nelson et al., 2007).

The results of institutionalization studies clearly show that children can rebound from the negative effects of social deprivation if they experience a supportive environment beginning before about 18 to 24 months of age. Some researchers have proposed that human development is highly canalized during the first 18 to 24 months of life, meaning that children follow the species-typical path “under a wide range of diverse environments and exhibit strong self-righting tendencies following exposure to severely atypical environments” (McCall, 1981, p. 5). Although infants may be negatively affected by early neglectful environments, there is a tendency to return to a course of normalcy when they experience more supportive conditions. As infants’ brains mature, the degree of plasticity declines, making it more difficult to reverse the effects of a maladaptive environment. Eighteen to 24 months also corresponds to a time when the rate of brain growth begins to slow (Leigh, 2004; Matsuzawa, 2001) and when children’s cognitive abilities undergo substantial changes (e.g., the onset of language and the transition from Piaget’s sensorimotor to preoperational periods), further suggesting that maturation-based changes in brain development and organization are responsible, in part, for the reduction in plasticity at this time.

5.2 The Adaptive Value of Poor Memory

At the core of cognition is memory, and there is an extensive literature demonstrating age-related differences in memory from infancy through adulthood (see, e.g., Bauer & Fivush, 2014). Young infants demonstrate memory mainly through perceptual phenomena (e.g., showing dishabituation to novel stimuli) or conditioning. For example, Rovee-Collier and her colleagues developed the conjugate reinforcement procedure, in which a ribbon is tied to an infant’s ankle and then to a mobile hanging over the infant’s crib. Infants learn that their leg movements cause the mobile to move, and they demonstrate this memory by kicking when they are later placed in the crib even though the ribbon is no longer tied to the mobile (see Rovee-Collier & Cuevas, 2009). In one set of experiments, Rovee-Collier and her colleagues (1992) showed that 2-month-old infants remembered the connection between kicking and the movement of the mobile, but only when the crib liner was the same during both acquisition and testing. Rovee-Collier and Shyi (1992) proposed that this reflects an extreme dependency on context in young infants; given infants’ poor inhibitory abilities (e.g., Baird et al., 2002; Diamond, 1985; Holmboe et al., 2008), such dependency on context may prevent them from retrieving previously learned memories (actions) in “inappropriate” situations. As infants gain more experience with their physical and social worlds over the course of the first year, they become less dependent on context when remembering (Learmonth et al., 2004; Rovee-Collier & Cuevas, 2009), such that experiences in one context can be usefully applied in similar contexts. According to Hartshorn and her colleagues (2004, p. 76), “As the physical world of the developing infant progressively expands and the infant’s niche also changes, the behavioral solutions to problems that characterized the relatively static habitat and niche of the younger infant must also change or lose their adaptive utility.”

As children acquire language, they begin to recall information according to scripts, schematic organizations of real-world events in terms of their causal and temporal characteristics (Nelson, 1996). Children learn what usually happens in a situation (e.g., eating breakfast in the morning) and remember novel information in the context of these familiar events (see Bauer, 2007; Fivush et al., 1992). However, 2-year-old children often show an over-reliance on scripts, remembering only script-consistent facts and failing to remember novel experiences (e.g., Fivush & Hamond, 1990). Rather than being maladaptive, Nelson (1996, 2005) suggested that young children’s reliance on scripts helps them predict the probability of events in the future. According to Nelson (1996, p. 174):

Memory for a single, one-time occurrence of some event, if the event were not traumatic or life-threatening, would not be especially useful, given its low probability. Thus, a memory system might be optimally designed to retain information about frequent and recurrent events—and to discard information about unrepeated events—and to integrate new information about variations in recurrent events into a general knowledge system.

Similarly, Rovee-Collier and Giles (2010) argued that infants’ generally poor long-term memory reflects “rapid forgetting… an evolutionarily selected survival-related strategy that facilitates young infants’ adaptation to their rapidly changing niche and enables them to shed the excessive number of recent, rapidly formed associations that are potentially useless, irrelevant, or inappropriate” (p. 203) (cf., Bjorklund & Green, 1992). Rovee-Collier proposed that the first 9 or 10 months of life are a time of “exuberant learning” accompanied by rapid synaptogenesis and pruning.

There has also been some speculation that young children’s limited working memories may facilitate the initial acquisition of language. For example, some researchers have shown that different sensory systems develop at different times, coordinated with sensory experience, so that the development of one sensory system does not interfere with the development of others (Turkewitz & Kenny, 1982). Newport (1991) made an analogous argument for the early stages of language acquisition, proposing that young children’s limited working memories simplify the body of language they process, which makes the complicated syntactic system of any human language easier to learn. Newport developed a computer simulation that varied how much the program could keep in memory at any one time, equivalent to varying the size of a child’s short-term store. She reported that restricting the program’s memory resulted in early deficits in language learning (for instance, whole words were often lost), but that word endings denoting verb tense and plurals were more likely to be retained. Newport concluded, “overall, then, a learning mechanism with a restricted input filter [smaller short-term memory] more successfully acquired a morphology [syntactic structure]; the same learning mechanism with a less restricted filter [larger short-term memory], or with no filter at all, entertains too many alternative analyses and cannot uniquely determine which is the better one” (p. 127). (See also Elman [1994], who reached a similar conclusion using a very different type of computer simulation.) Experimental support for Newport’s and Elman’s hypotheses comes from research demonstrating that adults learn an artificial grammar faster when presented with smaller rather than larger units of the language (Kersten & Earles, 2001).
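
Newport’s simulation itself is not reproduced here, but a toy sketch can convey the “less is more” logic. In the miniature language below, each word pairs one of many stems with one of a few suffixes. A learner whose memory “window” holds only one morpheme retains isolated pieces, among which the few frequent suffixes quickly recur; a learner whose window holds the whole word stores unanalyzed forms, almost none of which recur often enough to stand out. All names, numbers, and the acquisition threshold are illustrative assumptions, not Newport’s parameters.

```python
# A toy illustration of the "less is more" idea; NOT Newport's actual model.
# Words in a miniature language pair a stem (many, individually rare) with a
# suffix (few, highly frequent). A learner whose memory window fits a whole
# word stores it unsegmented; a learner whose window fits only one morpheme
# samples a single piece. Units recurring at least `threshold` times count
# as "acquired." All parameters here are arbitrary illustrative choices.
import random
from collections import Counter

random.seed(1)
STEMS = [f"stem{i}" for i in range(50)]   # open class: 50 different stems
SUFFIXES = ["PAST", "PROG", "PLUR"]       # closed-class morphology: 3 endings

def learn(window, n_words=500, threshold=15):
    """Return the set of stored units heard often enough to be 'acquired'."""
    memory = Counter()
    for _ in range(n_words):
        stem, suffix = random.choice(STEMS), random.choice(SUFFIXES)
        if window >= 2:
            memory[stem + "+" + suffix] += 1            # whole word, unanalyzed
        else:
            memory[random.choice([stem, suffix])] += 1  # one morpheme only
    return {unit for unit, count in memory.items() if count >= threshold}

print("small window:", sorted(learn(window=1)))  # the three suffixes emerge
print("large window:", sorted(learn(window=2)))  # typically nothing recurs enough
```

The asymmetry mirrors Newport’s report in miniature: the restricted learner loses whole words but reliably retains the recurring endings, whereas the unrestricted learner faces too many distinct whole-word analyses for any morphological regularity to emerge.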

The research examined in this section may not reflect neoteny in a literal sense: inefficient neural processing and limited memory abilities are not features of juvenile ancestors that have been retained in modern adults. They do, however, reflect the consequences of prolonged neural maturation, and they are examples of what is usually thought of as immature or poor cognition that actually may have adaptive value for infants and children at a particular time in development. Natural selection has made use of infants’ immaturity to help them develop into children, and later adults, who can function well in their communities (Bjorklund & Green, 1992).

6 Conclusion

Our ancient ancestors evolved to become the species we are today. But each of our ancestors also developed, and the forces of natural selection operated as potently, if not more so, on the early stages of development as on the later stages. Modifications of development, including changes in the rate and timing of developmental milestones (heterochrony), had an enormous impact on the evolution of many species, humans included. Although there is ample evidence of both heterochronic acceleration and retardation in Homo sapiens, neoteny can be seen as the source of many of our species’ unique features. This may be most clearly seen in infancy, from birth to weaning, with neotenic changes in infants’ physical characteristics, in the rate of brain growth, and in neural plasticity fostering babies’ survival and transforming the nature of their cognitive and social functioning to serve as the foundation for the modern human mind.