The modern human pattern of growth and development evolved over the course of several million years and is distinct in several respects from that of living apes (Bogin 1988, 1997, 2006; Bogin and Smith 1996; Robson et al. 2006; Schultz 1960; Watts 1986). In general, nonhuman primates are considered to have four major life-history stages: infancy, the juvenile period, a short adolescent period, and the adult stage (see discussion below). However, modern humans have “inserted” an additional stage: early childhood, which falls before the juvenile period, and they have greatly elongated their adolescent stage (Bogin 1999, 2006) (Fig. 1). This means that sexual and physical maturation is substantially delayed to the last stage of the growth period—adolescence—and the whole growth period is extended, taking almost twice as long as that seen in extant apes (Bogin 1999; Schultz 1960).

Fig. 1

Length of life history stages in apes and modern humans. Note that apes lack an early childhood stage. (modified from Bogin 2003)

The focus of this special issue is on the middle childhood stage of growth and development (a term used by developmental psychologists, who recognize early, middle, and late childhood). In broad comparative ethological terms, the middle childhood stage begins after brain growth is complete; during this stage, individuals are not completely dependent on their parents but have not yet reached puberty (Bogin and Smith 1996; Leigh 2004; Pereira and Altmann 1985). In the comparative primatological literature this stage is generally called the juvenile stage (as used in the previous paragraph) (e.g., Bogin 2003). Higher primates, particularly apes, are differentiated from other mammals by having a prolonged juvenile period, which is thought to have evolved by means of natural selection, allowing more time for social learning and cognitive development (Bogin 2006; Kaplan and Robson 2002; Kaplan et al. 2000; Pagel and Harvey 1993; Pereira and Altmann 1985). Thus, apes and modern humans share the trait of a long juvenile period, but the addition of the early childhood stage and the elongation of the adolescent stage in the program of human growth and development have changed how the middle childhood period relates to the infancy and adult periods. The goal of this paper is to explore the evolution of the modern human pattern of growth and development, with a focus on the period of middle childhood in Homo erectus and more recent hominins. A secondary goal is to review how closely the overall pattern of Neandertal growth and development, including the early childhood and adolescent stages, matches that of modern humans, in order to assess when the modern pattern of growth and development evolved.

Definitions of the Stages of Growth and Development

As outlined above, the pattern of growth and development in modern humans differs from that of higher primates principally by possessing an early childhood stage and an elongated adolescent stage, resulting in a longer period of preadult growth, combined with high rates of postnatal brain growth (Bogin 1999). Before we can explore the evolution of these stages in detail, we must clearly define what the stages are. As Dean and Smith (2009:102) note, mammalian life stages can be defined in a number of different ways. We choose to follow Bogin (2006:205) in dividing human life history into five stages because they are “biologically definable and meaningful” and because “each stage encompasses a set of biological and behavioral traits that define” them. Such an approach does not ignore life history events like age at maturation, birth rates, interbirth intervals, and age at death (Leigh and Blomquist 2007). Furthermore, Bogin’s stages can be cross-referenced to skeletal/dental landmarks, thereby providing a starting point for organizing and analyzing the immature fossil record that consists of bones and teeth. Finally, it also allows us to examine the nature of the juvenile stage (or middle childhood) by assessing those stages that precede and follow it and to explore evolutionary explanations for any differences (e.g., embodied capital model, grandmother hypothesis; Hawkes and Paine 2006). That being said, it is still extremely challenging to develop precise descriptions of the different stages of growth and development in modern humans because different scholars use different “landmarks,” leading to differing opinions on when the different stages start and stop. The task of developing precise definitions of each stage of growth is made even more complicated when one attempts to map these stages onto our closest primate relatives, the African apes, in order to develop models of how the patterns of growth and development may have evolved from the common ancestor we share with the chimpanzee and gorilla. However, such definitions are a critical starting point prior to assessing the fossil record. The objective of this section is to develop a consensus stage model to form the basis for the consideration of the evolutionary changes that occurred over the past several million years.

Infancy

Infancy in all mammals is described as the postnatal period during which the offspring is wholly dependent on the parents, usually the lactating mother, for food and protection (Bogin 1999). In modern humans, a neonatal period lasting for the first 28 days of life is sometimes identified, and rapid prenatal brain growth rates are maintained for the first year of life (Dienske 1986; Rosenberg 1992; Smith and Tompkins 1995). The extension of fetal brain growth rates allows modern human females to give birth to infants with relatively small brains (compared with the adult outcome) as an adaptation to locomotor and gestation-length constraints, and the high postnatal growth rates enable us to develop our large brain (Leigh 2004; but see Robson et al. 2006).

The key trait marking the end of the infancy period for both apes and humans is weaning. In apes weaning takes place at about 4 years of age (Robson and Wood 2008), and in humans (worldwide average) at about 3 years of age (Bogin 2003; Robson and Wood 2008). In apes, weaning coincides with the achievement of 90% of brain growth and with the eruption of the first permanent molar, whereas in modern humans these events occur well after weaning (Bogin 2003; Robson and Wood 2008).

Early Childhood

Following infancy, modern humans experience the childhood stage (called “early childhood” here to differentiate it from middle childhood) not seen in apes. During early childhood, high rates of brain growth are maintained but only moderate somatic growth occurs (Bogin 2003, 2006). During this stage, the human child is dependent on its parents for extensive food supplementation, not only to support the high rate of brain growth but also because the digestive system and dentition are not yet mature enough to process adult foods (Bogin 1999, 2006; Leonard and Robertson 1994, 1997; Sellen 2006).

Middle Childhood

The end of the early childhood phase, and hence the beginning of the juvenile, or middle childhood, stage, is the subject of some debate. One marker often used is the cessation of brain growth (in weight), which, Bogin (2003) has argued, occurs by about 7 years of age. However, while this age certainly marks the attainment of the majority of brain growth, growth actually continues slowly into the teenage years (Robson and Wood 2008). Hence, some scholars argue that a better marker is the achievement of 90% of adult brain growth, which occurs by about 5 years of age (Robson and Wood 2008). Another marker is the eruption of the first permanent molar (which coincides with the end of infancy and the beginning of the juvenile stage in apes), which occurs between 4.7 and 7 years of age in humans (Robson and Wood 2008). Finally, much of the anthropological/psychological literature cites the ages of 5 or 6 as the beginning of the middle childhood stage (e.g., White 1996). Thus, a range of estimates for the beginning of middle childhood would be from 5 to 7 years of age. Given the lack of consistency in these markers, using a compromise estimate of about 6 years to mark the beginning of the middle childhood/juvenile stage is a useful starting point for assessing fossil material.

The juvenile, or middle childhood, stage starts directly after weaning in apes, and after the early childhood stage in modern humans. By the beginning of the middle childhood stage, in both apes and modern humans, brain growth is mostly complete, the first molars have erupted, and individuals are not completely dependent on their parents but have not yet reached puberty (Bogin and Smith 1996; Leigh 2004; Pereira and Altmann 1985). Higher primates, particularly apes and modern humans, are differentiated from other mammals by having a prolonged juvenile period, which is thought to have evolved by means of natural selection, allowing more time for social learning and cognitive development (Bogin 2006; Campbell 2006; Kaplan and Robson 2002; Kaplan et al. 2000, 2003; Pagel and Harvey 1993; Pereira and Altmann 1985; but see Blurton Jones 2006). In modern humans the middle childhood stage can be described as the “entry into the age of reasoning” (White 1996:17), during which important cognitive developments occur and new behaviors emerge (Hochberg 2008; White 1996).

The end of the juvenile period is generally considered to be the onset of sexual maturation, most obviously marked by menarche in females (e.g., Bogin 2003; Pereira and Altmann 1985). Sexual maturation occurs at around 11 or 12 years of age in modern humans (Bogin 2003) and between the ages of 7 and 11 in the different species of African apes (Leigh 1996; Strier 2007). Thus, the middle childhood/juvenile period lasts between 5 and 6 years in modern humans and between 3 and 7 years in the African apes, with a midpoint of about 5 years.
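
These durations follow arithmetically from the bracketing markers cited above. As a simple worked check, using only the approximate ages already given in this section:

$$\text{modern humans: } 11\text{--}12\ \text{yr (menarche)} - \sim\!6\ \text{yr (onset of middle childhood)} \approx 5\text{--}6\ \text{yr}$$

$$\text{African apes: } 7\text{--}11\ \text{yr (sexual maturation)} - \sim\!4\ \text{yr (weaning)} \approx 3\text{--}7\ \text{yr, midpoint} \approx 5\ \text{yr}$$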

Adolescence

One commonly used definition of the adolescent stage is the period of life beginning with sexual maturity and ending with the cessation of somatic growth and the attainment of adult size (e.g., Watts 1985). Bogin’s (1999, 2003) definition explicitly includes a growth spurt. Although African apes do demonstrate a growth spurt in body mass (e.g., Leigh 1996), Bogin appears to consider this body-mass spurt insufficient to identify an adolescent stage in our primate relatives (see Bogin 2003: Figure 2.7). However, there is definitely a delay between menarche and the cessation of somatic growth in African apes, leading other researchers to argue that the adolescent stage should be recognized (e.g., Leigh 1996; Pereira and Altmann 1985; Watts 1985). In this paper we will follow the consensus opinion that the adolescent stage should be recognized in African apes, although we agree that adolescence in modern humans is distinct because it is delayed and of longer duration than in the great apes (Bogin 1988; Smith 1993; Tanner 1990; Watts 1985) and involves a consequent delay in reproductive maturity. The adolescent stage is also notable in modern humans because it includes a very pronounced and delayed growth spurt (Leigh 1996:455). The adolescent growth spurt involves a rapid increase in the rate of growth soon after the eruption of the second molar, resulting in an increase in linear dimensions of most of the skeleton, including the face (Bogin 2003). Although both human and nonhuman primates demonstrate some sort of acceleration in growth velocity during adolescence (Laird 1967; Leigh 1996), this “long absolute delay in the initiation of the human growth spurt” (Leigh 1996:455), as well as the magnitude and late timing of the human growth spurt (and thus the nature of adolescence), appears to be unique to modern humans (Bogin 2003). Somatic growth in African apes terminates between 10 and 12 years of age, making their adolescence approximately 2 years long. In modern humans, adolescence begins at about 10–12 years of age and ends at about 18–20, lasting about 8 years (Robson and Wood 2008; see also Rosenfeld and Nicodemus 2003).

Adulthood

Adulthood is the period between the cessation of growth and attainment of reproductive maturity and the death of the individual. African apes are known to have a maximum lifespan of between 50 and 54 years, whereas the maximum lifespan of nonindustrialized groups of modern humans is about 85 years (Robson and Wood 2008).

This paper will review the available fossil evidence to assess when our modern pattern of growth evolved. It will focus on the genus Homo (erectus, neanderthalensis, sapiens). Studies of Plio-Pleistocene hominins (the australopithecines and habilines) have demonstrated that early hominins followed an apelike rate and pattern of growth and development, the implication being that later hominins became more similar to modern humans over the course of evolution (Anemone et al. 1996; Beynon and Dean 1988, 1991; Bromage and Dean 1985; Dean 1989; Dean et al. 2001; Moggi-Cecchi 2001; Smith 1986, 1989, 1991, 1992). Thus, H. habilis and H. rudolfensis will be excluded, since their dental development and body proportions are more australopithecine- than humanlike (Ruff 2009; Skinner and Wood 2006; Wood and Collard 1999). In fact, Wood and Collard (1999) have argued that “the habilines” should be completely excluded from the genus Homo.

Recognizing Life History Stages in the Fossil Record

Although scholars who study stages of growth and development in modern humans and other living organisms can readily observe physical and behavioral traits and easily assess the age of their study subjects, evolutionary biologists must rely on anatomical correlates to recognize the stages in fossils and on dental/skeletal criteria to determine a fossil’s developmental age.

Infancy

The age of weaning is particularly difficult to establish in the fossil record. However, the increasing use of isotopic analysis, particularly of δ15N values, holds some promise in this regard (e.g., Bocherens et al. 2001). Allometry can be used to predict weaning age on the basis of its association with body size (Harvey and Clutton-Brock 1985), and patterns of lesions generally associated with the systemic stress that accompanies weaning (such as linear enamel hypoplastic lesions) can be examined on the skeleton (Dean and Smith 2009; Katzenberg et al. 1996).

Early Childhood

The pause in somatic growth experienced in early childhood may be indicated by alterations in the pattern of dental development (Mann et al. 1996), while an increase in the final size of the adult brain is likely to be the product of high rates of brain growth that occur during this period (Bogin 2003) and the absolute extension of the time for brain growth (ca. 4 years in apes and ca. 6 years in modern humans). If it is possible to determine the age of weaning, the continuation of brain growth after the age of weaning would also be evidence for the early childhood stage.

Middle Childhood

Middle childhood is somewhat difficult to identify accurately in the fossil record. Apart from the eruption of M1, which marks the beginning of the middle childhood/juvenile stage, perhaps the best method is by exclusion: individuals in this stage will not demonstrate evidence of the brain growth that is characteristic of infancy and early childhood, nor will they show evidence of the adolescent growth spurt. It is possible that they will demonstrate evidence of stress and/or pathological/traumatic lesions that are associated with the initial adoption of adultlike behaviors.

Adolescence

The key feature of adolescence, the growth spurt, can be recognized on the skeleton in a variety of ways. One is the plotting of long-bone growth against dental age for samples of individuals (Nelson and Thompson 2005). Others are the relationship between bone size and epiphyseal fusion (Smith 1993) and the proportional relationships among different skeletal elements (Anton and Leigh 2003).

Adulthood

The cessation of growth and shift into the adult stage is recognized skeletally by the movement of the third molars into occlusion and the closure of the epiphyses of the major long bones (Buikstra and Ubelaker 1994).

An important aspect of the analysis of any hominin fossil is the determination of its age at death. However, age determination is not necessarily a straightforward exercise. Standards exist for dental eruption and skeletal maturation in modern humans and in African apes, but we must be very careful in the simple application of these standards to fossil forms. Furthermore, ever more sophisticated aging techniques are being developed, forcing us to reconsider earlier hypotheses.

Adaptive factors that changed during the course of human evolution and that would have had profound effects on patterns of growth and development include (1) body size (see Ruff et al. 1997), (2) brain size (Ruff et al. 1997), (3) evolutionary cost/benefit trade-offs associated with delaying or extending the growth period (Bock and Sellen 2002; Brumbach et al. 2009; Hawkes and Paine 2006; Leigh 2001), (4) dietary requirements (Gurven and Walker 2006), (5) increased emphasis on time available for learning complex tasks (Bogin 2003; Kaplan et al. 2003), and (6) mortality rates (Migliano et al. 2007; Stearns and Koella 1986; Walker et al. 2006).

The Fossil Record

Homo erectus

The origin of H. erectus heralded the appearance of an adult morphology that was much more reminiscent of modern humans than that of any previous hominin. H. erectus was very close to, and possibly larger than, modern humans in terms of body size (Kappelman 1996; Ruff et al. 1997) and stature (McHenry 1991). The upper end of the range of cranial capacity recorded for this taxon (691 to 1,231 ml; Aiello and Dean 1990; Spoor et al. 2007) reaches into the lower end of the range of variation for modern humans (Aiello and Dean 1990:193). Overall limb proportions appear to lie within the modern human range of variation (McHenry 1978; Ruff 2003, 2008; Walker and Leakey 1993).

It is difficult to ascertain much about the infancy of H. erectus. Analysis of the 1.8-million-year-old Mojokerto child (see Table 1 for a listing of the subadult fossils discussed in the text), who died at about 1 year of age, reveals that its cranial capacity was 72–84% of that of an average adult H. erectus (Coqueugniot et al. 2004; but see Leigh 2006 for further discussion). This evidence suggests a pattern of relative brain growth resembling that of living apes rather than modern humans (Coqueugniot et al. 2004). A recent discovery of a 1.2-million-year-old adult female pelvis from Gona, Ethiopia, indicates that H. erectus females’ hips were wider than those of modern human females, suggesting that H. erectus infants were born with heads about 30% larger than previously calculated (Walker and Ruff 1993). Simpson et al. (2008:1090) concluded that such infants would have experienced a rate of postnatal brain growth that was intermediate between modern humans and chimpanzees. However, since the Gona pelvis may belong to Paranthropus boisei (Ruff 2010), these results must be interpreted with caution. Also, several authors (Leigh 2004; Robson et al. 2006) challenge the idea that human brain growth in infancy is distinct from that of apes.

Table 1 Important subadult fossil specimens cited in this study

Enamel defects on the canine teeth of the Nariokotome boy formed when he was between 3.3 and 4.2 years of age (depending on the aging method) and may coincide with weaning (Dean and Smith 2009). If so, this would imply an infancy in H. erectus longer than that of modern humans, falling between the weaning ages of chimpanzees (4.5 years) and modern humans (2.8 years) (Robson et al. 2006; see also Skinner 1997).

Bogin (2006) argues that childhood evolved by about two million years ago in some early member of the genus Homo. We would argue that this is unlikely, given the australopithecine-like rates of dental growth mentioned previously. However, if the weaning age of H. erectus was intermediate between that of modern humans and chimpanzees, H. erectus may have experienced a short childhood stage, perhaps lasting only 1 year, as suggested by Bogin (2006).

There is debate regarding the presence of an adolescent growth spurt, and hence an adolescent period, in H. erectus. Smith (1993) suggested that H. erectus was more humanlike in its growth and development than earlier hominins, but that it had not yet achieved the modern human rate and pattern of growth; the size and maturation of the postcranial skeleton were too advanced to have followed a modern human growth pattern. In other words, H. erectus “had not yet evolved the distinctive human pattern of suppressing growth in late childhood and subsequently ‘catching up’ through the human adolescent growth spurt” (Smith 1993:218). Tardieu (1998:163) reached a similar conclusion based on the fact that the “growth spurt in linear dimensions of the femur, characteristic of human adolescence, is shown to be associated with qualitative changes of the distal femoral epiphysis engendered by the late closure of the distal epiphysis.” In other words, particular morphological changes to the distal femur take place during the adolescent growth spurt in modern humans. Since the same morphology was found in the Nariokotome boy, she inferred “that a growth spurt had begun with Homo erectus but was probably less pronounced and of shorter duration than in modern humans” (Tardieu 1998:163).

In her analysis, Tardieu estimated that the Nariokotome boy was 15 years old at death, based on achieved stature, and that final fusion of his epiphyses would have taken perhaps three more years. Smith’s (1993) work relied on a dental age estimate of 12 years for the Nariokotome boy, derived from modern human formation times. Since those two studies, Dean et al. (2001:629) have asserted that none of the trajectories of enamel growth in apes, australopiths, or fossils attributed to H. habilis, H. rudolfensis, or H. erectus fall within their sample of modern humans and that dental development was faster in H. erectus than in modern humans. They presented a new age estimate of 8 years for the Nariokotome boy, based on dental formation times (Dean et al. 2001). Since then, the dental age estimate has been further explored by Dean and Smith (2009), who give a range of dental age estimates from 7.6 to 8.8 years, with an average of 8.2 years for this individual.

In light of the new age estimate for the Nariokotome boy, several new conclusions can be reached (see Dean and Smith 2009 for a detailed assessment of the skeletal age of WT-15000). Epiphyseal fusion in the femur takes about 3 years (Tardieu 1998), and fusion of the proximal femur is complete at about 16–18 years of age in modern humans and at 11–12 years in chimpanzees (Tanner 1990; Watts 1985). Thus, this 8-year-old boy would likely have completed femoral growth by the time he reached 11–12 years of age, as extant apes do.

In a study by Thompson and Nelson (2000), proportional femoral length (expressed as a percentage of average adult length) was plotted against dental age, using recent modern human and ape data for comparison; this led to the conclusion that the H. erectus boy was following an apelike trajectory of both dental and postcranial growth. Our conclusion differs from that of Dean and Smith (2009), who infer a longer duration of growth based on dental development, with M3 erupting at ca. 14.5 years of age. However, given the Nariokotome youth’s new dental age at death, and the fact that his femur is very close to the mean femoral length for available H. erectus adults, a shorter duration of growth is more likely. The age estimate used in our 2000 study was, like Smith’s (1993), 12 years. In Fig. 2 we have replotted the figure with the Dean and Smith (2009) age of 8 years. The new position for this H. erectus youth is still consistent with an apelike growth trajectory and is even further from the trajectory of modern humans. On the basis of both dental age and skeletal age (assessed through proportional growth), we conclude that the duration of growth in H. erectus had not yet been extended beyond what we see in living apes.

Fig. 2

Proportional femoral length versus age for male and female modern humans and gorillas, Neandertals, and early modern Homo sapiens. The position of WT-15000 is indicated by a diamond. (modified from Thompson and Nelson 2000)
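
The proportional-growth comparison shown in Fig. 2 can be sketched in a few lines of code. The Python fragment below is a minimal illustration of the general procedure only; the reference trajectories, the fossil’s femoral lengths, and all other numbers are hypothetical placeholders rather than published measurements.

```python
# Minimal sketch of the proportional-growth comparison used in Fig. 2.
# All numbers below are illustrative placeholders, not measured values.
import matplotlib.pyplot as plt

def proportional_length(juvenile_mm, adult_mean_mm):
    """Femoral length as a percentage of the taxon's adult mean."""
    return 100.0 * juvenile_mm / adult_mean_mm

# Hypothetical reference trajectories: (dental age in yr, % of adult femoral length)
human_ref = [(2, 45), (6, 65), (10, 80), (14, 95), (18, 100)]
ape_ref = [(2, 55), (5, 75), (8, 90), (11, 100)]

fig, ax = plt.subplots()
ax.plot(*zip(*human_ref), marker="o", label="modern human (illustrative)")
ax.plot(*zip(*ape_ref), marker="s", label="African ape (illustrative)")

# A fossil is placed by its dental age estimate and its femoral length
# relative to the adult mean for its taxon (placeholder values below).
fossil_age = 8.0                                # cf. the Dean and Smith (2009) estimate
fossil_pct = proportional_length(430.0, 480.0)  # hypothetical lengths in mm
ax.scatter([fossil_age], [fossil_pct], marker="D", zorder=3, label="fossil individual")

ax.set_xlabel("dental age (years)")
ax.set_ylabel("percentage of adult femoral length")
ax.legend()
plt.show()
```

A fossil point lying on the ape reference curve rather than the human one is the pattern described above for WT-15000.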

Despite the fact that the WT-15000 femur was near the end of its growth trajectory, Anton (2002) and Anton and Leigh (2003) have maintained that his face still had some growth remaining to achieve adult size, and they explored the idea of a growth spurt in craniofacial dimensions. Note that by modern human age standards, this 8-year-old H. erectus youth would be in the middle of the middle childhood stage. However, based on estimates of cranial capacity presented by Begun and Walker (1993), the Nariokotome individual had achieved about 97% of his adult brain size, which is more consistent with a modern human entering adolescence. Thus, it is very possible that while H. erectus may have experienced spurts in growth like those seen in other primates (Leigh 2004), these spurts were not delayed, as they are in modern humans, and it is possible that different parts of the H. erectus skeleton (e.g., femur vs. face vs. skull) experienced somewhat different patterns of growth, as opposed to the near-global skeletal growth spurt seen in modern humans (Bogin 2003). It is therefore likely that H. erectus grew up following a more apelike than humanlike pattern and duration of growth.

The probable insertion of an early childhood phase of growth suggests that the onset of middle childhood would have been somewhat delayed in these individuals, although the lack of a humanlike adolescent growth spurt suggests that the middle childhood stage may have ended at about the same age as in the African apes. Thus, while the total duration of growth (infancy + early childhood + middle childhood) in H. erectus was approximately the same as in the African apes, the appearance of the early childhood growth phase would have led to a shortening of the middle childhood phase and/or infancy. Although there is no evidence of a growth spurt in H. erectus, the rate of growth must have been quite high, as the final adult body size, achieved by the early teens, is comparable to that of modern human adults.

Early Archaic Homo

Archaic H. sapiens is characterized by a combination of primitive traits, such as robust cranial superstructures, and modern traits, such as a large cranial capacity (Klein 1989; Trinkaus 1982). Included in this broad category are specimens attributed to H. antecessor (Bermúdez de Castro et al. 1997) and H. heidelbergensis (Schoetensack 1908). Much of what we know about growth in these two species comes from two sites in Spain. One, Gran Dolina, dates to more than 780,000 years ago and contains specimens attributed to H. antecessor (Arsuaga et al. 1999; Bermúdez de Castro et al. 1997; Falguères et al. 1999). The other, Sima de los Huesos, dates to at least 300,000 years ago, and specimens from this site have been attributed to H. heidelbergensis (Arsuaga et al. 1997a, b).

Bermúdez de Castro et al. (1999) assessed patterns of dental development in Hominins 1, 2, and 3 from the TD6 level of the Gran Dolina site using modern human and ape dental development standards. They compared relative dental development of I2/M1, C/M2, and M3 and found that the pattern of development was more modern than apelike. Therefore, they concluded that H. antecessor followed a humanlike pattern of dental growth. According to Bermúdez de Castro et al., if one assumes that a modern dental developmental pattern is linked to the duration of growth, “the H. antecessor species would be characterized by a prolonged maturation period presumably similar to that of modern populations. Therefore, the evolution of a modern pattern of development would have taken place in the period between the late Pliocene and the late Lower Pleistocene. In this interval, the childhood and adolescence periods (with its characteristic growth spurt) would have emerged” (Bermúdez de Castro et al. 2003:262).

Similarly, Bermúdez de Castro and Rosas (2001) and Bermúdez de Castro et al. (2003) found the relative dental development of Hominin 18 from the site of Sima de los Huesos to be more modern than apelike. Using counts of surface perikymata (dental microstructures), Bermúdez de Castro et al. (2003) examined the anterior teeth attributed to H. antecessor. With the exception of the lower canine and upper central incisors, which had shorter crown formation times, the formation times of the other anterior teeth fell within the modern human range. These preliminary analyses of the H. antecessor and H. heidelbergensis specimens all pointed to delayed dental development closer to that of modern humans and Neandertals. However, in a more recent analysis, Ramirez-Rozzi and Bermúdez de Castro (2004) suggested that H. antecessor and H. heidelbergensis had shorter periods of dental growth than modern humans, based on the rate of enamel formation.

When considering these analyses, it is important to note (following Dean et al. 2001) that even if patterns of dental development appear modern, this does not mean that rates of dental growth were humanlike. Furthermore, the existence (or lack thereof) of somatic growth spurts cannot be detected on the basis of dental remains alone (e.g., Thompson and Nelson 2000). Therefore, the pattern and duration of somatic growth in Early Archaic Homo remain unknown, and more data are needed to test Bermúdez de Castro et al.’s (2003) hypothesis.

Neandertals

Neandertals are generally described as short- to medium-statured and barrel-chested, possessing large muscle attachments, large joint surfaces, robust shafts on all their bones, and anteroposterior curvature of the femur (e.g., Stringer and Gamble 1993; Trinkaus 1983). The mean body and brain sizes for Neandertals (76 kg and 1,498 ml, respectively; Ruff et al. 1997) exceed those of modern humans (data in Aiello and Dean 1990). Some of their distinctive characteristics, including distal limb shortening and large noses, have also been interpreted as adaptations to life during the most recent glacial period (Trinkaus 1981). In contrast, early modern humans were taller and had no distal limb shortening, a smaller face and nasal aperture, and a higher, more rounded cranial vault (Aiello and Dean 1990; Stringer and Gamble 1993). This contrast in morphology and the late arrival of Upper Paleolithic H. sapiens in Europe, coupled with the apparent disappearance of the distinct Neandertal morphotype, have been used to argue that these two hominins were separate species with differing patterns of growth and development. Much research has focused on rates of dental development to assess this claim; however, postcranial growth research can also contribute to this debate. Fortunately, we have a very large sample of Neandertal cranial, dental, and postcranial fossils to work with.

Dental Evidence

In recent years the answer to the question of when the modern human pattern of growth and development appeared has been sought through the examination of dental development in Neandertals as compared with modern humans. Guatelli-Steinberg (2009) provides an eloquent summary of this research, so only the highlights are provided here. Researchers have examined both patterns of dental eruption and rates of tooth formation. Work by Ramirez-Rozzi and Bermúdez de Castro (2004) examined the enamel extension rate in the anterior teeth of Neandertals and of Upper Paleolithic/Mesolithic modern humans. They noted not only that the perikymata packing patterns of the Neandertal and modern human samples differ, but also that the Neandertals formed their anterior teeth in a shorter amount of time. Further research (Guatelli-Steinberg et al. 2005), using comparative dental samples of modern humans from England, South Africa, and Alaska, found that Neandertals grew their anterior teeth faster than the Alaskan Inuit, at about the same rate as the sample from England, and more slowly than the sample from South Africa. Although perikymata periodicity varies fairly widely in modern humans (6–12 days), these authors reject an extremely low periodicity (of less than 6 days) for Neandertals, arguing that the rate of development of the Neandertal anterior dentition falls within the modern human range of variation (see also Braga and Heuzé 2007).

Turning to the postcanine teeth, work by Reid et al. (2008) revealed that Neandertal premolar enamel formation time also fell within the range of variation of their modern samples. A study by Macchiarelli et al. (2006) found that a Neandertal mandibular M1 from La Chaise had a shorter crown formation time than modern humans from South Africa, but the root extension rate led them to estimate an emergence age of 6.7 years, which falls within the modern human range of variation. In contrast, an analysis of the estimated 8-year-old Scladina Neandertal by Smith et al. (2007) found that maxillary M1 and M2 formation times were much faster in Neandertals than in modern human samples from northern Europe and South Africa. Their work on the anterior teeth supports the findings summarized above, with the exception of the lower canine, which also formed quickly. Smith et al. (2007) suggest that the M1 of the Scladina juvenile erupted before the age of 6 (results that conflict with those from La Chaise, above), that the M2 likely erupted before 8 years of age (much earlier than the 10–13 years seen in modern humans, according to Liversidge 2003), and that the crown of the third molar (three-quarters complete) was in an advanced state of development. A fast-developing M3 is also supported by work on the Lakonis Neandertal (Smith et al. 2009). Bayle et al. (2009a) examined the Roc de Marsal juvenile and found mineralization of the M1s to be advanced compared with that of the incisors, which were delayed. They also found that both the deciduous and permanent mandibular dental formation sequences fell outside their modern comparative sample.

In contrast to the studies of Neandertal dentition, research on the early modern human child from La Madeleine found a dental maturational sequence that fell within the modern human range (Bayle et al. 2009b). This finding agrees with the work of Smith et al. (2007), who found that the enamel and root formation of Jebel Irhoud 3, an approximately 8-year-old early modern human juvenile from Morocco dating to about 180,000 years ago, followed a modern human pattern of dental growth.

These recent papers support the hypothesis that at least some Neandertal teeth, particularly the second and third molars, formed faster than in some recent modern human populations. Also, work by Macchiarelli et al. (2006) indicated morphological differences between Neandertals and modern humans in the enamel-dentine junction and in the timing of peak root extension rate. Olejniczak et al. (2008) also found morphological differences in molar enamel thickness between Neandertals and modern humans. Research by Bayle et al. (2009a, b) found structural differences in the dental tissue proportions of Neandertals but not of early modern humans. Moreover, although Guatelli-Steinberg et al. (2007:83) found that Neandertal anterior teeth formed at the same rate as in some modern human groups, they maintain that the unique distribution and spacing of surface perikymata were the result of differences in enamel growth processes. The key implication of these studies for understanding Neandertal growth and development is that traditional dental aging techniques (crown and root formation) based on modern human reference samples will systematically over-age Neandertals in the middle childhood and adolescent ranges (that is, individuals whose ages are based on the development of their second and third molars), whereas the dental ages of younger individuals should be relatively unaffected.

The fact that early modern human juveniles followed a modern human pattern of dental growth makes it likely that the modern pattern of dental development evolved only recently. However, as Guatelli-Steinberg (2009:16–17) cautioned, not only does the range of variation in modern humans need to be explored more fully to test this assertion, but we also still do not fully understand the link between dental formation and life history (e.g., Robson and Wood 2008). Furthermore, as stated above, in the absence of associated skeletal referents, we do not know whether the rate of dental development is an accurate indicator of the rate of somatic growth.

Somatic Growth

Early childhood is thought to have been present in Neandertals because they show a period of stasis between the eruption of the lateral incisors and the second molars similar to the one that occurs in modern humans, corresponding to the early childhood phase (Hellman 1943; Mann et al. 1996). This period of stasis is not found in apes, indicating that the Neandertal growth period was likely extended relative to that of earlier ancestors.

Adolescent growth in Neandertals is not currently well understood. Two important specimens that shed light on Neandertal juvenile and adolescent growth are Le Moustier 1, from France (ca. 40,000 years ago), and Teshik-Tash 1, from Uzbekistan (ca. 70,000 years ago). Like the H. erectus Nariokotome boy, Le Moustier 1 and Teshik-Tash 1 both preserve skeletal as well as dental elements. Thompson and Nelson (2000, 2005a; Nelson and Thompson 1999, 2002, 2005; Thompson et al. 2003) initially calculated a dental age of ca. 15.5 years for Le Moustier 1 based on M3 formation (assuming a modern human rate of dental development), which would suggest that he was an adolescent when he died. However, Le Moustier 1’s femoral length is comparable to that of a modern 10.5-year-old boy (Thompson and Nelson 2000), an age more fitting with a position in the middle childhood stage. A fragment of the ischio-pubic ramus indicates that fusion of the ischium and pubis had been initiated but was not yet complete (Thompson and Nelson 2005b). These pelvic bones usually begin uniting in modern humans by the eighth year (Scheuer and Black 2000) but may not become completely fused until later. The proportion of adult stature achieved by Teshik-Tash 1 closely approximates that of a modern human male of 6–7 years of age (Nelson and Thompson 1999), although dental estimates range between 7 and 10 years of age (see below).

If Le Moustier 1 and Teshik-Tash 1 were following a modern human schedule for skeletal growth and were about 10.5 and 6.5 years of age, respectively, then their dental ages (15.5 and 7–10 years, respectively) were accelerated relative to modern human standards. This conclusion is consistent with the discussion outlined above suggesting that subadult Neandertals in the middle childhood or adolescent age ranges would be over-aged using modern human dental criteria. If this is indeed the case, then the points for Teshik-Tash 1 and Le Moustier 1, the two Neandertals in these developmental stages, should be shifted to the left in Fig. 2. Figure 3 is a modified version of Fig. 2, with new dental age estimates for Le Moustier 1 and Teshik-Tash 1. Le Moustier 1 is plotted with an age of 11 and Teshik-Tash 1 with an age of 8. The age-at-death estimate of 11 for Le Moustier 1 is based on a consensus of the skeletal growth age outlined above, the dental advancement hypothesis, histomorphometric analysis (Ramsay et al. 2005), and new research using synchrotron technology to examine dental microstructures (Hall 2008). The age for Teshik-Tash 1 is at the low end of most age estimates published for this specimen (e.g., 8–10 years; Coqueugniot and Hublin 2007) but is consistent with the model of dental advancement and with some of the earliest age estimates (e.g., Ullrich 1955, who estimated an age range of 7–9 years). In Fig. 3, the ages for the two youngest Neandertals remain as they were in Fig. 2 because there was apparently no dental advancement in those earlier growth phases. Figure 3, which plots the new dental age estimates for Le Moustier 1 and Teshik-Tash 1 against proportional femoral length, suggests that Neandertals underwent proportional growth of the femur in a manner similar to that of modern humans. This conclusion is quite different from the one we presented in Thompson and Nelson (2000), where we argued that Neandertals either grew their teeth faster than modern humans do or grew their femora over a longer duration (Fig. 2).

Fig. 3

Modified version of Fig. 2 using revised, lower age estimates for Le Moustier 1 and Teshik-Tash 1

The question remains: What about the duration of growth? We know that in both apes and humans, dental and skeletal growth normally ceases at approximately the same time (Anemone et al. 1996; Kuykendall and Conroy 1996; Leigh and Shea 1996; Simpson et al. 1996; Tanner 1990). Given an estimate of 11 years of age for Le Moustier 1 and the advanced state of formation of his third molars (Bilsborough and Thompson 2005), and assuming both that skeletal growth would end with the complete formation of the third molar and that Neandertals would have taken the same amount of time as modern humans to complete M3 development (ca. 2–3 years), the Le Moustier boy would have completed his growth by approximately 13–14 years of age. Thus, it is possible that growth in Neandertals terminated in the mid-teens, suggesting that the growth period had become extended in Neandertals by a few years over that of H. erectus, but perhaps not as long as in modern humans (who finish growth at ca. 18–20 years of age; e.g., Robson and Wood 2008).
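
Expressed as a simple sum, using only the estimates just cited:

$$\sim\!11\ \text{yr (age at death)} + 2\text{--}3\ \text{yr (remaining M3 formation)} \approx 13\text{--}14\ \text{yr at growth cessation}$$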

Since the middle childhood (juvenile) stage is delayed in modern humans as compared with apes, resulting in a delay in the adolescent stage with its pervasive growth spurt, it is appropriate to assess whether the adolescent stage in Neandertals resembles that of modern humans. In an earlier paper (Nelson and Thompson 2002), we used comparative data from “warm-adapted” (Khoisan) and “cold-adapted” (Inuit) modern populations to determine whether climate had an effect on long-bone growth in early modern humans and Neandertals. Our results indicated that Upper Paleolithic individuals closely followed the ontogenetic sequence of the warm-adapted population, whereas Neandertals followed that of the cold-adapted Inuit (see Nelson and Thompson 2002 for a list of specimens used). We also wanted to determine whether the adult forms of these species were produced by ontogenetic trajectories expressed evenly throughout growth or concentrated at the onset of adolescence, with its distinctive growth spurt and marked changes in proportional growth.

A consideration of femoral midshaft area as a proxy for body size and femoral length as a proxy for stature revealed that the overall relationship between body size and stature is clearly not linear (Fig. 4). In both warm- and cold-adapted populations, the long, shallow preadolescent slope flexes at the time of adolescence. This pattern can be described using loess regressions (as we did in Nelson and Thompson 2002) or using a pair of linear regressions: one for the juvenile individuals and a second for the adults. For our purposes here, the linear regressions suffice to illustrate the population differences. It would appear that significant robusticity is produced during, or perhaps immediately after, the growth spurt, when linear growth has stopped but shaft breadths (body size) continue to increase (see also Ruff et al. 1994). This finding is consistent with the observation that the majority of somatic growth takes place in the later adolescent growth stage (Bogin 1997). The Neandertal individuals appear to follow the Inuit ontogenetic trajectory, whereas the Upper Paleolithic individuals follow the trajectory of the Khoisan. Both the Inuit and Neandertals appear more robust (greater midshaft area for a given length) at any given age than the Khoisan or Upper Paleolithic H. sapiens individuals. A similar pattern holds for the tibia in Fig. 5, except that the warm/cold sample separation is clearer. Here the flex of the ontogenetic trajectory for most “cold-adapted” individuals appears to occur at a somewhat longer tibial length, and hence perhaps at a somewhat more advanced age in adolescence. The position of the early Upper Paleolithic H. sapiens individuals is harder to interpret given the paucity of juvenile specimens. However, the Upper Paleolithic adults clearly fall comfortably within the Khoisan sample (Fig. 5).

Fig. 4

Cross-sectional area at the femoral midshaft versus femoral length. This figure illustrates that the relationship between body size and stature is not linear in Inuit, Khoisan, early modern humans, and Neandertals. (from Nelson and Thompson 2002)

Fig. 5

Cross-sectional area at the tibial midshaft versus tibial length. This figure illustrates that the relationship between body size and stature is not linear in Inuit, Khoisan, early modern humans, and Neandertals. (from Nelson and Thompson 2002)

In both the femur and the tibia, the ontogenetic trajectory for robusticity clearly shows two stages. The point of differentiation between the two trajectories appears to lie at adolescence. In both cases the adult slope is much steeper than the juvenile slope, a phenomenon also reflected in the reduced r² values (Table 2). The increase in slope from juvenile to adult is more extreme in the cold-adapted Inuit population, indicating a relatively greater increase in body mass relative to stature during and following the growth spurt. Comparable results have been found for the humerus as well (Nelson and Thompson 2002). Thus, it is clear that in the postcranial skeleton an adolescent growth spurt in body mass was present in both Neandertals and early modern humans. However, given the discussion above about the termination of growth in the mid-teens, it would appear that the adolescent stage and the accompanying growth spurt were compressed in Neandertals relative to modern humans.

Table 2 Regression statistics for the regressions shown in Figs. 4 and 5
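
The two-regression description of Figs. 4 and 5 (and the slope and r² comparisons in Table 2) can be made concrete with a short sketch. The Python fragment below fits separate least-squares lines to juvenile and adult subsets and compares their slopes; the data points are invented placeholders, not the published measurements.

```python
# Sketch of the juvenile/adult two-regression description of Figs. 4 and 5.
# The data points are invented placeholders, not the published measurements.
import numpy as np
from scipy import stats

# Columns: femoral length (mm), midshaft cross-sectional area (mm^2)
juveniles = np.array([[220.0, 210.0], [280.0, 265.0], [340.0, 330.0], [390.0, 405.0]])
adults = np.array([[430.0, 520.0], [450.0, 600.0], [470.0, 690.0], [490.0, 770.0]])

for label, data in (("juvenile", juveniles), ("adult", adults)):
    slope, intercept, r, p, se = stats.linregress(data[:, 0], data[:, 1])
    print(f"{label}: slope = {slope:.2f} mm^2/mm, r^2 = {r**2:.2f}")

# A much steeper adult slope indicates that midshaft area (a body-mass proxy)
# keeps increasing after growth in length has slowed, which is the pattern
# attributed in the text to the adolescent growth spurt.
```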

Cranial Growth

An understanding of Neandertal cranial growth is very important given their very large cranial capacity. Proportional growth in linear and volumetric measurements of the cranial vault has been examined (Thompson and Nelson 2001; also see Coqueugniot and Hublin 2007; Ponce de León et al. 2008) using the same method we used to compare dental age and femoral growth (Fig. 2). The majority of cranial growth takes place within the first 8 years of a modern human child’s life, finishing during the time between the eruption of the first and second molars. Coqueugniot and Hublin (2007) suggested that the youngest Neandertals lie above the modern human range of variation, whereas Ponce de León et al. (2008) found that they were incorporated within the modern range. A close examination of the data used in these two studies suggests that the difference in conclusion lies in the modern reference samples used. Thus, a conservative interpretation of these studies would be that Neandertal proportional growth of the cranial vault followed a pattern similar to that in modern humans. However, since the end point was greater than that of modern humans, absolute vault dimensions of Neandertals were actually advanced relative to modern humans throughout postnatal growth (Thompson and Nelson 2001).

It is worth noting that the ages for the fossils used in these cranial analyses are the same as those commonly used in the literature and do not reflect the hypothesis of advancement in M2 and M3 development in Neandertals. However, since this hypothesis only reduces the ages of individuals who had already terminated brain growth, the essential conclusions would be unchanged. (Indeed, the positions of two of the older individuals, La Quina 18 and Krapina 1, would fall more in line with the rest of the distribution than they currently do; cf. Coqueugniot and Hublin 2007: Fig. 1; Ponce de León et al. 2008: Fig. 4b. See also Thompson and Illerhaus 1998, 2000; Thompson et al. 2002, 2003 for discussion of the facial growth remaining in the Le Moustier 1 individual.)

Growth Stages in Neandertals

Infancy

A number of different lines of evidence, including isotopic analysis (Bocherens et al. 2001) and allometric analysis from maternal body size estimates (Thompson 1998), point to a weaning age of 3–4 years, which is comparable to the modern human range of variation (Dettwyler 1995).

Early Childhood

The high mean adult cranial capacity of Neandertals, which actually exceeds that of modern humans, is good evidence for the early childhood period in these hominins (e.g., Bogin 2003). The evidence of growth in cranial capacity suggests that Neandertals at the younger end of the 6-to-8-year age range, and thus past weaning/infancy, still had some brain growth to achieve, although adult cranial capacity was reached by 8 years of age (Coqueugniot and Hublin 2007). Mann et al. (1996) have also suggested that the dental developmental hiatus is evidence of the early childhood stage. Proportional cranial growth was similar to that of modern humans, but the higher mean outcome suggests that Neandertals had a higher rate of brain growth during their infancy and early childhood growth stages. Patterns of eruption of the anterior permanent dentition during the early childhood stage appear to be roughly comparable to those of modern humans (Braga and Heuzé 2007; Guatelli-Steinberg et al. 2005).

Middle Childhood

If the early childhood stage ended by the age of 6 to 8, the beginning of the middle childhood/juvenile period in Neandertals would be comparable to that of modern humans. The age of termination of this period is unclear owing to the lack of available adolescent individuals. The consensus age presented above for the Le Moustier individual is 11 years, and he does not appear to have entered his growth spurt. Thus, we estimate that the middle childhood stage in Neandertals was at least as long as it is in modern humans.

Adolescence

The evidence presented above suggests that Neandertals were characterized by a short adolescent period (see also Guatelli-Steinberg and Reid 2008:248), with a very rapid growth spurt leading to the achievement of adult body size around the age of 13–14 years. The adolescent period would thus have been shorter and characterized by a higher rate of somatic growth than is the case in modern humans.

Evolutionary Summary

As outlined above, the modern human pattern of growth and development includes five stages: infancy, early childhood, middle childhood, adolescence, and adulthood. A change in the rate and amount of brain growth probably first arose in H. erectus and is seen in all subsequent hominins. The early childhood stage also probably arose first in H. erectus. Middle childhood is common to our entire ancestral lineage, but its duration possibly shortened somewhat with the insertion and subsequent elongation of the early childhood phase, beginning in H. erectus (although this would have been mitigated by the shortening of infancy through earlier weaning). In subsequent millennia, middle childhood appears to have regained its earlier length because of the delay in the initiation of adolescence. The adolescent stage appears to have been gradually lengthened and elaborated, but the full expression of an extended growth spurt and a drawn-out adolescence (including a delay in sexual maturation) appears to be unique to modern humans. As noted previously, given the mismatch among dental, postcranial, and craniofacial growth, the pattern of growth seen in H. erectus was not yet modern.

In Neandertals, all the elements of the modern human pattern of growth and development were in place, but the length and details of the growth phases in these individuals demonstrate important differences. The components of our pattern of growth and development and their accompanying details appear to have evolved in a mosaic fashion, and the total package of traits seems to have emerged relatively recently with the appearance of modern humans.

Middle Childhood in the Fossil Record: A Case Study

What can we infer from the fossil record about the life of a mid-childhood individual? Few specimens exist that can be used to address this question. However, one Neandertal individual, Le Moustier 1, probably died at the end of this stage, so his teeth and bones reveal something about his life during the middle childhood/juvenile period.

The estimated age at death for Le Moustier 1 presented above is 11 years. By this age he had already developed several skeletal characteristics enabling us to determine that he was male. His long bones were still fairly short and did not demonstrate fusion of the epiphyses, and he had not yet completed his facial growth, but he had already attained an adult cranial capacity (see Thompson 2005; Thompson and Bilsborough 1997, 2005; Thompson and Nelson 2005a, b; and other papers in Ullrich 2005a).

There are indicators that the juvenile Neandertal Le Moustier 1 was already participating in adult activities. For example, dental wear (Bilsborough and Thompson 2005; Weinert 1925) points to the possibility that this individual’s anterior teeth were beginning to be used as tools in paramasticatory activities. Also, the jaw of Le Moustier 1 had been broken but showed evidence of healing (MacGregor 1964; Thompson and Bilsborough 2005), indicating that this individual endured a sustained period during which the jaw was not fully (or normally) functional (Thompson 2005). Ullrich (2005b) noted that parietal and frontal fractures may be perimortem wounds. This physical evidence is supported by the archaeological record, which indicates that adult males and females as well as juveniles participated in hunting (Kuhn and Stiner 2006). Kuhn and Stiner (2006:958) state that both “(1) close physical contact with large prey and (2) assistance to hunters by beating the bushes or otherwise reducing prey escape routes with warm bodies” would have put them in harm’s way. This may account for the broken jaw and cranial fractures seen on Le Moustier 1 and fits with the evidence that many Neandertals suffered trauma (Berger and Trinkaus 1995; Trinkaus 1995; Trinkaus and Zimmerman 1982; Wolpoff 1999) from hunting activities (see also Thieme 1997). However, we cannot rule out the possibility that the trauma was caused by interpersonal violence.

Muscle markers on bones can also reveal evidence of activities. We know from extensive research on the upper bodies and limbs that adult Neandertals were very powerfully built (e.g., Churchill 1993, 1994; Trinkaus et al. 1994). Trinkaus (1983) notes that the pronator quadratus crest is prominent on the ulnae of adult Neandertals (see also Galtés et al. 2009), which indicates that “powerful rotation of the forearm was important” (1983:238). Also, a pronounced curvature of the radial diaphysis is an adaptation for powerful rotation of the forearm (Trinkaus 1983). According to Crompton et al. (2008:529), “a marked pronator quadratus crest, and a strongly curved radius, may both indicate powerful pronation and supination.” These authors also suggest that “Neandertals may have been close-quarter hunters, using stabbing spears” as “encounter predators” (Crompton et al. 2008:530). In Le Moustier 1, the radius already has a pronounced curve and the ulnar pronator quadratus crest is well developed (Thompson and Nelson 2005a, b), providing further evidence that Le Moustier 1 participated in adult activities.

We know that modern humans during middle childhood are no longer completely dependent on their parents for survival and are able to “provide much of their own food and to protect themselves from predation and disease” (Bogin and Smith 1996:705–706). The evidence presented above suggests that the Neandertal juvenile known as Le Moustier 1 was already engaging in adult behaviors despite being in the middle childhood stage of development—perhaps more so than an equivalently aged modern human. The fact that this individual did not live to full maturity may itself be an indication of the demanding lifeways of this population of hominins (Thompson 2005).

Neandertal Middle Childhood in an Evolutionary Context

The demands of the Neandertal economic strategy were harsh. Meat from large terrestrial game animals made up most of the Neandertal diet, and males, females, and probably juveniles were all involved in hunting (Kuhn and Stiner 2006). In recent hunter-gatherer societies, mid-childhood children and women make important contributions to the diet, but from sources (easily caught prey and plant foods) less risky than large game. In contrast, Neandertal mid-childhood children and adult females may have played a much larger role in hunting (Kuhn and Stiner 2006:958). This idea is supported by research indicating that young Neandertals had high activity levels (Ruff et al. 1993, 1994; Trinkaus 1997). Sorensen and Leonard (2001:484) maintain that Neandertals’ heavy and muscular physiques would have meant extremely high dietary energy demands, especially during colder glacial periods, requiring high foraging returns. Rapid brain growth in infancy and early childhood would also have necessitated high-energy foods (Leonard et al. 2003; see also van Schaik et al. 2006). Kuhn and Stiner (2006) point out that the demographic consequences of putting females and juveniles at risk during hunting would have kept Neandertal population numbers low (Trinkaus 1995; Caspari and Lee 2004; but see Konigsberg and Herrmann 2006 and Paine and Boldsen 2006). In addition, Kuhn and Stiner (2006:959) note that hunting is unpredictable, and this lack of consistency would “place greater periodic nutritional stress on juveniles during development, again limiting reproductive potential.” Such a harsh and unpredictable environment would favor a fast life-history strategy (Brumbach et al. 2009; Ellis 2004).

Work by Gurven and Walker (2006) speaks to the consequences of high-energy dietary requirements for growth and development. They argue that although early childhood for modern humans involves extra-maternal care and can result in a shorter interbirth interval and higher fertility for the mother, it also results in higher energetic costs to the group. The extra costs of feeding several dependent children can only be met if others help feed the young (Hawkes 2003; Lancaster and Lancaster 1983; Mace 2000; see also Reiches et al. 2009 for a discussion of energetic costs to the mother), or if subadults help provide the energy needed to fuel their own growth. Gurven and Walker (2006) examine the energetic demands on parents with multiple offspring by simulating subadult growth trajectories. There seems to be a trade-off between growing fast and maturing earlier (like apes) versus growing slower and maturing later (as in modern humans) (Bock and Sellen 2002; Brumbach et al. 2009; Janson and van Schaik 1993; Leigh and Blomquist 2007; Williams 1966; see the sketch following this paragraph). For Neandertals, with high energy needs, neither growth trajectory seems likely. To grow fast and get big early would only make sense if food was plentiful and easy to obtain, which would not always have been the case (Mellars 1996). Given the proposed Neandertal daily energy expenditure (3,000–5,500 kcal/day; Sorensen and Leonard 2001), fast early growth would have created a costly energy burden for adult Neandertals (see also Froehle and Churchill 2009). So too would having to provide food for slow-growing, later-maturing individuals. It would make more energetic sense for Neandertals to grow at a more constant rate over the early and middle childhood stages and for individuals of middle childhood age to contribute to the group’s diet (i.e., produce energy) earlier (or in greater amounts) than seen in modern hunter-gatherer groups. It may be that an extension of the duration of growth and a delay of reproduction could only occur when early adult mortality rates declined with the appearance of early modern humans. In other words, the Neandertal socioeconomic strategy required juvenile Neandertals to participate in risky adult activities early to help offset the high energy demands both of the group and of growth. These activities would have raised juvenile mortality rates, another important factor affecting life history (Migliano et al. 2007; Stearns and Koella 1986; van Schaik et al. 2006; Walker et al. 2006). Neandertal life history likely included a short early childhood phase in which brain growth took precedence over body growth, with that brain growth fueled by a high-energy diet of meat provided by the parents. This phase may have been followed by more constant linear growth during middle childhood, culminating in a rapid somatic growth spurt during a compressed adolescent stage, reaching maturity/adulthood in a shorter time frame in order to channel energy into the production of offspring as soon as possible (Charnov and Berrigan 1993).
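To make the logic of this trade-off concrete, the following minimal sketch (in Python) contrasts a stylized fast, early-maturing trajectory with a slow, late-maturing trajectory that defers growth to an adolescent spurt. This is not Gurven and Walker’s (2006) actual model; the cost function, growth parameters, and body masses below are illustrative assumptions only.

```python
# A minimal, illustrative sketch (not Gurven and Walker's actual model):
# compare where in ontogeny the energetic cost of growing up falls under
# a fast, early-maturing trajectory versus a slow trajectory with a late
# adolescent spurt. All parameter values below are assumptions.

MAINTENANCE_KCAL_PER_KG = 45.0  # assumed daily maintenance cost per kg of body mass
GROWTH_KCAL_PER_KG = 5000.0     # assumed cost of synthesizing 1 kg of new tissue

def trajectory(birth_kg, adult_kg, maturity_yr, spurt=None):
    """Body mass at each birthday from birth to maturity.

    With spurt=None, mass accrues linearly (stylized fast/ape-like growth).
    With spurt=(start_yr, fraction), that fraction of total growth is
    deferred to the years after start_yr (stylized human-like growth with
    an adolescent spurt).
    """
    total_gain = adult_kg - birth_kg
    masses = [birth_kg]
    for yr in range(1, maturity_yr + 1):
        if spurt is None:
            gain = total_gain / maturity_yr
        else:
            start_yr_, fraction = spurt
            if yr <= start_yr_:                       # slow pre-spurt growth
                gain = total_gain * (1 - fraction) / start_yr_
            else:                                     # rapid spurt growth
                gain = total_gain * fraction / (maturity_yr - start_yr_)
        masses.append(masses[-1] + gain)
    return masses

def yearly_costs(masses):
    """Mean daily kcal demand (maintenance + growth) for each year of life."""
    costs = []
    for prev, curr in zip(masses, masses[1:]):
        growth = (curr - prev) * GROWTH_KCAL_PER_KG / 365.0
        costs.append(curr * MAINTENANCE_KCAL_PER_KG + growth)
    return costs

def share_before(costs, age_yr):
    """Fraction of the total preadult energy bill incurred before age_yr."""
    return sum(costs[:age_yr]) / sum(costs)

fast = yearly_costs(trajectory(3.0, 45.0, maturity_yr=11))                    # ape-like
slow = yearly_costs(trajectory(3.0, 55.0, maturity_yr=18, spurt=(12, 0.45)))  # human-like

for label, costs in (("fast/early", fast), ("slow/spurt", slow)):
    print(f"{label}: peak {max(costs):,.0f} kcal/day, "
          f"{share_before(costs, 7):.0%} of preadult cost before age 7")
```

Under these assumed parameters, the fast trajectory incurs nearly half of its total preadult cost before age 7, when provisioning falls largely on adults, whereas the spurt trajectory defers most of its cost to ages at which the growing individual can help feed itself. The point is the shape of the comparison, not the particular numbers.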

How do these conclusions affect our interpretation of the evolution of modern human life history? Modern humans are characterized by earlier weaning, a longer preadult stage, a longer lifespan, and a longer postreproductive period than our closest primate relatives. Several hypotheses have been proposed to explain these distinct life-history traits, including the embodied capital hypothesis (Kaplan and Robson 2002; Kaplan et al. 2000, 2003) and the grandmother hypothesis (Hawkes 2006b; Hawkes et al. 1998; see Hawkes 2006a and van Schaik et al. 2006 for discussion of other evolutionary hypotheses). The embodied capital hypothesis proposes that brain size increased in tandem with longer lifespan as hominins such as H. erectus shifted their foraging strategies toward nutrient-dense foods that were high-quality but difficult to acquire, emphasizing learning and surplus production (Hawkes 2006b; Kaplan et al. 2003; Paine and Boldsen 2006). In contrast, the grandmother hypothesis highlights the role of postreproductive women in provisioning children after weaning (early childhood) so mothers could conceive earlier (shorter interbirth interval) and thereby increase the number of offspring produced over the mother’s lifespan. Key to both of these hypotheses is the assumption that juvenile (preadult) and adult mortality was low (see Blurton Jones 2006; Paine and Boldsen 2006:309; Sellen 2006:188). Contra Hawkes (2006a:71, 2006b:121), we do not assume that Neandertals lacked the potential to live long lives. Instead, we reason that the middle to late Pleistocene ice-age conditions within which Neandertals evolved and lived presented selective pressures unlike those seen in most contemporary hunter-gatherer populations. The high metabolic demands of their bodies, in conjunction with their high dependency on animal protein, resulted in a Neandertal foraging strategy that required all able-bodied individuals to participate in food acquisition. This strategy put both juveniles (as seen in the Le Moustier boy) and adults at risk. We hypothesize that this foraging strategy meant, as also posited by Kennedy (2003), that although some individuals may have survived to older ages, the median age at death was lower than that seen in living hunter-gatherer groups. Thus, while Neandertals and modern humans had a similar maximum potential lifespan as predicted by their brain-to-body weight ratios (Sacher 1959, 1975; see the worked example below), we think that many Neandertals did not reach older ages. It is more likely that the probability of living to older ages was higher in populations of early H. sapiens (sensu stricto), changing the paleodemographic profile of that species. This might help explain that species’ geographic expansion within and out of Africa during the late Pleistocene. Instead of mothers and mid-childhood offspring participating in hunting, we speculate that a change to a more broad-based hunting and gathering strategy afforded some protection to mothers and children (in middle and early childhood), which in turn paved the way for the life-history pattern we see today.
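To illustrate the lifespan prediction, the often-quoted form of Sacher’s (1959) allometric regression relates maximum lifespan L (in years) to brain weight E and body weight S (both in grams). The regression coefficients are as commonly reproduced in the secondary literature; the brain and body masses plugged in below are round, assumed figures for illustration, not measurements from particular specimens.

```latex
% Sacher's (1959) lifespan regression, as commonly quoted:
\[
\log_{10} L \;=\; 0.636\,\log_{10} E \;-\; 0.225\,\log_{10} S \;+\; 1.035
\]
% Illustrative (assumed) inputs: modern human E ~ 1350 g, S ~ 65 kg;
% Neandertal E ~ 1450 g, S ~ 78 kg.
\[
L_{\text{human}} \approx 10.84 \times 1350^{0.636} \times 65000^{-0.225} \approx 88\ \text{yr},
\qquad
L_{\text{Neandertal}} \approx 10.84 \times 1450^{0.636} \times 78000^{-0.225} \approx 88\ \text{yr}
\]
```

On these assumed inputs the two predictions are essentially indistinguishable, which is the sense in which we treat the maximum potential lifespans as similar; the argument therefore turns on realized mortality, not on potential lifespan.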

Conclusions

This review of the fossil record leads us to the conclusion that the modern human pattern of growth and development evolved in a mosaic fashion, the full expression of which is seen only in modern Homo sapiens. Infancy in apes is longer than it is in modern humans, and once weaned, apes pass directly into the juvenile/middle childhood stage with a brain that has completed its growth. In humans, infants are weaned earlier (with the consequence for the mother of shorter interbirth intervals) but are still dependent on older individuals for food and protection until brain growth is complete and they enter the middle childhood stage. This means that the infancy period in modern humans has been shortened to allow for a childhood phase in which human offspring require high-energy food to meet the needs of a rapidly growing brain. The early childhood phase is an evolutionary novelty that is not present in other primates and probably first appeared in H. erectus. Middle childhood, or the juvenile stage, is a life history stage apparently shared by modern humans and apes. Assuming that it begins with M1 eruption and/or the achievement of 90% of adult brain size, this period begins at about 4 years of age in chimpanzees and about 6 years of age in modern humans. Middle childhood ends at puberty, which occurs slightly earlier in apes than in modern humans. Perhaps the most marked changes in our program of growth and development are (a) the full expression of a long adolescent growth spurt in modern humans that affects the entire skeleton and (b) an elongation of this entire stage, further delaying sexual maturation and reproduction.

One key outcome of this review is that the juvenile period of apes and the middle childhood period of modern humans share the same general boundaries: both occur after the attainment of 90% of adult brain size and before puberty, and during both the offspring are somewhat independent of their parents. Nevertheless, the course of evolution has wrought several changes to this particular life history stage. First, in earlier hominins the overall rate of somatic growth must have been faster than ours in order to attain adult stature in a shorter period of time. The slower growth of modern humans during the middle childhood period was made possible by delaying the majority of somatic growth to the adolescent stage, resulting in a longer overall period of maturation. We fully recognize that modern humans exhibit a range of variation in the duration of life history stages, with some populations experiencing shorter or longer adolescent periods, but we argue that the possibility of a long adolescence and a substantial delay in reproduction appeared only with H. sapiens. Second, at the beginning of the middle childhood stage modern humans undergo a mid-childhood growth spurt, adrenarche (Campbell 2006), and the “5- to 7-year-old shift in cognition.” According to Bogin (1999:181), this involves “shifts in cognitive capacities, self concept, visual/perceptual abilities, or social abilities” that are found in all modern human populations, but not in other primates. We cannot be sure exactly when this shift evolved, but we predict that it was already present in Neandertals, given the apparent involvement of middle-childhood juveniles like Le Moustier 1 in hunting and other adult activities.

Third, the adolescent period in modern humans is behaviorally like the juvenile period of apes, in that individuals engage in behavior that approximates that of adults. This could represent part of an adaptive strategy that both lowered energetic costs to the group and demanded less risky behavior from individuals in the middle childhood phase. If early weaning and shorter interbirth intervals are tied to the evolution of allocare, enabling a mother to increase her reproductive fitness, the consequences must have been demographic. Increased cooperation and larger extended families (Lancaster and Lancaster 1983) might have mitigated the need for mid-childhood offspring to be involved in hunting, with adolescents taking on this role instead.

Finally, many of the cultural/behavioral characteristics that modern researchers use to define this stage are probably themselves recent developments, accompanying the increased intelligence, use of language, and more flexible behavioral repertoires that appear to have come with larger brains. Others are likely to have arisen with evolutionary changes in social organization. We do not know what the social organization of the African ape/human ancestor was, but it was likely more similar to the unimale polygyny of modern gorillas or the multimale polygyny of modern chimpanzees than to any of the social structures expressed by modern humans. Thus, the cultural/behavioral roles and expectations for both parents and children in any developmental phase would have changed substantially over time.

In sum, the broad ethological context suggests that the juvenile period of primate growth and development is equivalent to the middle childhood stage of modern human growth and development, particularly in its duration. A closer look at the evolution and characteristics of these two stages, however, suggests that they should be kept conceptually and terminologically separate. Many of the characteristics used to define the middle childhood stage (e.g., White 1996), and the ways in which this developmental stage relates to the preceding infancy/early childhood stages and the following adolescent stage, are unique to modern humans.