
There have been many attempts to reconstruct the behavior and ecology of our earliest ancestors. The most common of these, and the one most widely accepted today, is the “Man the Hunter” hypothesis. Cultural anthropologist Laura Klein expresses the current situation well: “While anthropologists argue in scientific meetings and journals, the general public receives its information from more popular sources … In many of these forums, the lesson of Man the Hunter has become gospel” (2004:10). However, this theory of early hominin behavior is still widely debated within the anthropological community and, as we will show, the evidence to support it remains controversial.

Raymond Dart launched the killer ape-man scenario in the mid-twentieth century with the help of the playwright Robert Ardrey and his best-selling book, African Genesis (1961). Dart had interpreted the combined accumulation of fossilized long bones from savannah herbivores and damaged hominin skulls found in South African caves as evidence of an entrenched human hunting culture. The fact that the skulls were battered in a peculiar fashion led to Dart’s firm conviction that violence and cannibalism on the part of killer ape-men formed the basis from which our own species eventually evolved. In his words, early hominins were “carnivorous creatures that seized living quarries by violence, battered them to death, tore apart their broken bodies, [and] dismembered them limb from limb, greedily devouring livid writhing flesh” (Dart, 1953:201).

Man the Hunter, as a vignette of our species’ ecological status, purports to be based on science. But if Man the Hunter is truly a scientific theory, then what is the evidence? Is it really possible that smallish, upright creatures with flat nails instead of claws and relatively tiny canine teeth, with no tools or weapons for millions of years, could have been deadly predators?

Mammalian terrestrial predators—the carnivores—are taxonomically, skeletally, physiologically, and behaviorally distinct from primates. There are 7 families, 92 genera, and approximately 240 species in the order Carnivora, which includes the customarily meat-eating dogs, bears, raccoons, weasels, mongooses, hyenas, and cats. Carnivores possess four- or five-clawed digits per limb and a non-opposable and sometimes absent pollex and hallux (Nowak, 1991). Wrist bones are fused together forming the strong scapholunar bone, unlike in primates, in which the bones remain independent (Macdonald, 1984). Temporalis and masseter muscles of the carnivore jaw can exert tremendous force for stabbing prey and cutting flesh (Macdonald, 1984). Dentally, canine teeth in carnivores are strong, recurved, pointed, and elongate; premolars are adapted for cutting; molars have sharp, pointed cusps; and carnassials—a key feature of the Carnivora—are specialized shearing mechanisms composed of the last upper premolar and the first lower molar (Nowak, 1991). A few species of the Carnivora (e.g., pandas) are largely vegetarian and their molars have reverted to the grinding surfaces found in primates (Macdonald, 1984). Unlike the visual cues used by the haplorhine primates, scent is an important intraspecific communication method in carnivores; urine, feces, and exudate from odorous skin glands convey information (Macdonald, 1984). Most carnivores are solitary or associate in pairs or small groups (Nowak, 1991). Although social predators exist—such as lions, wolves, spotted hyenas, and some mongoose species—their sociality is complex and no one selective pressure is the sole force for formation of groups (Macdonald, 1984).

Many human traits, such as bipedalism, monogamy, territoriality, tool use, technological invention, male aggression, group-living, and sociality, are often linked to the perspective of Man the Hunter. However, while theories and associations of human aggressive hunters abound, they are rarely based on the following three evidentiary approaches that shed light on early hominin ecology and behavior: living primate models, extant human hunter-gatherers, and the fossil record. When we investigate these three, a different view emerges.

As we have detailed elsewhere (Hart and Sussman, 2005, 2009), the diversity of large carnivores was extensive in African prehistory. Many groups of carnivores that are now extinct (e.g., huge short-faced bears and sabertoothed cats) preyed on hominins in Africa, especially between 6 and 3.5 million years ago. Then, at about 3.5 million years ago, eight new genera of carnivores evolved to join the previous groups, resulting in potentially as many as eight to ten different species of sabertoothed cats, false sabertoothed cats, conical-tooth cats (large felids still represented today by leopards and lions), giant hyenas, large wolf-like canids, and short-faced bears roaming the same African sites where we now find hominin fossils (Treves and Palmqvist, 2007; see Fig. 3.1).

Fig. 3.1

A time span comparison of ancient African predators and hominins. (C. Rudloff, redrawn from Treves and Palmqvist 2007 expressly for this chapter)

At about 1.8 million years ago, the archaic flesh eaters, such as the sabertoothed cats, went extinct, probably due to climate change, but that left no dearth of large carnivores to prey on early hominins. Consider the fossil evidence for predation discovered so far: C. K. Brain, like Dart a South African paleontologist, started the process of relabeling “Man the Hunter” as “Man the Hunted” when he slid the lower fangs of a fossil leopard into matching punctures in the skull of a 2-million-year-old australopithecine (Brain, 1981). The paradigm change initiated by Brain continues to stimulate reassessment of hominin fossils. Dart’s initial find, the cranium of an australopithecine (called the Taung child) who died approximately 2.5 million years ago, has been reassessed repeatedly (Berger and Clarke, 1995). Drawing on new research on African crowned hawk eagle (Stephanoaetus coronatus) predation carried out in the Tai Forest, Côte d’Ivoire by McGraw et al. (2006), the Taung cranium was compared to the remains of similarly sized African monkeys eaten today by these powerful raptors. The eagles are known to clutch their prey’s head with sharp talons, leaving consistent grooved signatures on the remains. Features of this monkey prey never before described include punctures and ragged incisions in the base of the eye socket, made where the raptors ripped out the eyes of dead monkeys with their talons and beaks to get at the brains. The identification of these same singularly curious marks on the Taung cranium has provided substantiation for theories of raptor predation on this famous fossil (AP, 2005) (Fig. 3.2).

Fig. 3.2

New evidence from crowned hawk eagle studies has provided substantiation that raptor predation was involved in the Taung child’s demise

As shown in Table 3.1, the list of fossils showing evidence of predation continues to grow. Orrorin tugenensis, a hominin who lived over 6 million years ago, shows signs of having died from leopard predation. Ardipithecus ramidus remains found in the early 1990s at Aramis, Ethiopia, indicate that many predatory animals were sharing the site with these 4.4-million-year-old hominins. A review of A. ramidus noted: “We interpret the physiographical setting to have been a flat plain with little topography where scattered carcasses of medium and large mammals were ravaged by carnivores … Carnivore tooth marks scar the hominid cranial and postcranial elements and are ubiquitous on medium and large mammal bones in general” (WoldeGabriel et al., 1994:332). (The full list of predators found in conjunction with A. ramidus fossils includes crocodiles, pythons, hyenas, wild dogs, conical-toothed cats, sabertoothed cats, and short-faced bears.)

Table 3.1 Hominin fossils evidencing predation

The Dmanisi site in the Republic of Georgia entombed a 1.75-million-year-old hominin skull exhibiting punctures from sabertoothed cat fangs. At Orce, Spain, what appear to be hominin remains dated at 1.6 million years have been found in the den of an extinct hyena species. A 900,000-year-old member of the genus Homo from Olorgesailie, Kenya, shows carnivore bite marks on the browridge. Cannibalism as a lifestyle for one species of human ancestors was inferred from the disfigurement of faces and foramina magna found in a 450,000-year-old cache of Homo erectus skulls from the Zhoukoudian cave in China. The initial explanation of these strange manipulations was through the lens of the “Man the Hunter” paradigm. Nevertheless, studies by Boaz and Ciochon (2001) show that a more substantive explanation involves predation by extinct giant hyenas (Pachycrocuta brevirostris) that crunched their way into the lipid-rich brains of hominin prey. Yet another hyena casualty may be the South African “Florisbad cranium,” a late archaic H. sapiens approximately 260,000 years old. A Neanderthal skull from 50,000 years ago found at Monte Circeo, Italy, is also apparently the victim of hyena predation. While previously classified as a fatality from cannibalism, the fossil man of Monte Circeo was deposited at death in an active hyena den; the skull displays fractures consistent with hyena tooth marks, an enlargement of the foramen magnum consistent with hyena predation, and gnaw marks on the jawbone.

The world of ancient hominins was replete with large mammalian predators, raptors, and reptiles, and there are strong indications that hominins were regularly hunted. In a seemingly uninterrupted legacy of our past, considerable predation on humans can be documented in modern times outside the West. We may not have seen these figures in newspaper headlines, but 612 people were killed by tigers in the Sundarbans Delta of India and Bangladesh in the decade from 1975 to 1985 (McDougal, 1991), and over 200 humans were attacked by leopards in one Indian state between 1988 and 1998 (Uprety, 1998). Chinese biologists suspected that brown bears killed 1,500 farmers annually in the Tibetan Plateau when it was opened up to agriculture (Domico, 1988), while an estimated 3,000 individuals are seized or eaten by crocodiles each year in sub-Saharan Africa (Alderton, 1991). After researching death records, zoologist Hans Kruuk (2002) documented that wolf predation is still a fact of life in Belarus and several other Eastern European nations.

Besides the fossil record, another reliable source to consult about our evolutionary past is extant nonhuman primates. A study of predation on nonhuman primates found that 178 species of predatory animals included primate prey in their diets (Hart, 2000). These ranged from fierce, tiny birds to huge 500-pound crocodiles and scores of animals in between—tigers, lions, leopards, jaguars, jackals, hyenas, genets, civets, mongooses, pythons, Komodo dragons, eagles, hawks, owls, and even toucans. The level of predation endured by chimpanzee and gorilla populations provides another layer of authenticity to our background as prey—after all, these are our closest genetic relatives. The evidence of a gorilla meal found in leopard feces in the Central African Republic (Fay et al., 1995) proved that even the largest primates are subject to predation. Chimpanzees, despite their obvious intelligence and strength, are no match for leopards or lions; 5–6% of chimpanzee populations are consumed annually by these wild cats at two African sites where predation was studied (Boesch, 1991; Tsukahara, 1993).
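The annual figures above compound over a lifetime. As a back-of-the-envelope illustration (our own arithmetic, not a result reported in the cited studies), a constant 5–6% annual predation rate can be projected across a 20-year span:

```python
# Illustrative arithmetic only: project a constant annual predation rate
# (5-6%, as reported for chimpanzees at two African study sites) over a
# 20-year span to show the cumulative toll on a cohort.

def fraction_surviving(years: int, annual_predation_rate: float) -> float:
    """Fraction of a cohort escaping predation, assuming the annual
    rate is constant and applies independently each year."""
    return (1.0 - annual_predation_rate) ** years

for rate in (0.05, 0.06):
    surviving = fraction_surviving(20, rate)
    print(f"{rate:.0%} annual predation: "
          f"{surviving:.0%} of a cohort escapes predation over 20 years")
```

Under these simplifying assumptions, only roughly a third of a cohort would escape predation over two decades, underscoring how strong a selective pressure such predation represents.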

Our fossil relatives are said by many to have focused on the acquisition of meat to the point that all major evolutionary adaptations can be traced to that particular craving. Nevertheless, as explained previously in this chapter, hominins cannot be categorized as carnivores; we and our fossil relatives have dentition and gut tracts very like those of our omnivorous, but mainly fruit-eating, primate relatives. The inflated importance of meat in the early hominin diet may derive from reports of increased red colobus monkey hunting and meat eating observed in selected chimpanzee populations (Fourrier et al., 2008). Yet, in one study of overall chimpanzee diets, meat from mammal prey was found to make up less than 0.5% of the diet (Hladik, 1977); this was confirmed at the Gombe and Tai Forest research sites by Boesch and Boesch-Achermann (2000), who state that meat is necessary neither for survival nor for normal growth. In captivity, chimps are not meat eaters, possessing neither the oral nor dental morphology to chew meat efficiently (Milton and Demment, 1989).

Unless the meat is cooked, hominins do not possess the teeth or the gut tract to digest herbivore muscle (i.e., the raw meat that has typically been imagined as acquired through hunting or scavenging), and red meat cannot be cooked unless fire is available on demand and weapons exist to regularly kill large animals. Our teeth have remained much the same throughout the 7 million years of hominin evolution, and they are not the teeth of carnivores. Teaford and Ungar stress that “The early hominids were not dentally preadapted to eat meat—they simply did not have the sharp, reciprocally concave shearing blades necessary to retain and cut such foods” (2000:13509). Humans do not depend on their canine teeth to tear off or chew meat; like other plant eaters, we have jaws that can easily move backwards and forwards and from side to side for biting and grinding plant material, unlike carnivores, whose fixed lower jaws permit only open-and-shut movement, adding stability and strength to their bites (Nowak, 1991).

Our gut tract is also basically the same design as that of fruit-eating primates (Hladik et al., 1999). We fall into the category of unspecialized frugivores when our digestive tract and body size are compared with other primates and meat-eating mammals; this nonspecialization allows for the large variations found in human diets (Hladik et al., 1999). Cooking allows humans to masticate and digest muscle fiber, but meat could neither be cooked nor become a regular dietary component for hominins until fires could be readily ignited and controlled. The first verifiable archaeological evidence of controlled fire has been found in Israel and dates to approximately 790,000 years ago (Goren-Inbar et al., 2004); prior to that time there were only tenuous indications of fire that can be as logically explained by natural phenomena as by hominin fabrication. Dates as early as 1.8 million years ago for Swartkrans in South Africa and 1.5 million years ago for Koobi Fora and Chesowanja in Kenya have been offered as substantiation of early hominin mastery over fire for light, heat, and cooking, but exhaustive critiques have found that without the unequivocal evidence of hearths, the early sites cannot be attributed to anything but naturally ignited fires and smoldering vegetation (James, 1989; Klein, 1999).

There is little possibility that tools were available to include much meat as a dietary component before the advent of weapons. The first evidence of a javelin-like spear (which might be thrown as a hunting weapon) is 400,000 years old, but the effectiveness of the Schöningen spear against large herbivores is questionable since it has been likened to an “oversized toothpick” (Klein and Edgar, 2002:160). In fact, conservative interpretations of the archaeological evidence do not support the appearance of systematic human hunting until the fairly recent past. Klein (1999) states that true large-scale, systematic hunting may not have made an appearance in human history until 60,000–80,000 years ago. While the Schöningen spear was found with the bones of horses, many of which showed evidence of butchery, Klein and Edgar (2002) maintain that the artifacts at Schöningen demonstrate that the ancient people living there obtained some large animals, but they question whether this was a regular event. To assess how successful the alleged hunters might have been, it is necessary to place the butchered bones in the context of all the bones at the site that do not evidence human manipulation and that do evidence carnivore teeth. Precise investigations suggest that as relatively recently as 500,000 years ago, human ancestors were not obtaining large mammals very often (Klein and Edgar, 2002). The previously proclaimed “kill” sites in Africa and Europe from this period, when subjected to rigorous analysis, do not substantiate large-scale human hunting. Klein and Edgar (2002) offer Duinefontein 2 (a 300,000-year-old South African site) as an example of the misleading cues posed by human tools and animal bones lying side by side. After meticulous examination of the bones and artifacts at Duinefontein 2, it became clear to the researchers that tool marks on animal bones were rare compared to carnivore tooth marks.
These data were cross-checked against a much older South African site (Langebaanweg, dated at 5.5 million years, millions of years prior to the advent of tools) located only 36 miles away, where no hominin presence has been found. Data from the two sites are similar; carnivores were definitely eating large mammals, but ancient humans at Duinefontein 2 were having a negligible impact. Fresh examination of Ambrona and Torralba in Spain and Elandsfontein in South Africa demonstrates the same paucity of tool-marked bones and lack of real evidence for hunting (Klein and Edgar, 2002).

In the early 1980s, at Olduvai Gorge in Tanzania, Bunn (1981) and Potts and Shipman (1981) discovered large mammal bones with both carnivore tooth marks and cut marks that appeared to have been made by hominins with stone tools approximately 2 million years ago. These findings reinforced the idea that meat eating by early hominins, either from hunting or from scavenging, played an important role in human evolution. While “Man the Hunter” enjoyed popularity in the scientific community for many years, in the post-1980s period “Man the Scavenger” garnered ardent supporters. Many archaeologists have relied on taphonomy to determine whether the distribution pattern of cut marks and tooth marks could tell us if hominins were hunters, aggressive power scavengers (i.e., hominins who mobbed carnivores and stole kills), or passive scavengers, but interpretations were often poles apart. For example, Blumenschine (1988, 1995) believed the evidence showed that large carnivores had first crack at the carcasses, indicating that hominins were passive scavengers. To Domínguez-Rodrigo (1997, 1999), the distribution of cut marks implied that early hominins had first access to the bones and thus the bones were the remains of hunting or aggressive scavenging. (Of course, passive scavenging and power scavenging are not mutually exclusive, and neither are hominins as scavengers and hominins as prey.)

More recently, however, Lupo and O’Connell (2002) have reexamined all the evidence used in these earlier studies. They compared the cut marks and tooth marks on the fossil bones with data on real-life hunting and scavenging carried out by modern East African foragers, the Hadza of Tanzania. While there is some relationship between cut mark and tooth mark distribution as well as order of consumer access (humans first versus carnivores first), it is not as clear-cut as had been previously suggested, and there are a number of reasons why. First, cut marks and tooth marks have not been defined in the same way by the various researchers seeking to collect evidence for “Man the Scavenger.” Second, procedures for reporting frequencies of tooth or cut marks are not standardized. Finally, there are significant differences between patterns observed in modern control samples and those reported on the bones from fossil sites.

In light of difficulties such as these, it is apparent that verification of a “Man the Scavenger” hypothesis is elusive—not because the studies are deficient but because the situation is terrifically complex. On this subject, Klein has said: “Again we must turn to logic, supplemented in this instance by studies of recent hunter-gatherers. These studies suggest that Oldowan people [two million years ago] relied mostly on plants and perhaps on other gathered foods such as insects. In light of this, their day-to-day food quest was probably far less bloodthirsty than some popular accounts have proposed” (1999:248).

An experiment in scavenging was carried out by Louis S. B. Leakey in the 1960s when he and his son Richard tried to forcibly take kills from predators (Munger, 1971). Leakey reported that it was impossible for them to keep the lions away, and the hyenas could be held at bay for only a very short time. As the Leakeys discovered, stealing carcasses would be an extremely involved activity. The process increases the likelihood of becoming prey and so entails the need for threatening actions that carnivores and other scavengers will respect; it also requires processing the carcass while defending it and necessitates transporting the meat chunks while being pursued by irate predators and other scavengers (Treves and Palmqvist, 2007).

Another complexity not factored into “Man the Scavenger” scenarios is the reality of the condition of dead animals. DeVault et al. point out, “Contrary to widespread belief, vertebrate scavengers consume very few carcasses from predator kills because predators usually consume entire animals or guard their prey. Therefore, most scavengers rely on animal deaths due to malnutrition, disease, exposure, parasites, and accidents” (2003:226). “Man the Scavenger” retains support in the scientific arena even though hominins possess none of the internal physiology or external structures necessary to ingest putrid meat, which is what true and facultative scavengers manage to do with the anatomical equipment they possess. These species have evolved detoxifying enzymes along with bodily structures and metabolic processes that protect them from harmful bacteria (DeVault et al., 2003). As stated by Ragir et al.:

The primate digestive strategy combines a rapid passage through the stomach and prolonged digestion in the ileum of the small intestine and caecum, and this combination increases the likelihood of colonization of the small intestine by ingested bacteria that are the cause of gastrointestinal disease. Carrion is very quickly contaminated with a high bacterial load because the process of dismemberment of a carcass exposes the meat to the bacteria from the saliva of the predator, from the digestive tracts of insects, and from the carcasses’ own gut. Thus, the opportunistic eating of uncooked carrion or even unusually large quantities of fresh-killed meat by nonhuman primates or humans is likely to result in gastrointestinal illness (2000:477).

Hominins may be opportunists who eat a variety of things, but with the exception of modern Westerners and Inuits, most of humanity does not eat much meat. Inuits, who have adapted over thousands of years to the coldest Arctic climates, are among the few populations who have diets high in meat, but they traditionally consume as much blubber as flesh from marine mammals (Hayden, 1981). Modern foragers outside the Arctic, such as the hunting and gathering !Kung San of the Kalahari Desert, have a diet that consists of as little as 4% meat (Tanaka, 1976). Among traditional hunter-gatherers studied in tropical and mid-latitude habitats, the most common feeding strategy was a high daily consumption of fruit, cooked rootstocks, and occasional bulbs, shoots, and young leaves supplemented by protein from all sorts of animals—turtles, lizards, insects, birds’ eggs, and larger mammals (Vincent, 1985; Blurton Jones et al., 1989; Bailey, 1993; Blurton Jones, 1993; Sept, 1994; Hawkes et al., 1995; Marlowe, 2005; Speth, 2010).

If we were not meat eaters, then are there other fallacies that are linked to the commonly accepted “Man the Hunter” answer to our past? Were our early ancestors violent, natural born killers of other species and of their own kind?

The blood-bespattered, slaughter-gutted archives of human history from the earliest Egyptians and Sumerian records to the most recent atrocities of the Second World War accord with early universal cannibalism … and with worldwide scalping, headhunting, body-mutilating and necrophilic practices of mankind in proclaiming this common bloodlust differentiator, this predaceous habit … (Dart, 1953:201).

The quote above lays out a trail that seems to lead from meat eating to hunting, then cannibalism, and ultimately into a morass of repellent activities. But the question we keep returning to after every misanthropic description is whether views taken from a “Man the Hunter” position are supported by any scientific evidence. Often, connections to cannibalism are inferred from fossil assemblages. We find that almost all of the so-called cannibalistic sites have been lacking in evidence to support this claim. Recent, less sensational analyses have not found substantiation of cannibalism but instead find evidence of natural disasters, including predation on the hominins involved. The australopithecine caves of South Africa and the H. erectus deposits in the Zhoukoudian Cave were thought to have been scenes of cannibalism but, as stated earlier, both involved the remains of hominins preyed on by large carnivores. At Atapuerca in Spain, cannibalism has been alleged to be the cause of the bone deposits in the famous “Pit of Bones” (dated at approximately 800,000 years before the present) (Mosquera Martínez, 1998). New analyses find that the hominin bone accumulations were the result of a natural catastrophic event and—while there is skepticism for such a conclusion—the site may even represent trapping of hominins by bears (Monge and Mann, 2007).

Neanderthals, in particular, have been tarnished with the stain of cannibalism almost since their fossil remains were first discovered. “As for Neanderthals, scholars in the early part of this [20th] century assumed almost routinely that they practised cannibalism, an idea that fitted the prevailing view of Neanderthals as shambling, uncultured brutes …” (Bahn, 2005:330). Trinkaus (2000) estimates that there is only one confirmed instance of violence in the Neanderthal fossil record. He noted, “The identification of traumatic injury in human fossil remains has plagued paleontologists for years. There has been a tendency to consider any form of damage to a fossil as conclusive evidence of prehistoric violence between humans …” (p. 133). As an example of what Trinkaus describes, a single Neanderthal cranium found at Monte Circeo, Italy in a “ring of stones” had been attributed to ritual cannibalism. A more recent theory, however, suggests that the “ring” was the result of a landslide; Monte Circeo was found to be a hyena den at the time the hominin bones were deposited, and damage to the single cranium is consistent with the method used by hyenas to crush skulls and extract brains (Bahn, 2005).

Accusations regarding cannibalism at the Krapina Neanderthal site in Croatia go back a full century. Neanderthal bones were first discovered there between 1899 and 1905, when crude methods were used to excavate and preserve hominin fossils. Cannibalism was the immediate explanation for the bone deposits, but wolf, bear, and hyena remains at the site also point to predators being responsible for the hominin cache (Klein and Edgar, 2002). Although media reports continue to identify the Neanderthal remains at Krapina as a confirmed “cannibal feast,” Bahn comments: “This gruesome image does not stand up to scrutiny. The bones display no evidence of the impact fractures characteristic of marrow extraction by humans. Instead, the extensive fragmentation can be explained by roof-falls, crushing by sediments, and the use of dynamite during excavation” (2005:330).

While accusations of cannibalism stretch back to Greek myths and seem to titillate the human mind, it is satisfying to find that cannibalism among humans is rare and extraordinary—in every way an exception to normal human behavior. It is prompted only by the most singular of circumstances, such as the famous instance when survivors of a plane crash in the Andes consumed their dead fellow passengers. Careful studies have found there are no reliable witnesses to ritual or habitual cannibalism, and reports of it are based on hearsay (Bahn, 2005).

In a recent volume, we have developed the hypothesis that, rather than being a predator with inherited tendencies to be excessively violent, humans evolved as a prey species (Hart and Sussman, 2005, 2009). In this theory, we propose that both nonhuman primates and humans, as well as other social-living animals, may have developed mechanisms of cooperation and sociality through natural selection. Looking at early humans as a prey species rather than as a top-level predator gives a rather different perspective to the evolution of sociality and cooperation. Independently, Treves and Palmqvist have come up with a similar conclusion: “Given the existence of numerous ambush predators between 3.6 and 1.8 ma, hominins would have experienced strong selection for efficient vigilance” (2007:370). They thus propose that early hominins “would have adopted more cohesive and calmer social organization to maintain vigilance and reduce conspicuousness to carnivores…” (p. 370). Inconspicuous groups “within which individuals cooperate in anti-predator behavior can survive under heavy predation pressure… . High levels of cooperation and reciprocity appear critical under heavy predation pressure” (p. 372).

To assess human behavior, researchers look at our primate roots, where sociality may have its origin in the general benefits of mutual cooperation, strong mother–infant bonds, and the evolution of an extended juvenile period in which developing young are dependent on other group members. Naturally occurring opiates in the brain, whose effects are not unlike the restfulness and lessening of unease attained through opium-based narcotics (but without highs, withdrawals, or addiction), may be at the core of innate cooperative social responses (Carter, 1999; Taylor et al., 2000). These could finally explain the evolution not only of cooperation among nonrelated humans or nonhuman primates but also of true altruistic behavior and general well-being. Going one step further, Hauser (2006) and Bekoff and Pierce (2009), in separate volumes, have recently provided ample evidence of a moral toolkit in the human brain, a biological mechanism for acquisition of moral rules.

In a recent review, Sussman and Garber (2011) found that diurnal primates (lemurs, monkeys, and apes) devote less than 10% of their daily activity budget to direct social interactions. The overwhelming majority of these interactions (commonly over 90%) are affiliative and cooperative behaviors such as grooming, food sharing, huddling, and alliance formation, while aggression is rare and episodic, typically accounting for less than 1% of all social interactions. Cooperative interactions thus represent the overwhelming majority of primate social interactions and form the basis of individual social bonds.

Even in species in which social interactions typically account for only 2–4% of the activity budget and adult group members are not related (such as in howler monkeys, Alouatta spp.), individuals are found to exhibit consistent partner preferences from year to year (Bezanson et al., 2002; Chapters 8 and 9, this volume). These preferences are based on patterns of spatial proximity and affiliation enabling individuals to feed together in the same food patch and to develop social and mating bonds. In chimpanzees, both adult males and females have been observed to adopt unrelated infants whose mothers had died. Care of the orphaned infant by an adult in these cases was often very costly both in time and effort. Field researchers concluded that this was a clear sign of altruism (Boesch et al., 2010).

Sussman and Garber (2011) believe that researchers need to focus on the benefits of cooperation and mutualism in understanding the evolution of primate sociality. Several recent studies of primate social behavior have highlighted the role of cooperation and affiliation in determining the benefits to individuals of forming groups or subgroups of particular size and composition (cited in Sussman and Garber, 2011). Cooperation and affiliation represent behavioral tactics that individual group members can use to obtain resources, maintain or enhance their social position, or increase their reproductive opportunities.

Looking at physiological mechanisms that might relate to cooperative behavior, researchers have identified a set of neuroendocrine mechanisms in humans that may lead to cooperation among related and nonrelated individuals. In experiments using functional MRI, mutual cooperation has been associated with consistent activation in two areas of the brain (specifically the anteroventral striatum and the orbitofrontal cortex, or OFC) that have been linked with reward processing. Rilling et al. (2002) and Rilling (Chapter 17, this volume) have proposed that activation of this neural network positively reinforces cooperative social interactions. Even more compellingly, the strength of the neural response increases with the persistence of mutual cooperation over successive trials; the effect is therefore cumulative and self-reinforcing. Activation of the brain’s reward center may account for why we tend to feel good when we cooperate. Another area of the brain, the dorsolateral prefrontal cortex (DLPFC), is involved in the exertion of cognitive effort to overcome prepotent response tendencies. This became evident in recent experiments related to cheating, in which Rilling et al. (2007) and Rilling (2008; Chapter 17, this volume) found that most subjects activated the OFC when choosing to cooperate but activated the DLPFC when defecting. This suggests that cooperation was the prepotent emotional response tendency and that cognitive effort was required to override this tendency and cheat. However, subjects who scored highest on a measure of psychopathic personality showed a pattern of overriding the prepotent emotional response tendency; noncooperation thus appeared to be a function of psychopathy.

Both of the above-mentioned locations in the brain linked with reward processing are rich in neurons that respond to dopamine, the neurotransmitter known for its role in addictive behaviors. The dopamine system evaluates rewards—both those that flow from the environment and those conjured up within the brain. When the stimulus is positive, dopamine is released. In experiments with rats in which electrodes are placed in the anteroventral striatum, the animals continue to press a bar to stimulate the electrodes, apparently receiving such pleasurable feedback that they will starve to death rather than stop pressing the bar (Angier, 2002). Therefore, it appears that in some ways we may be wired to cooperate with each other (Angier, 2002:24).

Another physiological mechanism related to friendly affiliation and nurturing is the neuroendocrine circuitry associated with mothering in mammals. Orchestrating the broad suite of these bio-behavioral feedback responses is the hormone oxytocin (OT). OT has been related to every type of animal bonding imaginable: parental, fraternal, sexual, and even the capacity to soothe oneself. It has been suggested that although OT’s primary role may have been in forging the mother–infant bond, its ability to influence brain circuitry may have been co-opted to serve other affiliative purposes, allowing the formation of alliances and partnerships and thus facilitating the evolution of cooperative behaviors (Angier, 1999; Carter, 1999; Taylor et al., 2000; Carter and Cushing, 2004; Young et al., 2005). In humans, OT has also been linked with increased trustworthiness (Kosfeld et al., 2005) and with the reduction of stress and anxiety (Kirsch et al., 2005).

Studies on cotton-top tamarins reveal other hormonal mechanisms critical to cooperation and affiliative behavior (Ferris et al., 2001; Snowdon, 2003; Ferris et al., 2004; Lazaro-Perea et al., 2004; Snowdon et al., 2006; Snowdon and Cronin, 2007; Chapter 18, this volume). In these small South American monkeys, males and older siblings provide essential infant care. Elevated levels of the hormone prolactin, usually associated with lactation, may be the impetus behind the maternal-style caregiving exhibited by males and siblings. Levels of OT and prolactin have also been found to correlate with the amount of friendly social behavior between adults. Experiments by Snowdon et al. (2006) and Snowdon (Chapter 18, this volume) indicate that high levels of affiliative hormones could result in good-quality social interactions, suggesting a reward system for positive behavior.

Many cooperative behaviors observed in primates can be explained by individual behaviors that benefit several group members (Clutton-Brock, 2002; Silk, 2002; Silk et al., 2003; Sussman and Garber, 2004, 2011). Coordinated behaviors such as resource or range defense, cooperative foraging and food harvesting, alliance formation, and predator vigilance and defense can be explained in terms of immediate benefits to both the individual and other group members. Even if the rewards for these behaviors are low-level, we should expect cooperation to be common. Thus, many types of social interactions may be best understood in terms of a non-zero-sum game with multiple winners. Low-risk coalitions in which all participants make immediate gains are widespread in primates (Watts, 2002; Sussman and Garber, 2011) and may explain why nonhuman primates live in relatively stable, cohesive social groups and solve the problems of everyday life in a generally cooperative fashion. Charles Darwin had this idea long before there were scientific studies of animal behavior, primatology, or cooperation, noting that natural selection would opt for “the feeling of pleasure from society” (1874:102).
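The logic of a non-zero-sum game with multiple winners can be made concrete with a toy payoff matrix. The sketch below is purely illustrative, with hypothetical values not drawn from any cited study: when mutual cooperation pays each participant more than mutual defection, the joint “pie” grows and all players can win at once.

```python
# Illustrative, hypothetical payoffs for a two-player foraging game.
# Each entry maps a pair of moves to (payoff_to_A, payoff_to_B).
# Because both players can gain simultaneously, the game is non-zero-sum.
PAYOFFS = {
    ("cooperate", "cooperate"): (4, 4),  # coordinated foraging: both gain most
    ("cooperate", "defect"):    (1, 3),
    ("defect",    "cooperate"): (3, 1),
    ("defect",    "defect"):    (2, 2),  # solitary foraging: modest gain each
}

def total_payoff(move_a: str, move_b: str) -> int:
    """Joint payoff (sum over both players) for a pair of moves."""
    a, b = PAYOFFS[(move_a, move_b)]
    return a + b

# Mutual cooperation maximizes the joint outcome: the "pie" is not fixed.
assert total_payoff("cooperate", "cooperate") > total_payoff("defect", "defect")
```

In a zero-sum game, every cell would sum to the same constant, so one player’s gain would be the other’s loss; here the joint total varies with the players’ choices, which is what allows cooperation to yield multiple winners.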

Even though most nonhuman primates are highly social, investigations into the evolution of primate sociality have tended to focus on aggression and competition instead of cooperation. However, many results from behavioral, hormonal, and brain imaging studies offer a new perspective on primates and their proclivities for cooperation, sociality, and peace. For example, after 16 years of research on the behavior and ecology of wild savanna baboons, Silk et al. concluded that social integration even enhances reproductive success in female baboons: “Females who had more social contact with other adult group members and were more fully socially integrated into their groups were more likely than other females to rear infants successfully” (2003:1231). de Waal (2006) contends that chimpanzee societies emphasize reconciliation and consolation after conflict; his 40 years of observing primate behavior have documented that concern for others is natural conduct for our closest primate relatives.

It appears that social animals are wired to cooperate and to reduce stress by seeking each other’s company. If cooperation and physical proximity among group-living animals are rewarding in a variety of environmental and social circumstances, and if physiological and neurological feedback systems reinforce social tolerance and cooperative behavior, then social living can persist in the absence of any conscious recognition that material gains might also flow from mutual cooperation. Based on the latest research, friendly and cooperative behaviors provide psychological, physiological, and ecological benefits to social primates, benefits that are positively reinforced by hormonal and neurological systems.

On a more general level, in a recent volume, Weiss and Buchanan (2009) show that the conventional wisdom focusing on relentless competition as the primary mover of evolution is largely an artifact of a restricted view of evolutionary time scales. They provide ample evidence that evolution, development, and ecological interactions generally work on the basis of cooperation.

How can this research on cooperative behaviors apply to humans when we consider violence and war? There is a cultural acceptance in the West that humans are innately aggressive and that we express our aggressive feelings through violent actions. General primate physiology does not support this view; it points instead to cooperation as innate to humans. Why the disconnect? Sometimes putting things in perspective is a helpful exercise. There are more than 6 billion humans alive today, all social animals in constant interaction with other humans. The overwhelming majority of our 6 billion conspecifics live through days, weeks, even entire lives devoid of violent interpersonal conflict. This is not to naively underplay the crimes, wars, and state-level aggression found in modern times, but it places them in the domain of the anomalous.

Murder rates vary greatly from nation to nation and from culture to culture (Chapter 12, this volume). Are war, crimes, and violence the genetic, unalterable norm, or might they be specific to stresses that occur when too many people want too few resources? After an exhaustive examination of ethnographic research on modern societies ranging from nomadic foragers to urban industrialized societies, Fry (2006; Chapters 13 and 14, this volume) documented the human potential for cooperation and conflict resolution. He stresses that virtually all early studies defining man by his capacity for killing appear to be flawed: “War is either lacking or mild in the majority of cultures!” (p. 97). Counter to assumptions of hostility between groups and among individuals and recurring warfare over resources, the typical pattern is for humans to get along rather well, relying on resources within their own areas and respecting resources of their neighbors. After an examination of the primary ethnographic information on nomadic foragers, Fry found the proposition that human groups are pervasively hostile toward one another is simply not based on facts but rather on “a plethora of faulty assumptions and over-zealous speculation” (2006:183). According to Fry, “Conflict is an inevitable feature of social life, but clearly physical aggression is not the only option for dealing with conflict” (p. 22). He summarized his findings by acknowledging the human propensity to behave assertively and aggressively but adamantly stating that just as inherent is the human propensity to behave prosocially and cooperatively, with kindness and consideration for others. Indeed, Fry’s work has convinced him that the very existence of human societies is dependent on the preponderance of prosocial tendencies over assertive and aggressive ones.

At another level the psychiatric research and clinical work of Cloninger (2004) has led him to the conclusion that individuals have the potential for either peaceful or violent behavior; a world view of connectedness (or cooperativeness) promotes peace, whereas separateness promotes violence. Furthermore, connectedness appears to be natural in the absence of abuse and defective development (see also Chapter 19, this volume). People are normally happy and content when they are cooperative (connected) but show hostility when they are alone and alienated.

We are not trying to ignore the role of aggression and competition in understanding primate and human social interactions. Our perspective, however, is that affiliation, cooperation, and social tolerance associated with long-term mutual benefits form the core of social group-living. Our earliest ancestors lived in a world populated by large, fearsome predators. Strong indications from the fossil record and living primate species led to the conclusion that hominins were regularly hunted and required social organization that promoted inconspicuous behaviors, minimal internal conflicts, and coordinated vigilance (Hart and Sussman, 2005, 2009; Treves and Palmqvist, 2007). What would have been the best strategy to avoid being eaten—conspicuous, violent interpersonal conflicts, or high levels of cooperation and reciprocity to facilitate as inconspicuous a presence as possible?

Are “Man the Hunter” and its associated human violence the norm or the exception? Alternatively, is “Man the Hunted”—and the necessity for cooperation and altruism leading to human well-being—a more realistic view of the origin and nature of human sociality than the old paradigm of “Man the Hunter”? These questions lead us to ponder how new scientific theories or paradigms come to be accepted or, on the other hand, ignored. Unfortunately, the answer may turn out to be much more political than scientific. In 1962, Kuhn wrote a classic book, The Structure of Scientific Revolutions. In it he argued that scientists examine the evidence related to their questions and come up with the most parsimonious explanation that fits the data and techniques available at the time. However, the evidence is also filtered through a scientist’s own background and theoretical orientation, as well as his or her world view and cultural milieu. Changing currently popular, ingrained paradigms—those that have become “conventional wisdom,” like the “Man the Hunter” theory—is very difficult, especially if the theory also fits standard cultural views of the world. Scientists, like most people, are generally conservative in their ability to adopt new paradigms.

Once a paradigm becomes established within a scientific community, most practitioners become technicians working within the parameters of the theory, rarely questioning the validity of the theory itself. In fact, even questioning the theory is often thought of as unscientific, because the new theory and the old are incompatible and the internal logic of each paradigm differs. Proponents of each paradigm often talk past one another, speaking, in effect, different languages. As Strum expressed it when she was trying to get primatologists to accept her observations that aggression was not as pervasive or important an influence on the evolution of baboon behavior as had previously been thought: “In science, according to Kuhn, ideas do not change simply because new facts win out over outmoded ones. Many more social, cultural and historical variables make up the complete picture. Since the facts can’t speak for themselves, it is their human advocates who win or lose the day” (2001:164).

So, yes, science is an accumulation of better and better evidence to fit a theory … or the discovery that the old and new evidence is better accommodated by a completely new theory. But even with new evidence and a better way of explaining it, ultimately the politics of science must take its course. It is up to the audience to weigh the evidence; discrepancies between the theories and the evidence must be evaluated. Once those discrepancies are seen to be overwhelming, the new paradigm will be accepted in favor of the old.

Science is not always truth. Science is just the best way to answer a particular question given the available evidence and technology at a particular time and place. At this time and place, we believe “Man the Hunted” as a paradigm of early human evolution best fits the currently available evidence.

There is little doubt that modern humans, particularly those in Western cultures, think of themselves as the dominant form of life on earth, and we seldom question whether that view also held true for our species’ distant past (or even for the present, outside of urban areas). Is “Man the Hunter” a cultural construction of the West? Belief in a sinful, violent ancestor does fit nicely with Christian views of original sin and the necessity of being saved from our own awful, yet natural, desires. Other religions do not necessarily emphasize the ancient savage in the human past; indeed, many modern-day hunter-gatherers, who lived as part of nature until recent times, hold supernatural beliefs in which humans are a part of the web of life, not superior creatures who dominate or ravage nature and each other.

Think of our ancestors as prey, and you put a different face on our past. The shift forces us to see that for most of our evolutionary existence, instead of being violent or predaceous, we needed to live in groups (like most other primates) and work together to avoid predators. An urge to cooperate can thus be seen clearly as a functional tool rather than a Pollyannaish nicety, and deadly competition among individuals or nations as highly aberrant behavior, not a hard-wired survival technique. Our earliest evolutionary history as prey suggests that we should be able to take our ancestral tool kit of sociality, cooperation, interdependency, and mutual protection and use it to make a brighter future for ourselves and our planet.

We evolved as a mainly plant-eating species that also ate some animal protein collected opportunistically. But this latter activity made us neither a predator nor a scavenger. We hunted but were not hunters; we may have scavenged but were not scavengers. We are neither naturally aggressive hunters and killers nor always kind and loving; humans have the capacity to be both. It is what we learn, our life experiences, our world view, and our culture that have the greatest influence on our behavior, even on how we react to stress. That is exactly why it is necessary to comprehend that we have not inherited a “propensity” to kill derived from our hunting past. We are no more born to be hunters than to be gardeners, no more inherent killers than angels. We are, for the most part, what we learn to be.