Introduction

Guided by theoretical rationale derived from the prey choice model (PreyCM), the relative abundances of large to small-sized prey in zooarcheological assemblages are often used as important tools to measure relative prehistoric foraging efficiency (e.g., Broughton 1994, 2002; Butler and Campbell 2004; Wolverton 2005; Codding et al. 2010; Broughton et al. 2011). The PreyCM predicts dietary choice from an array of available resources ranked on a single dimension of profitability—the post-encounter return rate (kilocalories (kcals) obtained per unit of handling time)—and is generally viewed as a robust model for predicting resource choices among contemporary foragers (e.g., Winterhalder 1981; Hawkes and O’Connell 1985; Hill et al. 1987; Smith 1991). Working under the assumption that a forager’s goal is to maximize efficiency, resources fall in and out of the diet in rank order depending on the encounter rate(s) with high-value resources. The model provides an important theoretical context for interpreting the abundances of different prey in zooarcheological assemblages because body size is routinely viewed as a proxy measure for the post-encounter return rate of prey (e.g., Broughton et al. 2011). This assumption is based on empirical research showing that large-sized animals are often, but not always, higher ranked than those that are smaller in body size (e.g., Alvard 1993; Hill and Hawkes 1983; but see Madsen and Schmitt 1998; Bird et al. 2013).
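The core logic of the model can be made concrete with a short sketch. The following is a minimal, illustrative implementation of the diet-breadth algorithm; the two prey types and all encounter rates, energy gains, and handling times are hypothetical and are not drawn from the studies cited above.

```python
# A minimal sketch of the PreyCM's diet-breadth logic. All encounter rates,
# energy gains, and handling times are hypothetical. Prey enter the diet in
# rank order (by post-encounter return rate, kcal per handling hour) as long
# as that rate exceeds the overall return rate of the diet without them.

# prey: (encounters per search hour, kcal per carcass, handling hours)
PREY = {"deer": (0.05, 45000, 2.5), "hare": (2.0, 1400, 0.1)}

def overall_rate(diet):
    """Overall foraging return rate (kcal/h) for a set of prey taken on encounter."""
    gain = sum(PREY[p][0] * PREY[p][1] for p in diet)      # kcal per search hour
    handling = sum(PREY[p][0] * PREY[p][2] for p in diet)  # handling h per search hour
    return gain / (1 + handling)

diet = []
for p in sorted(PREY, key=lambda k: PREY[k][1] / PREY[k][2], reverse=True):
    if not diet or PREY[p][1] / PREY[p][2] > overall_rate(diet):
        diet.append(p)  # take this prey on encounter

print(diet, overall_rate(diet))
# With these toy numbers both prey are taken; raising the deer encounter
# rate high enough pushes the overall rate above the hare's post-encounter
# rate (14,000 kcal/h) and hares drop out -- the encounter-rate dependence
# described above.
```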

The value of the PreyCM to zooarcheological interpretations is particularly well-demonstrated in the Great Basin where assemblages are often composed of artiodactyl and leporid remains. These prey types are commonly cast as reflecting opposing ends of the diet breadth, with artiodactyls representing the highest value prey and leporids viewed as the lowest ranked prey. Accordingly, prehistoric increases in the abundances of small relative to larger-sized prey are considered to be signs of diminishing foraging efficiency arising from reduced encounter rates with high-ranked resources and linked to resource depression and/or habitat reduction from climate change (e.g., Szuter and Bayham 1989; Janetski 1997; Cannon 2003; Byers and Broughton 2004). Conversely, decreases in the abundances of small relative to large prey are often viewed as indicators of increasing environmental productivity as encounter rates with high-value prey rise (e.g., Broughton and Bayham 2003; Broughton et al. 2011).

Despite the obvious appeal of the PreyCM as an interpretive and predictive tool, an increasing number of empirical studies show that resource choice among ethnographic subsistence hunters is not always predicted solely by energetic returns and is often based on the trade-offs between risk and energy (e.g., Winterhalder 1981; Hawkes et al. 1991; Smith 1991, 2013; Lupo and Schmitt 2005; Lupo 2007; Bird et al. 2009, 2012; Codding et al. 2011). Risk is defined here as the probability of failure to acquire the target prey after it is encountered relative to other alternative resources. Important constraints arising from prey behavioral and physiological characteristics, such as mobility, predator avoidance, and defense responses, can appreciably increase the risk associated with pursuing particular prey (Stiner et al. 2000; Stiner and Munro 2002; Lyman 2003; Koster 2007:98; Jones et al. 2008; Bird et al. 2009, 2012; Speth 2012; Wolverton et al. 2012; Lupo and Schmitt 2016). Furthermore, some prey, especially those that are mobile, can require prolonged pursuits that not only increase acquisition costs but also lead to high failure rates. Often, the same prey with characteristics that make them difficult to pursue are larger-sized and presumably high-ranked. But failed and/or prolonged pursuits increase the costs of handling those prey and, by definition, decrease the post-encounter return rates associated with those animals (see Lupo and Schmitt 2016). Depending on the available hunting technology and pursuit strategy, large-sized prey with characteristics that make them difficult or expensive to pursue may be less efficient to acquire than smaller-bodied but lower-risk prey.

The ethnographic record shows that differences in prey characteristics can influence human predation. Hunters sometimes deliberately avoid pursuing certain high-value prey because of the difficulty associated with their acquisition (see Lee 1979:231-234; Smith 1980:302-303; Yost and Kelly 1983:205-206; Lupo and Schmitt 2016). Conversely, hunters sometimes specifically target prey that are difficult to capture or that have a high risk of hunting failure relative to other available opportunities. Hunters may target these prey to enhance prestige, build social and/or political alliances, or gain mating opportunities (Hawkes et al. 1991, 2010; Sosis 2000; Hawkes and Bliege Bird 2002; Wiessner 2002; Smith 2004; Bird et al. 2009, 2012; Lupo and Schmitt 2016). These empirical observations do not invalidate the use of the PreyCM or the use of body size as a proxy for resource rankings, but show that the elevated risks and costs associated with the acquisition of certain prey can have an appreciable effect on prey rank. Clearly, these observations invite further questions about the ecological and social circumstances that might support the pursuit of high-risk prey.

Recently, Codding et al. (2011) (see also Bliege Bird et al. 2009) identified important ecological circumstances influencing the trade-offs between energetic returns and risk and how these articulate with the foraging goals of men and women. In circumstances characterized by unpredictable high-value resources or associated with high levels of daily variance, men who are more risk-prone than women may target high-risk prey with the goal of social provisioning. When men target high-risk prey, women often focus on more predictable resources with lower daily variances in return (or lower risk of failure) with the goal of provisioning (the so-called divergent strategies). Conversely, in biomes where many different high-value resources are predictably available and have a low risk of failure, the goals of men and women can overlap and result in coordinated acquisition strategies (the so-called convergent strategies). Mitigating factors include population densities, the availability of alloparents and social support, and the value of social networks, alliances, and prestige (also see Elston et al. 2014).

Leporids, artiodactyls, and foraging strategies in the Great Basin

Great Basin ethnographic and historic records show that indigenous hunter-gatherers had divergent foraging patterns in which men targeted high-risk large prey and women focused on reliable low-risk resources that comprised the bulk of the diet (e.g., Elston et al. 2014). Large-bodied prey densities and, by extension, encounter rates were generally low (albeit geographically variable) throughout the region, and smaller-sized prey, especially leporids, were common targets for all segments of the population. The prehistoric paleoenvironmental record for the Great Basin, however, is characterized by dramatic changes in temperature and precipitation that influenced overall productivity and presumably prey abundances. Following the early Holocene, the middle Holocene (ca. 9000–4500 cal BP) was characterized by warmer temperatures and reduced precipitation that greatly reduced prey abundances and increased human population mobility (Madsen et al. 2001; Broughton and Bayham 2003; Byers and Broughton 2004; Madsen 2007; Broughton et al. 2011; Grayson 2011; Jones and Beck 2012). With the onset of the late Holocene approximately 4500 cal BP, cooler and moister conditions returned and likely increased environmental productivity and possibly the encounter rates with artiodactyls. Abundances of artiodactyl fecal pellets (measured as pellets per liter of sediment) from Homestead Cave, for example, show that the highest densities occurred some 3690–3330 cal BP (Hunt et al. 2000:52-53). Broughton et al. (2008, 2011) use these data, in concert with data from Hogup and Camels Back caves (Fig. 1), to argue for wetter summers and drier winters during the late Holocene that increased artiodactyl populations and fueled an increase in big game hunting and hunting efficiency in the Bonneville basin and much of the western USA. Hockett (2015), however, found that zooarcheological assemblages from Bonneville Estates Rockshelter and other cave sites showed sustained and stable artiodactyl hunting from the middle through late Holocene. He notes that artiodactyl hunting, as reflected by faunal abundances, remained stable through other notable climate perturbations, including the Neopluvial (3500–2650 cal BP), late Holocene drought (2600–1650 cal BP), and the Little Ice Age (650–100 cal BP) (e.g., Grayson 2006, 2011). He concludes that artiodactyls were always part of a very broad and diverse subsistence regime which varied with regional opportunities (Hockett 2015). In addition to climatic change, the late Holocene witnessed changes in hunting technology and pursuit strategies that may have influenced prey handling costs, risks, and the social and economic values associated with artiodactyl hunting. These included the advent and spread of the bow and arrow some 2000–1400 years ago (Codding et al. 2010; Grayson 2011; Smith et al. 2013) and an increase in cooperative hunts/drives after about 5000 years ago associated with changes in sociopolitical organization and processes (Hockett 2005; Hockett et al. 2013).

Fig. 1

The Great Salt Lake Desert area of the northern Bonneville basin showing the locations of projects and primary open and sheltered sites discussed in the text

While most researchers agree that localized conditions offered different sets of resource opportunities and constraints to prehistoric populations throughout the Great Basin, there is little consensus on the extent or scale of late Holocene increases in hunting productivity. It is also not clear if increases in artiodactyl hunting and foraging efficiency in the latest Holocene had an appreciable influence on the diet breadth and/or subsistence labor patterns of men and women. Increased big-game productivity during the late Holocene should have led to a more convergent labor pattern focused on a narrower diet breadth, with a decreased exploitation of smaller-sized and lower value prey than observed in the ethnographic and historic records. Here we consider how the trade-offs between risk and energy influence classic Great Basin prey rankings and targets. Zooarcheological data from a large sample of Bonneville basin open contexts are considered in light of these trade-offs and, together with additional data from neighboring Holocene-aged sites, reveal a relatively stable pattern of artiodactyl and leporid exploitation.

Determining the trade-offs between risk and energy

The energetic values and post-encounter return rates for many of the different wild resources exploited in the ethnographic record of the Great Basin are well-established in the published literature. The most widely used of these sources is Simms’ (1984, 1985, 1987) pioneering data on the handling costs (pursuit and processing times) and benefits (as measured by kcals) of different resources. To determine the pursuit costs for artiodactyl encounter hunting, he used interviews with contemporary hunters who reported that pursuit varied from a few minutes to approximately 1 h (Simms 1984). For simplicity, Simms applied the same pursuit costs to deer (Odocoileus sp.), mountain sheep (Ovis canadensis), and pronghorn antelope (Antilocapra americana). Processing costs were estimated to be 1.5 h for deer and mountain sheep and 1 h for antelope. For smaller-sized prey, such as leporids (hares (Lepus cf. californicus) and rabbits (Sylvilagus sp.)), he used the best estimates possible from limited ethnographic and wildlife literature and assumed a 2–3-min pursuit after the animal was encountered. Processing costs were estimated to be 5 min for hares and 3 min for rabbits. As Simms (1987) points out, artiodactyl pursuit costs would have to be considerably higher to appreciably change the post-encounter rankings for these prey because of the high cost of processing large-sized carcasses. Accordingly, prey ranking is largely based on processing time, which varies as a function of prey body size. Most notably, processing costs for artiodactyls are nearly the same as pursuit costs. Simms (1987:91) notes that doubling the pursuit time for deer to 2 h only lowers the return rate from 17,971 to 12,580 kcal/h. For a considerably lower ranked prey, such as duck, doubling the pursuit time changes the return rate from 1508 to 1231 kcal/h. This exercise demonstrates just how dramatic the effect of pursuit time on post-encounter return rates can be. A doubling of pursuit times results in a much larger change in post-encounter return rates for artiodactyls than it does for ducks (> 5000 versus < 300 kcal/h). Furthermore, given the limited nature of available data, Simms’ values do not include failed pursuits or the influence of risk from hunting failure on post-encounter return rates.
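To make the arithmetic explicit, the sketch below back-calculates an illustrative edible-energy value from Simms’ reported deer rate and reproduces the effect of doubling pursuit time; the energy figure is inferred for illustration only and is not a value taken from his tables.

```python
# Post-encounter return rate = kcal / (pursuit time + processing time).
# The deer energy value is back-calculated from Simms' reported rate
# (17,971 kcal/h at ~1 h pursuit + 1.5 h processing) and is illustrative
# only; it is not a figure from his tables.

def return_rate(kcal, pursuit_h, processing_h):
    """Kilocalories gained per hour of handling (pursuit + processing) time."""
    return kcal / (pursuit_h + processing_h)

deer_kcal = 17971 * (1.0 + 1.5)  # ~44,928 kcal, inferred from the baseline rate

print(return_rate(deer_kcal, 1.0, 1.5))  # 17,971 kcal/h (baseline)
print(return_rate(deer_kcal, 2.0, 1.5))  # ~12,836 kcal/h; Simms reports 12,580,
                                         # the small gap presumably reflecting
                                         # rounding in his underlying cost figures
```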

In traditional applications of the PreyCM to zooarcheological assemblages, similarly sized prey such as artiodactyls are assumed to have homogeneous handling costs, including the probability of failure, and usually are treated as a group (e.g., Janetski 1997; Byers and Broughton 2004; Ugan 2005; Broughton et al. 2011). However, while all the species that comprise artiodactyls (and leporids) are mobile, they often occupy different (albeit sometimes overlapping) habitats, move at different speeds, and, more importantly, have very different predator defense mechanisms (Table 1). Among common artiodactyls in the Bonneville basin, for example, the pronghorn is the fastest animal in the Western Hemisphere, reaching speeds of over 100 km/h, and is known for a superlative aerobic capacity that allows for prolonged long-distance running up to 5 or 6 km before becoming exhausted (e.g., Lindstedt et al. 1991; Lubinski and Herren 2000). In contrast, mountain sheep can reach about 50 km/h on flat ground but only 15 km/h on broken terrain and escape predators by using landscape obstacles such as steep and rocky cliff faces (Valdez and Krausman 1999; Shackleton 1985). Similarly, the pursuit costs of leporids can greatly differ given their antipredator responses and preferred habitats (Table 1). These differences in mobility, predator defense strategies, and other features could potentially translate into vastly different hunting risks for the respective prey.

Table 1 Body mass, maximum speed, and antipredator behaviors of some Great Basin mammals

While it is challenging to know how the elevated pursuit costs and failure rates of all of the different animals that comprise artiodactyls and lagomorphs influenced prehistoric hunter success, recent analyses of a large set of empirical data derived from contemporary subsistence hunters identify several important trends (Lupo and Schmitt 2016). Among subsistence hunters using a wide range of traditional hunting technologies and pursuit techniques, including spears, bows and poisoned arrows, and blowguns, hunting success (as measured by the number of times a hunter kills and acquires a carcass divided by the number of times the prey was encountered and pursued on the landscape) is negatively correlated with prey body size (Fig. 2). The pursuit of smaller-sized prey is generally (but not always) associated with higher hunter success than that of larger-bodied prey, which often have longer pursuit times and higher hunting failure rates. In this sample, large-sized prey include fleet artiodactyls and other animals that are dangerous or simply difficult to kill with traditional technologies (see Lupo and Schmitt 2016).

Fig. 2

Relationship of hunting success to prey mean body weight (after Lupo and Schmitt 2016: Fig. 2)

Unfortunately, the pursuit costs and specific risks of failure associated with different prey in the Great Basin are unavailable. Sparse ethnographic and ethnohistoric descriptions are illustrative of the range of techniques used to pursue certain species, such as antelope and jackrabbits, but few report quantitative data on the costs or failure rates of these pursuits (see Lubinski and Herren 2000; McCabe et al. 2010). For example, large numbers of antelope and jackrabbits could be taken in communal drives that were held seasonally, required a large organized labor force, and likely involved large investments of time and effort (e.g., Hockett et al. 2013). Far less information is available on encounter hunting of individual animals. In the case of antelope, this could involve substantial time investments in wearing a disguise and/or stalking. Lowie (1909:185) described mounted northern Shoshoni pursuing antelope and reported that 40 or 50 mounted hunters could spend half of a day to kill 2 or 3 animals. Another common but high-cost pursuit method involved persistence hunting, which could last 2 days before the animal was dispatched (McCabe et al. 2010:61). Similarly, hares could be taken in communal drives with and without nets, as well as with snares, hand capture, clubs, arrows, and rabbit throwing sticks.

Even less information is available on hunting failure rates related to the pursuit of prey. In a novel attempt to estimate the risk of hunting failure, Broughton et al. (2011) cited survey data collected from contemporary firearm hunters pursuing cottontails (Sylvilagus sp.) in South Carolina and Kentucky and white-tailed deer (Odocoileus virginianus) in Ontario and South Carolina. As they acknowledge, these data are clearly not directly comparable with the success rates of prehistoric hunters but are only illustrative of the potential degree of risk associated with pursuing different prey. Contemporary hunters use a variety of different modern weaponry (rifles, improved bows and arrows, trained dogs, etc.), hunt in designated areas, and have a single-prey focus as dictated by tags/licenses, and much of the available quantitative data are derived from self-reported surveys which are often inaccurate (e.g., Lukacs et al. 2011). Broughton et al. (2011) cite data reporting a modest success rate of 56% (42–62%) based on the number of rabbits hunters reported jumping and the number that were subsequently killed. By this measure, rabbit hunting appears to be a very high-risk pursuit. However, the number of rabbits jumped in these surveys does not represent the number of animals actively pursued by hunters and reflects only the densities of rabbits on the landscape. For artiodactyls, they cite reports on overall hunting success of 79–80% for O. virginianus, with an estimated failure rate of 20% from an experimental hunt conducted in an enclosed and heavily managed hunting club. These values are based on the general hunter success rate and not success as a function of the number of animals killed from those encountered, and seem to show that hunting large-sized artiodactyls is a low-risk strategy in comparison with pursuing rabbits.

Here we follow Broughton et al. (2011) and dig a bit deeper into the published literature on hunter success and failure as reported in the wildlife literature. Two different measures can provide insights into hunting success and the risk of failure. In most of the published wildlife literature, overall hunting success is based on measures of whether or not a hunter made a kill at some point during a given interval, irrespective of how many animals were encountered. More accurate measures of hunting success should include data on how often the hunter dispatches an animal after it is encountered (number of animals dispatched/number of animals encountered and pursued). However, measurements of hunting success based on the number of animals killed given the number pursued are very rare in the available literature. The closest approximation can be made from available data on wounding or crippling rates, which provide some insight into the risk of hunting failure given the number of prey encountered. Wounding rates measure how many animals were shot by the hunter but either escaped and recovered or eventually died of their wounds without the carcass ever being found. Data from the South Carolina and Kentucky hunting surveys mentioned above show relatively low rabbit wounding rates of approximately 2%. This is partly because the rabbits were dispatched with guns, which inflict traumatic injuries, but the low wounding rates also suggest that hunters did not often miss their target after it was selected. In a separate controlled study targeting European rabbits (Oryctolagus cuniculus), Hampton et al. (2015) report a high success rate of 79% and found that, of the animals targeted by hunters, about 12% were wounded and another 9% escaped unharmed. Comparable and accurate wounding rates for deer are difficult to find but are reportedly much higher (between 40 and 60%), especially for bow hunters (Croft 1963; Downing 1971; Garland 1972; Stormer et al. 1979; McPhillips et al. 1985; Boydston and Gore 1987; Ditchkoff et al. 1998). Lower wounding rates of 7–18% are reported, but these are associated with highly modified bows and/or enclosed hunting areas such as managed clubs or islands where numerous hunters participated in organized hunts (Severinghaus 1963; Gladfelter et al. 1983; Krueger 1995; Ruth and Simmons 1999; Pedersen et al. 2008). Despite the shortfalls in these data, lower failure rates are associated with hunting leporids in comparison with artiodactyls.

More quantitative data are available for overall hunting success rates for contemporary hunters of large artiodactyls and leporids. Here, we use overall hunting success rates of gun hunters in California spanning some 12 years (Fig. 3). These data show that hunters who pursued cottontail rabbits and hares were fairly successful over this interval, with success rates significantly higher than those reported by Broughton et al. (2011). Although the values simply reflect whether a hunter was successful irrespective of the number of animals they encountered and pursued, the values are strikingly different. In general, these gross measures show that deer hunters are far less successful than those targeting cottontails and jackrabbits. While all these taxa are mobile and leporids are likely more abundant on the landscape than artiodactyls, leporids also present a much smaller-sized target and would presumably be more difficult to hit than deer, especially with modern weaponry.

Fig. 3

Overall hunting success rates for cottontails, jackrabbits, and deer in California, 1996, 1999–2008, and 2010 (California Department of Fish and Wildlife, Upland Game/Waterfowl Program (https://www.wildlife.ca.gov/hunting/harvest-statistics))

To evaluate how risk of hunting failure could potentially influence the rankings of different prey, we recalculated the post-encounter return rates as reported by Simms (1984) using overall hunting success. We follow the modification suggested by Ugan and Simms (2012) of discounting the post-encounter return rate by the probability of a failed pursuit (Fig. 4). When post-encounter return rates are discounted by failure rates derived from contemporary hunters, the ranking of prey changes (see Lupo and Schmitt 2016) and smaller-sized prey with lower risks of hunting failure become more efficient choices. Clearly there are circumstances where the high risk of failure can make smaller-sized and low-risk prey more efficient than large-sized prey. While it is impossible to know the actual risks of failure faced by prehistoric hunters, these data can shed light on subsistence patterns that appear to run contrary to general predictions of the PreyCM.

Fig. 4

Box plot showing post-encounter return rates for deer and hares. Unadjusted rates are from encounter hunting as reported by Simms (1984) and adjusted encounter rates have been discounted to reflect failure rates (see Fig. 3)
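As a minimal illustration of the discounting step behind Fig. 4, the sketch below scales an unadjusted post-encounter return rate by the probability that a pursuit succeeds, which is one straightforward reading of the Ugan and Simms (2012) modification. The success probabilities and the hare return rate are hypothetical placeholders, not values from Simms (1984) or Fig. 3.

```python
# Discount each post-encounter return rate by the probability that a
# pursuit succeeds. The success probabilities and the hare rate below are
# hypothetical placeholders, not values from Simms (1984) or Fig. 3.

def adjusted_rate(kcal_per_h, p_success):
    """Expected return rate after accounting for the chance of a failed pursuit."""
    return kcal_per_h * p_success

deer = adjusted_rate(17971, 0.30)   # Simms' deer rate, hypothetical 30% success
hare = adjusted_rate(14000, 0.70)   # hypothetical hare rate and 70% success

print(f"deer: {deer:.0f} kcal/h, hare: {hare:.0f} kcal/h")
# deer: 5391, hare: 9800 -- with failure factored in, the smaller,
# lower-risk prey outranks the larger one, as described above.
```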

Prehistoric Bonneville basin environs and human subsistence

A general overview of regional basin and range topography shows that Great Basin habitats are characterized by elevational zonation (e.g., Grayson 1993, 2011; Harper 1986). These include sparsely vegetated xerophytic scrub communities located on valley floors and lower piedmonts, pygmy forests of juniper (Juniperus sp.) and pinyon pine (Pinus edulis) on the lower mountain slopes, and subalpine forests of aspen (Populus tremuloides) and limber pine (Pinus flexilis) at higher elevations. In the eastern Great Basin, the Bonneville basin is a massive Pleistocene lake basin that covers portions of southern Idaho, eastern Nevada, and much of western Utah, including the Great Salt Lake and Great Salt Lake Desert (e.g., Madsen 2000) (Fig. 1). Piedmonts, extensive alluvial fans, and broad valleys comprise most of the region and currently support open xerophytic plant communities, and regional paleoecological studies of floral and faunal remains agree that these contexts supported similarly open and arid habitats throughout most of the Holocene (Grayson 2000, 2011; Madsen et al. 2001; Louderback and Rhode 2009; Schmitt and Lupo 2012, 2016; Rhode 2016). Among other species, these vast open tracts provided ideal environments for leporids and artiodactyls, at times including bison (Lupo 1996; Grayson 2006), as well as excellent hunting opportunities for regional peoples (Schmitt et al. 2004). Jackrabbits are particularly well-suited for low-elevation arid habitats and are common in most areas; pronghorn also favor open brush/grass communities on valley floors and lower foothills and co-occur with hares; and deer often occur in valleys and along mid-level slopes and canyons with sage and forest communities that include both hares and cottontails (e.g., Hall 1946; O’Gara 1978).

To investigate regional prehistoric subsistence pursuits, zooarcheological data from various groups of sites in the southern Great Salt Lake Desert are examined. First, Table 2 presents jackrabbit and artiodactyl assemblages as quantified by the number of identified specimens (NISP) from excavated and dated cultural features across the region. Overall, 22 dated contexts with associated food residues are reported and include hearths and/or occupation surfaces at Buzz-Cut Dune (Madsen and Schmitt 2005), Camels Back Cave (Schmitt and Madsen 2005), Playa View Dune (Simms et al. 1999), and a late prehistoric/protohistoric occupation at 42To567 (Rhode et al. 2011) (Fig. 1). Note that most assemblages contain considerably more jackrabbit bones than those of artiodactyls. Cumulatively, these collections mark hare processing episodes that span more than 8000 years, with a number of contexts dating between about 5600 and 4400 cal BP. With the exception of an ephemeral hearth (Feature 66) in Camels Back Cave and a subsequent living surface (Feature 25, ~ 750 cal BP), where appreciable numbers of identified artiodactyl and artiodactyl-sized specimens were deposited (Schmitt and Lupo 2005), bone assemblages from these various cultural contexts are dominated by jackrabbits, with artiodactyl remains being few or, in 11 cases, entirely absent (Table 2).

Table 2 Chronological data and numbers of identified hare and artiodactyl specimens from cultural features in regional excavated sites

Second, and less well known, are numerous surface assemblages documented in regional archeological surveys that provide further evidence for the presence and recurrent dominance of hares in local subsistence systems. Table 3 presents presence-absence data on observed jackrabbit and artiodactyl remains in 65 open sites recorded in survey projects along the southern margins of the Great Salt Lake Desert. Most survey areas (AFUA, Loiter, TAE, T&T, and White Sage) largely encompassed dune deposits along the toes of alluvial fans, but a few (Tess 1, 5, and 7) were atop the flat, sparsely vegetated cap of regressive phase lacustrine fines in the bottom of Dugway Valley (Madsen et al. 2015), and one (BSP; Fig. 1) incorporated dunes and deflated alkali mudflats (Page et al. 2014). To control for taphonomic and associated site formational issues, we note that the bone was typically found in direct association with fire-altered rock that includes both discrete concentration loci and eroded scatters. Furthermore, given the potential presence of occasional on-site jackrabbit natural death assemblages and especially fragmentary hare remains deposited in carnivore scatological droppings (e.g., Schmitt and Juell 1994), only severely burned (carbonized and/or calcined) specimens were considered human food residues (cf. Byers and Broughton 2004). The observed assemblages ranged from a couple of charred bones to hundreds of carbonized and calcined fragments. In a number of occurrences, the charred bone and fire-altered rock clusters contained pieces of woody charcoal and/or oxidized sediment and likely served as cooking features or refuse dumps. Some burned bones were observed in association with small artifact assemblages (e.g., a few flakes and one or two tools) that likely manifest briefly occupied task sites and camps, and others were found in large scatters containing ground stone, ceramics, and large and diverse assemblages of lithic tools and detritus (Table 4) that doubtless mark prolonged stays by family groups. Although these site data only afford surface expressions, 84 of the 85 loci with associated human food refuse contain jackrabbit/jackrabbit-sized bones (Table 3), and in all but two contexts, jackrabbits are the only species present.

Table 3 Numbers of sites and cultural features containing burned bone by survey project area
Table 4 Chronological data and associated surface artifact assemblages at Bonneville basin open sites containing carbonized/calcined bone

Twenty-one of these surface bone assemblages occur with temporally diagnostic projectile points and/or ceramics, and there are radiocarbon age estimates on charcoal from associated fire-cracked rock features at eight Tess 1 sites (Schmitt et al. 2010) (Table 4). Overall, the types of associated artifacts and the results of radiocarbon assay mark occupations dating to the past ~ 2000 years and include a large number of sites dating to the Fremont Period (~ 1800–500 cal BP; e.g., Madsen 1989; Madsen and Simms 1998; Simms 2008:185–228) and extending into Late Prehistoric times. Together with the dated excavations (Table 2), there are 106 episodes of jackrabbit processing along the margins of the southern Great Salt Lake Desert, with one site containing only artiodactyl remains and 94 (89%) marking hare-only processing events.

Fifty-one contexts with chronological data illustrate the continued use of hares over the last ~ 8300 cal years, including a mass collecting event(s) near Camels Back Cave ~ 7400–7250 cal BP (Table 2; Schmitt et al. 2004). Importantly, clusters of dated use episodes occur during two very disparate climatic events, with the first increase in hare processing features dating to ~ 5600–4400 cal BP during the later years of middle Holocene desertification (e.g., Madsen 2000; Grayson 2011). This cluster includes occupations at Buzz-Cut Dune (Madsen and Schmitt 2005) and Playa View Dune (Simms et al. 1999) and multiple visits to Camels Back Cave (Schmitt and Madsen 2005). While the pursuit of jackrabbits during this arid interval may suggest resource intensification, hares dominated regional human refuse aggregates prior to this time, and it is more plausible that these occurrences mark procurements of a low-risk dietary staple.

The second concentration of dated contexts falls during Fremont times, when foragers and farmers commonly took hares during a climatic cycle marked by increases in summer moisture that included monsoonal storms and the associated expansion of grassland habitats (Madsen 2000; Wigand and Rhode 2002; Grayson 2011). These novel grasslands provided propitious forage for both hares and large herbivores, including bison, whose populations expanded significantly in some areas as a result of these improved environmental conditions (Lupo 1996; Grayson 2006; Broughton et al. 2008). In fact, a number of Lepus-only/dominant Fremont assemblages in the southern Great Salt Lake Desert were deposited while neighboring foragers and farmers in lake margins along the Wasatch Front were taking significant numbers of large mammals (e.g., Lupo and Schmitt 1997; Grayson 2006 and references therein), as were the inhabitants of Oranjeboom Cave (Buck et al. 2002). Thus, during a time when regional subsistence pursuits should have forsaken hares and focused on artiodactyl prey, many people continued to rely on jackrabbits as a food resource (e.g., Hockett 1998). During the Fremont Period at Camels Back Cave, it appears that artiodactyls and hares were pursued in tandem (Table 2), as there was a marked increase in fragmentary large mammal remains, including bison and bighorn, associated with Lepus bones processed by human hunters (Schmitt and Lupo 2005).

In a final and more far-reaching look at regional subsistence strategies, we incorporate temporal and quantitative data on leporids and artiodactyls from neighboring eastern and central Great Basin sites with the southern Great Salt Lake Desert data (Online Resource 1). Included are skeletal abundances in stratigraphic aggregates from Swallow (Dalley 1976; Swanson 2011) and James Creek (Grayson 1990) shelters and Hogup (Durrant 1970; Martin et al. 2017) and Danger (Grayson 1988) caves (Fig. 1), and bone collections from open Fremont residential sites along the Great Basin’s easternmost edge (e.g., Sharrock and Marwitt 1967; Marwitt 1970). Each of these sites occurs in habitats that support both leporids and artiodactyls and we do not include the aforementioned artiodactyl-rich Fremont and Late Prehistoric sites unique to the wetlands along the margins of the Great Salt and Utah lakes.

With the addition of these assemblages, there are now 119 dated bone aggregates from 51 sites. The context of each assemblage and its age and abundance index are presented in Online Resource 1, and a scatterplot of the abundance indices through time is presented in Fig. 5. Our use of mean age estimates and calculations of abundance indices follow previous measures used by researchers across the region (e.g., Broughton 1994; Janetski 1997; Byers and Broughton 2004). In a few instances, multiple stratigraphic bone aggregates were bracketed by widely distributed age estimates and were assigned mean ages based on the number and position of stratigraphic horizons and the span of the bracketing dates. Except for minimum number of individuals counts on the Hogup Cave specimens (Durrant 1970), all temporal bins were quantified using NISP, and the abundance indices represent artiodactyl indices (e.g., Szuter and Bayham 1989) calculated as Σartiodactyls/(Σartiodactyls + Σlagomorphs) to track differences in the ratio of large-bodied herbivores to smaller hares and rabbits. While the data suggest a very slight increase in artiodactyl abundances through time, the relationship is not significant (Pearson’s correlation coefficient; r = 0.092, p = .320, df = 117), and the low and flat trendline (Fig. 5) illustrates the persistence, indeed importance, of hares and rabbits in regional prehistoric subsistence systems as they dominate most assemblages (mean artiodactyl index = 0.24). In fact, 39 assemblages contain only leporid remains, and 92 of the 119 aggregates (77.3%) contain more leporid bones than artiodactyl bones.

Fig. 5

Scatterplot of 119 eastern and central Great Basin artiodactyl indices (Σartiodactyls/(Σartiodactyls + Σlagomorphs)) through time
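For readers who wish to reproduce this style of calculation, the sketch below computes artiodactyl indices and a Pearson correlation for a handful of made-up aggregates; the real counts and ages are in Online Resource 1.

```python
# A minimal sketch of the abundance-index and trend test used above.
# The NISP counts and mean ages here are invented for illustration;
# the actual 119 aggregates are listed in Online Resource 1.
from scipy.stats import pearsonr

# (mean age cal BP, artiodactyl NISP, lagomorph NISP) -- hypothetical
aggregates = [(7400, 2, 180), (4500, 0, 95), (1800, 14, 60), (900, 5, 120)]

ages = [age for age, art, lag in aggregates]
index = [art / (art + lag) for age, art, lag in aggregates]  # artiodactyl index

r, p = pearsonr(ages, index)
print(index)
print(f"r = {r:.3f}, p = {p:.3f}")
# The paper reports r = 0.092, p = .320 (df = 117) for the full data set,
# a flat trend indicating persistent leporid dominance through time.
```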

Summary and discussion

A review of the trade-offs between the energy and risk associated with the acquisition of artiodactyls and leporids suggests that hunting failure can be a significant factor influencing post-encounter return rates and traditional prey rankings (also see Lupo and Schmitt 2016). It might be argued that the published post-encounter return rates that underlie resource rankings are generalized estimates that represent most circumstances of capture and processing and are not meant to address all possible outcomes. But published post-encounter values do not encompass the costs of hunting failure, and as demonstrated here and elsewhere, when these values are discounted to account for failure, the rank ordering of different-sized prey is significantly altered. Arguably, the use of modern hunting success/failure rates as estimates for discounting post-encounter return rates is a very blunt tool, and more exacting data on the pursuit costs and failure rates of different prey are needed. But these data illustrate the salient point that measurements of risk need to be incorporated into the prey rankings that are commonly used to interpret archeological data.

The PreyCM is a robust tool for predicting subsistence choices, but some of the original empirical applications of foraging theory among hunters and gatherers produced quantitative evidence challenging the underlying logic of the model (Hames 1979; Winterhalder 1981; Hill et al. 1987; Lupo 2007). Early applications of the model to the ethnographic record identified the potential influence of nonenergetic currencies on resource choice. Hill et al. (1987), for example, presented data showing that energetic efficiency does not always predict resource choice among the Aché, and that differences in macronutritional composition between resources might be an important factor guiding food choice. Additionally, numerous ethnographic observations demonstrate that some hunters consistently pursue high-risk, costly, or seemingly wasteful hunting opportunities even when less costly and more reliable alternatives are available (e.g., Hawkes et al. 1991; Bliege Bird et al. 2001; Lupo and Schmitt 2002; Bird et al. 2009). These pursuits are viewed as costly signals aimed at securing non-consumptive benefits such as political alliances, friendships, and other sociopolitical advantages. Moreover, there are a variety of different circumstances where hunter-gatherers intentionally alter their environments (e.g., using fire) in ways that enhance or maintain the productivity of certain resources, including those that would be considered low ranked (e.g., Winterhalder and Lu 1997; Smith and Wishnie 2000; Bird et al. 2005).

The prehistoric acquisition of artiodactyls and leporids in the Great Basin provided additional benefits beyond energy that potentially influenced the valuation and subsequent rankings of these prey. In ethnographic and historic records of indigenous peoples, artiodactyls and leporids are frequently mentioned as prey exploited for protein, hides, and other products such as bone and teeth (e.g., Simpson 1876; Steward 1938; Lowie 1939; Downs 1966; Fowler 1992). Among these, hides and pelts may have been the most important non-consumptive resource, especially for their use in garments and coverings. Beyond their value as a low-risk consumable resource, hares seasonally provided pelts that were used to manufacture essential clothing, robes, and/or blankets used by native peoples throughout the Great Basin and elsewhere. Jackrabbit drives to procure meat and pelts also figured prominently in fall festivals that often coincided with the harvest of other important resources, but these drives also had a substantial social component as participation in the hunts was likely an entrée into the larger events (Steward 1938). Artiodactyls were used for the same products and in similar contexts. Cooperative antelope drives, for example, were highly organized events associated with festivals that provided opportunities to gain hunting prestige. However, the value and use of the skin for garments differed between artiodactyls, such as deer, and jackrabbits. Steward (1941:245) reports that a high social value was placed on tailored skin clothing such as hide leggings and shirts made from larger-sized skins. For men, these items were an “advertisement of the man’s industry and skill as a hunter, thus affording slight prestige value” (also see Lowie 1924:217–218; Kelly 1932:106). For women, Steward (1941:245) states, “The most pretentious woman’s garment was a long gown made of two skins…which represented affluence and was preferred in the winter.” Larger-sized and high-quality artiodactyl skins were highly valued, especially in parts of the Great Basin where they were difficult to find (Kelly 1964:45; Steward 1938:45). In contrast, jackrabbit pelts were twined and fabricated into blankets and robes which were considered essential items for everyday life (see Palmer 1897:68; Gilmore 1953). Interestingly, Steward (1941:245) noted that “poor or unlucky hunters only wore breechclouts but still had a robe.” Twined jackrabbit ropes made from pelts were used as a type of currency and could be sold for cash or exchanged for fine buckskins (Steward 1938:45). Rabbit skin robes or blankets were usually custom-made, curated, and highly valued by their owners as versatile garments (Palmer 1897). Experimental studies show that rabbit pelts have superlative insulative and thermal properties that likely made them indispensable during the winter months (Yoder et al. 2005). Thus, while deer and jackrabbits both yielded skins, buckskin clothing was often considered a marker of prestige, while jackrabbit robes were viewed as essential garments.

In this analysis, we show that hare bones dominate prehistoric human subsistence detritus even when large-bodied ungulates were available and their encounter rates were ostensibly increasing. Beginning with the Lepus bone refuse deposited by the initial late Pleistocene human foragers at Bonneville Estates Rockshelter (Hockett 2007, 2015), hares dominated Bonneville basin subsistence assemblages and were a common and fundamental part of people’s lives. If hunting high-risk artiodactyls was a pursuit that conferred prestige and low-risk leporids provided a more predictable return, these circumstances would support a diversified labor system similar to the organization of labor reported in the ethnographic record. In the ethnographic and historic records of the Great Basin, women are described hunting/collecting a variety of small prey as part of their subsistence regime (e.g., Ferris 1940:267; Fowler 1986; Fowler and Walter 1985; Leonard 1904:119). For example, Northern Paiute and Uintah Ute women, as well as children and adolescents, trapped small mammals (Fowler 1989; Steward 1970:138–139). Hares and other small mammals were not targeted only by women and children, however; the ethnographic record also mentions men hunting small game, and communal drives in which large numbers of hares (and other animals) were synchronously acquired often employed all available workers (e.g., Lowie 1939; Steward 1970; Fowler 1986). Conversely, artiodactyl hunting was largely undertaken by men, with the exception of communal hunts in which women, children, and the elderly assisted in capturing and especially in processing and transporting carcasses (see Kelly 1964; Stewart 1941). While it is difficult to ascribe task group composition to particular archeological remains, some of the sites and jackrabbit remains reported here may have been the result of hunting by women and children. As noted above, some of the open sites contained ground stone and ceramics that were likely related to women’s subsistence activities, but it is also probable that many of these site assemblages represent the acquisition of a staple resource by any and all members of the population.

Finally, current approaches to interpreting prehistoric prey acquisition focus on estimating the encounter rates with (and relative abundances of) high-value prey such as artiodactyls from limited fecal and skeletal data collected from sheltered contexts (see Grayson 2011). All zooarcheological assemblages are influenced to some degree by taphonomic processes, and many of the well-dated and available assemblages have not undergone systematic taphonomic analysis. Even more problematic is the fact that, with few exceptions, most of the available sites with well-dated faunal assemblages are from sheltered contexts where the abundances of prey bones may be tied to the frequency and degree to which those sites were used by people but may not reflect the abundances of the prey on the landscape (Grayson 2011; also see Speth 2012). Speth (2012) has noted that the frequent disparities in faunal abundances between synchronously occupied cave and open sites may reveal different aspects of the diet breadth. A good example of this phenomenon is reflected in the Fremont Period subsistence residues from Camels Back Cave; burned and broken jackrabbit bones are present, but the assemblage also contains abundant artiodactyl bones (Table 2), whereas neighboring open sites dating to the period contain only the remains of hares. In the Bonneville basin, virtually all interpretations of artiodactyl abundances are based on cave sites, especially Hogup and Homestead caves. One of the few exceptions comes from the Little Boulder Basin sites in the Humboldt River drainage, with occupations spanning the last 3000 years, located some 300 km from the Bonneville basin (Broughton et al. 2011). Data from open contexts reported here greatly expand the available data set on faunal abundances and reveal a very different dimension of subsistence and the diet breadth.

Conclusions

In the prehistoric Bonneville basin, jackrabbits were exploited throughout the Holocene and were a dominant meat source during middle Holocene desertification and particularly during the more mesic Fremont Period, when they provided nutrition to expanding populations (e.g., Madsen and Simms 1998). Considering the longitudinal pattern of exploitation spanning the last 8300 years, the wealth of hare-rich Fremont faunal assemblages reflects the continuation of a stable exploitation pattern and not necessarily declines in foraging efficiency. Rather, the hare was an integral part of everyday life for regional peoples, providing food, adornment, and vital warmth, and doubtless serving as the center of many conversations and familial ties.

Abundances of artiodactyls and leporids are often interpreted within the context of the PreyCM, a useful quantitative and highly flexible analytic tool with demonstrable explanatory power. The PreyCM is one of several broad, evolutionary-based models that allow researchers to change goals, constraints, and currencies (Bird and O’Connell 2006; Lupo 2007; Codding and Bird 2015). As such, the model has the potential to illuminate many different aspects of human foraging behavior. We believe that future applications of the model to the zooarcheological record require additional data. These data should be generated through experimentation and/or simulation modeling of the risks and the pursuit and processing costs of acquiring large and small game using different hunting technologies and techniques. Comprehensive ecological models could also provide temporal views of taxonomic abundances on regional landscapes. Moreover, and while some assumptions necessarily remain, zooarcheologists need to carefully consider taphonomic processes and quantitative methods (e.g., Schmitt and Lupo 1995; Hockett 1996; Cannon 2013; Fisher 2018) and examine age profiles and human processing patterns to infer capture techniques (see especially Jones 2006). It is time for researchers to revisit exactly what is being measured in these applications and how those measurements are derived.