Introduction

The inability to recognise and/or express effective anti-predator behaviour against novel predators as a result of lifetime or evolutionary isolation from predators is known as ‘prey naiveté’ (Goldthwaite et al. 1990; Carthey and Banks 2014). A failure to recognise introduced, novel predators is the most damaging form of naiveté because prey are unable to mount effective anti-predator responses (Cox and Lima 2006; Ferrari et al. 2015).

Natural selection favours prey species that are able to successfully detect, identify and appropriately respond to predators prior to their attack, increasing their probability of escape and/or avoidance of a predator (Monclús et al. 2005). However, not all species or even individuals are able to accurately recognise a predator. Anti-predator behaviour may be innate (genetically based), be learnt through experience or be a combination of the two (Jolly et al. 2018). In many species of mammals (Owings and Owings 1979; Fendt 2006), birds (Göth 2001) and fish (Berejikian et al. 2003), predator recognition is an innate trait. Despite years, decades or even thousands of years of isolation from predators, some prey species retain predator recognition skills of their ancestral predators (Blumstein et al. 2008; Li et al. 2011; Steindler et al. 2018). However, for many other prey species, learning and experience are necessary to properly develop and perform appropriate anti-predator behaviours (Griffin et al. 2000). Prey that are able to alter their behavioural patterns in accordance with their learnt experiences are expected to be at a selective advantage in response to potential predation threats from introduced predators (Maloney and McLean 1995; Kovacs et al. 2012).

The ‘learned recognition’ hypothesis suggests that through lifetime experience with predators, naïve prey may enhance their ability to recognise and respond to predators (Turner et al. 2006; Saul and Jeschke 2015). Failure to recognise and appropriately respond to a predation threat increases the risk of a fatal encounter with predators (Chivers and Smith 1995). As such, prey that are able to alter their behaviour in accordance with learned information are expected to have a greater degree of flexibility in their response to the risk of predation (Brown and Chivers 2005). The development of learnt anti-predator recognition skills towards evolutionary and/or ontogenetically novel predators has been shown in a broad array of taxa including fish (Ferrari 2014), birds (Maloney and McLean 1995) and mammals (Griffin et al. 2000).

How long it takes to learn recognition of previously novel predators depends on the prey species and how readily it adjusts to novel interactions (Cox and Lima 2006). Studies have found that despite over 200 years of coexistence with introduced predators, some naïve species are yet to evolve the appropriate anti-predator risk assessment responses (Hayes et al. 2006; McEvoy et al. 2008). In contrast, other studies have found that 200 years or less is sufficient to learn, develop and select for appropriate predator recognition skills (Maloney and McLean 1995; Banks et al. 2018). Consistent with this idea, a global meta-analysis on factors influencing expression of prey naiveté found that naiveté declined with the number of generations since predator introduction (Anton et al. 2020).

The introduction of novel predators has caused significant damage to native prey populations, particularly in areas where prey species may be considered ‘naïve’ (Cox and Lima 2006), and is a major contributing factor to failed reintroduction attempts of locally extinct species (Moseby et al. 2016). With an increasing reliance on ‘safe-havens’ such as predator-free islands and fenced reserves for threatened species recovery programmes, we need to develop a better understanding of the role that lifetime experience with predators plays in the development of appropriate anti-predator responses (Legge et al. 2018). Indeed, there is concern that isolation from all predators may prohibit predator-driven natural selection processes, preventing a ‘future beyond the fence’ for threatened species reintroductions (Moseby et al. 2016; Jolly et al. 2018).

The bilby (Macrotis lagotis) is an omnivorous, burrowing, medium-sized (body weight 1.5–2.5 kg), nocturnal marsupial that was once widespread in Australia (Burbidge and Woinarski 2016). In the last 150 years, bilbies have undergone a severe range decline which has been attributed in part to naiveté towards introduced predators, the red fox (Vulpes vulpes) and feral cat (Felis catus) (Burbidge and Woinarski 2016). A study of wild bilbies living within the ‘Arid Recovery’ predator-free fenced reserve in South Australia found that bilbies with no ontogenetic exposure to mammalian predators recognised the scent of a native predator, the dingo (Canis familiaris), with which they have shared over 8000 years of co-evolutionary history (Zhang et al. 2020), but did not recognise the scent of a recently introduced predator, the feral cat (Steindler et al. 2018). The bilbies inhabiting the Arid Recovery safe-haven were considered to be wild, because they were not supplementary fed and were exposed to avian and reptilian predators (Steindler et al. 2018). These findings suggest that bilbies have innate recognition of dingoes, but not feral cats, and that a prey species’ ability to respond to the odour of their predators scales with the duration of their evolutionary coexistence (Peckarsky and Penton 1988).

In this study, we investigate the recognition of predator scents by a remnant population of bilbies that were coexisting with dingoes, feral cats and rabbits in south-west Queensland. In particular, we were interested to evaluate whether non-safe-havened bilbies were naïve to the scent of feral cats, like the ontogenetically predator-naïve population within the Arid Recovery safe-haven (Steindler et al. 2018), or had developed recognition of the scent of feral cats. If the latter, we expected that bilbies should be more wary when both cat and dingo/dog faeces are present compared to a herbivore (rabbit, Oryctolagus cuniculus) and experimental control (no odour). Non-safe-havened bilbies would be at a selective advantage if they were able to successfully detect, identify and respond to both predators with which they co-exist. Predator recognition could be due to either learned recognition of the threat posed by cats through their lifetime or strong natural selection imposed by cats over evolutionary time. However, if bilbies recognised dingo/dog faeces but not cat faeces, it would suggest that bilbies’ responses towards predators are constrained by their period of evolutionary coexistence and that bilbies remain ‘naïve’ to introduced feral cats.

We used faecal samples as they are a useful indicator of predator presence (Hayes et al. 2006) and provide prey with information regarding predation risk, even when a predator is absent at the time of detection (McEvoy et al. 2008). Bilbies in south-west Queensland have shared more than 8000 years of co-evolutionary history with dingoes (Zhang et al. 2020), less than 140 years with feral cats (Abbott 2002) and less than 130 years with European rabbits (Zenger et al. 2003). Bilbies are able to produce a litter of 1–3 young, four times a year, and have a captive longevity of 5 to 9 years (Southgate et al. 2000). Based on the reproductive rate of the bilby, we assumed that the bilby population in south-west Queensland has passed through approximately 44 generations of potential exposure to predation by feral cats over the past 140 years.
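As a rough check of this figure, the short sketch below assumes an effective generation time of about 3.2 years (our assumption, implied by the reproductive parameters above rather than a published estimate) and shows that roughly 140 years of coexistence corresponds to approximately 44 generations.

```python
# Back-of-envelope illustration of the generation estimate (hedged; the
# generation time below is an assumption, not a value from the source).
years_of_coexistence = 140      # approximate years since feral cats reached the region
generation_time_years = 3.2     # assumed effective generation time in years
generations = years_of_coexistence / generation_time_years
print(round(generations))       # -> 44 (approximately)
```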

Materials and methods

Study area

We studied wild bilbies across 21 nights in October 2016 within Astrebla Downs National Park, Queensland (Fig. 1; 24° 12′ 24.60″ S, 140° 34′ 5.39″ E). Astrebla Downs National Park is located in the Channel Country, a region consisting of flat to undulating erosion plains dissected by minor drainage lines. The vegetation of the Channel Country is dominated by barley Mitchell grass (Astrebla pectinata), with other herbs and grasses growing during periods of wet climatic conditions. The climate is arid, with low annual rainfall and high summer temperatures (Gibson 2001). At the time of sampling, dingoes, feral cats and rabbits were present in the park. Bilbies in the park are also predated upon by birds of prey (ML personal observations).

Fig. 1

(a) Map of Australia showing the approximate location of Astrebla Downs National Park. (b) Map of Queensland with the exact location of Astrebla Downs National Park (1740 km2). (c) Map of Astrebla Downs National Park. The green circles indicate the locations of the burrows where odour recognition studies on wild greater bilbies (Macrotis lagotis) were conducted

Sources and storage of treatment odours

We used faeces from two placental predators with which wild bilbies have shared varying periods of co-evolutionary history (dingoes/dogs, > 8000 years; cats, < 200 years), as well as a procedural control (faeces from a harmless herbivore, the rabbit, < 200 years) and an experimental control (no faeces present). We collected fresh faeces from domestic dog, cat and rabbit sources. Although it would have been preferable to use dingo faeces, we used domestic dog scats as a surrogate for dingo scats because obtaining the required quantity of dingo scats was not feasible at the time of the study and previous studies have shown that scats from dogs are chemically indistinguishable from those of dingoes (Carthey et al. 2017). Hereafter, we refer to these scents as dingo/dog. To minimise decomposition of faecal odours after deposition, all faecal samples were collected fresh from private pet owners and boarding kennel facilities, sealed in airtight zip-lock bags and frozen at −20 °C (Carthey et al. 2017). Disposable gloves were worn at all times when handling faeces to prevent cross-contamination of odours. Because faecal samples were collected from private pet owners and shared boarding kennel yards, the total number of donor individuals is unknown; however, we estimate that samples came from between two and fifteen individuals. Allocation of faeces to burrows was randomised throughout the experiment, reducing the chance of potential donor effects. We did not consider diet a potential confounding source during analysis, since the diets of the domestic cats, dogs and rabbits were consistent between individuals and consisted of a mix of raw meats and commercially available pet foods (Carthey et al. 2017).

Bilby behaviour

As we were unable to track individual bilbies, we conducted a population-level evaluation of bilby behaviour by running our experiments adjacent to bilby burrows and treating each burrow as the sampling unit. Because studying wild populations of bilbies is often problematic due to their cryptic nature, placing faecal odour treatments outside the entrances of active burrows was the most effective way to test population-level predator odour recognition and behavioural responses (Steindler et al. 2018). Active bilby burrows (Fig. 2) were identified by the presence of fresh scats and/or fresh diggings around their entrances. Although bilby burrow use changes constantly within seasons and across years, bilbies are known to use two to three burrows per night (Lavery and Kirkpatrick 1997). Based on population estimates developed by Lavery and Kirkpatrick (1997), we estimate from the 128 burrows we examined in October 2016 that there were at least 30 bilbies present within our study area.

Fig. 2

(A) Greater bilby and (B) experimental setup for predator odour discrimination study of wild bilbies at Astrebla Downs National Park. Infrared motion sensor video camera mounted on a metal post outside the burrow entrance of a wild bilby, where odour treatments (cat, dog and rabbit faeces and experimental control—no odour) were presented

We used a repeated measures design, in which each faecal odour treatment was presented once at each burrow, according to a predetermined balanced order. We controlled for order effects experimentally and assessed these effects statistically. Faecal odour treatments were presented on consecutive nights. Since many mammalian predators scent mark features in the landscape, such as the burrows of prey species, by depositing urinary and faecal odours (Gorman and Trowbridge 1989), we deployed faeces at the entrance of bilbies’ burrows. Faeces were presented on the surface of the ground, within 20 cm of the burrow entrance. In cases where multiple burrow entrances were present, faeces were presented only at the entrance that showed the most recent signs of activity. One piece each of cat and dog faeces of similar size and weight (approximately 25–30 g), or 20 pellets of rabbit faeces, was presented outside the burrow according to the treatment schedule. Faeces and all faecal traces, including the fine layer of sand on which the faeces were placed, were removed the day after each treatment. Treatments were then replaced at each burrow according to the predetermined balanced order for the duration of the experiment.
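The balanced presentation order described above lends itself to a simple Latin-square construction. The following is a minimal sketch (in Python; not the authors’ actual randomisation procedure, and the treatment labels and burrow numbering are illustrative) of how such an order could be generated so that each treatment occupies each night position equally often across burrows.

```python
# Sketch of a balanced (Latin-square) presentation order for four odour
# treatments across burrows; assumed to approximate the design, not taken
# from the study's own protocol.
from itertools import cycle

treatments = ["cat", "dingo/dog", "rabbit", "control"]

# Rows of a cyclic Latin square: each treatment appears once in each position.
latin_square = [treatments[i:] + treatments[:i] for i in range(len(treatments))]

def assign_orders(burrow_ids):
    """Map each burrow to a four-night presentation order (one treatment per night)."""
    rows = cycle(latin_square)
    return {burrow: next(rows) for burrow in burrow_ids}

orders = assign_orders(range(1, 129))   # 128 burrows, as in the study
print(orders[1])                        # e.g. ['cat', 'dingo/dog', 'rabbit', 'control']
```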

At each burrow, an infrared motion-sensor video camera (Scoutguard SG550V or Scoutguard Zeroglow; Scoutguard, Australia) was mounted on a metal post 20–100 cm off the ground (Fig. 2). Cameras were programmed to record 60-s videos when triggered, with a 0-s interval between possible triggers, from dusk until dawn (1800–0600 h), enabling species identification and observation of behavioural responses to the odour treatments.

Behavioural scoring

We constructed an ethogram of bilby behaviours (Table 1) based upon initial observations of the experimental videos. All behaviours were treated as mutually exclusive (Blumstein and Daniel 2007). We scored video recordings (≤ 60 s) using the event recorder JWatcher (Blumstein and Daniel 2007), quantifying only the first 60-s video from each burrow location in which a bilby was present, with scoring commencing at the start of each video. We did this because our study focused on quantifying bilbies’ initial behavioural responses to the presence of predator faeces and we wanted to avoid our observations being influenced by habituation to the presence of faeces. Because bilbies may not have been within the camera’s field of view at the commencement of scoring, we could not analyse the total time spent on each behaviour and instead calculated the proportion of time in sight allocated to each behaviour. As bilbies were unmarked, we were unable to differentiate between individuals and scored only one video per night per burrow.
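As an illustration of this scoring step, the sketch below (with a hypothetical data layout; JWatcher’s native output format is not reproduced here) shows how the proportion of time in sight could be computed from the durations of mutually exclusive behaviours scored in a single 60-s video.

```python
# Minimal sketch of the proportion-of-time-in-sight (PIS) calculation,
# assuming events are (behaviour, duration_s) pairs covering only the
# periods when the bilby was visible.
def proportion_in_sight(events):
    """Return the proportion of in-sight time allocated to each behaviour."""
    in_sight = sum(duration for _, duration in events)
    totals = {}
    for behaviour, duration in events:
        totals[behaviour] = totals.get(behaviour, 0.0) + duration
    return {b: d / in_sight for b, d in totals.items()} if in_sight else {}

# Example: a bilby visible for 40 s, split across three behaviours.
video = [("investigate odour", 12.0), ("walk", 20.0), ("digging", 8.0)]
print(proportion_in_sight(video))
# {'investigate odour': 0.3, 'walk': 0.5, 'digging': 0.2}
```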

Table 1 Ethogram of greater bilby (Macrotis lagotis) behaviour

For analysis, we combined behaviours in which bilbies were digging outside the burrow entrance and digging within the field of view of the camera, to create a new category ‘digging’ (Table 1). We combined behaviours in which bilbies moved slowly: slow approach (slow movement towards odour treatment and/or burrow), slow entrance (individual enters burrow slowly), slow exit (individual exits burrow slowly) and slow retreat (slow movement away from odour treatment and/or burrow) to form a new category ‘walk’ (Table 1). We combined behaviours in which bilbies moved rapidly: fast approach (rapid movement towards odour treatment and/or burrow), fast entrance (individual enters burrow quickly), fast exit (individual exits burrow rapidly) and fast retreat (rapid movement away from odour treatment and/or burrow) to create the new category ‘run’ (Table 1). In most cases, videos were scored blind with respect to treatment, unless it was possible to visually identify the type of faeces that was deployed.
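The pooling of raw ethogram behaviours into composite categories can be expressed as a simple lookup, as in the sketch below. The behaviour labels are illustrative stand-ins for the Table 1 ethogram entries, not the exact names used in scoring.

```python
# Illustrative mapping from raw (assumed) behaviour labels to the composite
# categories analysed in the study: 'digging', 'walk' and 'run'.
COMPOSITE = {
    "dig outside burrow": "digging", "dig in field of view": "digging",
    "slow approach": "walk", "slow entrance": "walk",
    "slow exit": "walk", "slow retreat": "walk",
    "fast approach": "run", "fast entrance": "run",
    "fast exit": "run", "fast retreat": "run",
}

def pool(proportions):
    """Sum per-behaviour proportions into composite categories; behaviours not
    listed above (e.g. 'investigate odour', 'bi-pedal stance') keep their name."""
    pooled = {}
    for behaviour, p in proportions.items():
        key = COMPOSITE.get(behaviour, behaviour)
        pooled[key] = pooled.get(key, 0.0) + p
    return pooled
```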

Analysis of behavioural data

We fitted a series of linear mixed effects models in SPSS-25 (IBM Corp., Armonk, NY, USA) with a diagonal error structure to test whether faecal odour treatment caused wild bilbies to allocate different proportions of time to the composite behaviours: investigate odour, digging, bi-pedal stance, walk and run. Our models included two fixed effects: treatment (cat, dog, rabbit and control) and presentation order (1 to 4). To account for the possibility of non-independence between observations, we included burrow ID (1 to 128) as a random effect. In no case was presentation order significant; however, we retained it as a repeated measure in the analysis to control for its effect statistically (Quinn and Keough 2002). Because the response variables were not normally distributed and the dataset contained many zero values, we log transformed (log10 [behaviour + 1]) each variable prior to analysis to normalise their distributions (Quinn and Keough 2002). Because we wished to understand the pattern of responses, in instances where the effect of odour was significant (P < 0.05), we used Fisher’s least significant difference (LSD) post hoc analysis to examine planned comparisons (cat vs. dog, cat vs. rabbit, cat vs. control, dog vs. rabbit, dog vs. control and rabbit vs. control) for differences in response to each odour treatment. We set our alpha to 0.05 for all tests.
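For readers working outside SPSS, the following is a rough analogue of one such model in Python using statsmodels. It is a simplified sketch under assumed column and file names, with a random intercept for burrow rather than SPSS’s diagonal repeated-measures error structure, so it would not be expected to reproduce the reported F statistics exactly.

```python
# Approximate analogue of the mixed-model analysis (assumed data layout):
# log10(proportion + 1) response, treatment and presentation order as fixed
# effects, burrow as a random intercept.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file with columns: burrow_id, treatment, pres_order, prop_investigate
df = pd.read_csv("bilby_behaviour.csv")
df["log_investigate"] = np.log10(df["prop_investigate"] + 1)

model = smf.mixedlm(
    "log_investigate ~ C(treatment) + C(pres_order)",  # fixed effects
    data=df,
    groups=df["burrow_id"],                            # random effect: burrow
)
result = model.fit()
print(result.summary())
```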

To test whether burrow location influenced bilby behavioural responses to the odour treatments, we tested for spatial autocorrelation in the residuals of the fitted values for each behaviour, using Moran’s index (I), calculated in the spatial analyst module of ArcGIS v10.2. Spatial autocorrelation occurs when the value of a variable at any one location in space can be predicted from the values at nearby locations. The existence of spatial autocorrelation indicates that sampling units are not independent of one another (Fortin and Dale 2005).
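As an illustration of this spatial check, the sketch below computes a global Moran’s I from model residuals and burrow coordinates using inverse-distance weights. It is an assumed, simplified stand-in for the ArcGIS spatial analyst calculation, not the tool used in the study.

```python
# Minimal sketch of a global Moran's I calculation with inverse-distance
# weights (an assumed weighting scheme; ArcGIS offers several alternatives).
import numpy as np

def morans_i(values, coords):
    """Global Moran's I for residuals `values` at x/y `coords` (n x 2 array)."""
    x = np.asarray(values, dtype=float)
    xy = np.asarray(coords, dtype=float)
    n = len(x)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    with np.errstate(divide="ignore"):
        w = np.where(d > 0, 1.0 / d, 0.0)      # inverse-distance weights, zero diagonal
    z = x - x.mean()
    numerator = np.sum(w * np.outer(z, z))
    return (n / w.sum()) * numerator / np.sum(z ** 2)

# Values near the null expectation of -1/(n - 1) suggest no spatial autocorrelation.
```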

Results

There was a significant effect of treatment on the proportion of time that bilbies spent investigating faecal odours (Table 2; F3,141.361 = 7.073, P ≤ 0.005; Fig. 3a). Planned post hoc comparisons (Table S2 in the Supplementary information) revealed that bilbies spent more time investigating predator faecal odours compared to the experimental control (no faeces) (Fisher’s LSD, cat vs. control, P ≤ 0.005 and dingo/dog vs. control, P ≤ 0.005; Fig. 3a) and a harmless herbivore (rabbit) (Fisher’s LSD, cat vs. rabbit, P = 0.025 and dingo/dog vs. rabbit, P ≤ 0.005; Fig. 3a). There was no significant difference in the time spent investigating cat and dingo/dog faeces (Fisher’s LSD, cat vs. dingo/dog, P = 0.277) or rabbit faeces and the control (Fisher’s LSD, rabbit vs. control, P = 0.403).

Table 2 Results from a series of linear mixed effects models testing for differences between odour treatments (cat, dog, rabbit, and experimental control—no odour) on the mean log proportion of time spent (log10[behaviour + 1]) on each behaviour by wild greater bilbies (Macrotis lagotis)
Fig. 3

The mean (± 1 SEM) proportion of time in sight (PIS) that wild greater bilbies allocated to the behaviours (a) investigate odour, (b) digging, (c) bi-pedal stance, (d) walk and (e) run in response to faecal odour treatments (cat, n = 44; experimental control (no odour), n = 35; dog, n = 38; rabbit, n = 36) outside bilby burrows. Shared letters (e.g. A or B) above bars identify pairwise comparisons that are not statistically distinguishable (P > 0.05) for response variables where a significant main effect was observed

There was a significant effect of treatment on the proportion of time that bilbies allocated to digging (Table 2; F3,131.405 = 2.715, P = 0.047; Fig. 3b). Bilbies spent less time digging outside the burrow entrance and within the vicinity of the burrow when predator faeces were present compared to the experimental control (no faeces) (Fisher’s LSD, cat vs. control, P = 0.038 and dingo/dog vs. control, P = 0.008; Fig. 3b). There was no difference between the time spent digging when predator faeces were present (Fisher’s LSD, cat vs. dingo/dog, P = 0.462). There was no difference in the proportion of time spent digging when cat and rabbit faeces (Fisher’s LSD, cat vs. rabbit, P = 0.427), dingo/dog and rabbit faeces (Fisher’s LSD, dingo/dog vs. rabbit, P = 0.144) and rabbit faeces and the control (Fisher’s LSD, rabbit vs. control, P = 0.215) were present (Table S2 in the Supplementary information).

There was a significant effect of treatment on the proportion of time that bilbies engaged in bi-pedal stance (Table 2; F3,108.206 = 4.572, P = 0.005; Fig. 3c). Bilbies spent less time in a bi-pedal stance when faecal odour treatments were present compared to the experimental control (no faeces) (Fisher’s LSD, cat vs. control, P = 0.018, dingo/dog vs. control, P = 0.006 and rabbit vs. control, P ≤ 0.005; Fig. 3c). There was no difference in the proportion of time allocated to bi-pedal stance when predator (cat and dog) and harmless herbivore (rabbit) faeces were present (Fisher’s LSD, cat vs. dingo/dog, P = 0.604, cat vs. rabbit, P = 0.162 and dingo/dog vs. rabbit, P = 0.380; Fig. 3c).

There was no effect of treatment on the proportion of time that bilbies allocated to walking (Table 2; F3,143.811 = 0.694, P = 0.557; Fig. 3d) or running (F3,110.065 = 0.403, P = 0.751; Fig. 3e). There was no spatial autocorrelation in the residuals of the fitted values for any of the analysed behaviours for bilbies (Table S1 in the Supplementary information). These results indicate that the burrows and treatment sites were independent for the purpose of our analysis.

Discussion

Our results show that bilbies living outside of a safe-haven displayed anti-predator responses towards the olfactory cues of both a long-term predator (dingoes/dogs) and an evolutionarily novel predator (cats). However, from previous research, we know that safe-havened bilbies that were completely isolated from all mammalian predators responded to the faecal odours of their long-term historical predator, the dingo/dog, but not to those of cats (Steindler et al. 2018). These contrasting findings suggest that the anti-predator responses displayed by non-safe-havened bilbies towards cat odour may be the result of lifetime learning (Turner et al. 2006; Saul and Jeschke 2015) or of selection for individuals that have learnt and developed appropriate anti-predator responses over evolutionary time (Kovacs et al. 2012).

Bilbies spent the greatest proportion of time investigating and the least amount of time digging when cat and dingo/dog faeces were present. These findings may be due to bilbies making a trade-off between the costs and benefits of these behaviours (Lima and Dill 1990). Recognition of predator odour cues allows prey to perform anti-predator responses that will increase their chances of survival (Chivers et al. 1995). However, prey animals require information to make these decisions (Bouskila and Blumstein 1992) and often exploit the chemosensory cues found in faeces to obtain information on predator activity level and diet (Ferrero et al. 2011). Thus, approaching and investigating predator cues may allow prey individuals to assess the situation and modify their behaviour according to the perceived predatory threat (Lima and Dill 1990; Cremona et al. 2014; Carthey and Banks 2018). In the case of bilbies, investigating predator scats may have enabled individuals to assess the likelihood of a potentially lethal encounter with a cat and/or dingo within the area. It is important to note, however, that we were unable to test whether predator recognition confers survival benefits on bilbies. Further research is required to determine whether predator recognition and the effectiveness of bilbies’ anti-predator responses are linked.

Bilbies allocated the smallest proportion of time to standing bi-pedally when predator and herbivore faeces were present compared to the control (no odour). Bilbies typically adopt the upright bi-pedal posture when entering or leaving a burrow, or when foraging (Johnson and Johnson 1983). That bilbies reduced the proportion of time they stood bi-pedally to a similar extent when rabbit and predator odours were present suggests that this behaviour was not an anti-predator response.

Previous studies have suggested that naïve species generalise their response to predators, irrespective of their evolutionary and/or lifetime experience, as a result of the common constituents (Dickman and Doncaster 1984; Nolte et al. 1994), such as kairomones, found in carnivore odours (Ferrero et al. 2011). Although bilbies responded to both cat and dingo/dog odour through increased investigation and decreased digging compared to the experimental control (no odour), we do not believe that this was a result of generalisation. Aversion to all carnivore smells may be costly in terms of missed opportunities, such as foraging and mate selection (Lima and Bednekoff 1999). Naïve prey are at a selective advantage if they are able to learn and respond to specific predatory smells, rather than respond to all carnivorous smells (Blumstein et al. 2002; Powell and Banks 2004). Barrio et al. (2010) suggested that the common constituents hypothesis may only apply when taxa are closely related. Since cats and dogs diverged between 52 and 57 million years ago (Hedges et al. 2006), the differences between the two families could be too great for bilbies to generalise the odours. Within the study area, both dingoes and cats are known to predate on bilbies (Lollback et al. 2015). Thus, we suggest that exposure to cat and dingo predation over evolutionary time and throughout their lifetime is likely to be a greater driver of the predator response behaviour displayed by wild bilbies in this study than a generalised response to predator odours per se.

A caveat of our study is that we were unable to test whether the use of alternative odour sources, such as whole body odour or urine, may have led to similar or different results. For example, laboratory rats respond more strongly and consistently to whole body odour than to urine or faecal odours (Masini et al. 2005). Blanchard and Blanchard (2004) suggest that the different responses displayed by rats towards body odour and faecal odour may be explained by the rapid dissipation of body odours in the environment, meaning that fresh body odour indicates imminent danger. We assumed that bilbies’ behavioural responses to faecal odours were a product of their evolutionary history with dingoes and cats. However, based on the research by Masini et al. (2005), as well as the idea that whole body odour samples indicate more imminent risk to prey (Carthey and Banks 2014), this may not be the case. As such, we recommend that further field studies are undertaken to discern the influence of the type of odour used and whether different types of odours elicit different behavioural responses by bilbies.

Our results support the idea that ‘naïve’ prey will not remain eternally naïve and can develop appropriate anti-predator responses towards introduced predators (Banks et al. 2018). However, it is unclear whether these predator recognition abilities have become ‘hard-wired’ or whether they are experience dependent. For example, phenotypic plasticity and learning may provide a valuable short-term response to change, but hinder the potential for long-term adaptation (Schlaepfer et al. 2005). As such, in order to successfully manage bilbies and other species that are ‘naïve’ towards introduced predators, we need to better understand the heritability of anti-predator behaviours and whether recognition of introduced predators is lost and/or gained through lifetime experience (Carthey and Blumstein 2018).

In regions where invasive predators pose a threat to native species, one commonly used strategy to mitigate predator impacts is to establish refuge populations of native species within ‘safe-havens’, such as predator-free islands or predator-free fenced reserves (Legge et al. 2018). However, completely isolating populations from predators runs the risk of creating predator-naïve populations. This is because populations that are isolated from predators may lose their anti-predator responses due to relaxed selection and limited opportunities for learning how to respond to predators (Moseby et al. 2016; Jolly et al. 2018).

In the case of bilbies, the results of this study suggest that bilbies living outside of a safe-haven recognise cats as a threat, whilst Steindler et al. (2018) found that bilbies living within a safe-haven did not respond to cat scent. The contrasting behavioural responses to cat scent displayed by wild bilbies living within and outside of safe-havens have implications for managing populations of bilbies and other endangered mammals within safe-havens. This is because these findings suggest that naïve prey species, such as bilbies, can acquire anti-predator responses when their populations are exposed to predators (Ross et al. 2019), and that completely isolating prey from predators may compromise their anti-predator responses (Moseby et al. 2016; Jolly et al. 2018). One potential solution that has been proposed to tackle the problem of prey naiveté within safe-havened populations is to expose these populations to predators under carefully controlled conditions (Ross et al. 2019; Jolly and Phillips 2020), taking advantage of the behavioural responses that predators can induce in their prey. However, the challenge with such an approach will be providing the conditions necessary for anti-predator skills to be retained and/or developed, whilst also ensuring that prey populations are not driven extinct by predation.