Risk-sensitive foraging: an experimental study of a solitary marsupial.

2005, Vol. 27(2), p. 231. Authors: LN Evans, MA Elgar, KA Handasyde

Detection and avoidance of predators are the principal strategies employed by prey to evade attack; by scanning their environment, prey individuals can reduce the likelihood of a predator approaching to within striking distance (Elgar 1989; Lima and Dill 1990). However, vigilance is often incompatible with foraging behaviours, and thus animals may be forced to trade off the risk of predation against acquiring food. Consequently, the quality of a particular resource patch and its associated predation risk may influence the foraging decisions of animals (Werner et al. 1983; Newman and Caraco 1987; Heithaus and Dill 2002). Cover is an important feature of a foraging site because it can provide a hiding place to escape potential predators (Lazarus and Symonds 1992). Thus, animals may prefer foraging sites that are close to cover, or adjust their level of vigilance at different distances from cover in order to compensate for changes in the chance of early detection and escape (Elgar 1989; Lima and Dill 1990; Lima et al. 1985; Kramer and Bonenfant 1997).

2020, Vol. 98(8), pp. 541-550. Authors: F. Bjornson, M. Earhart, W.G. Anderson

Balancing foraging opportunities with predation risk can promote complex behavioural strategies in juvenile fishes, particularly in northern temperate environments with short growing seasons. To test how predation experience may influence foraging effort and risk assessment of juvenile lake sturgeon (Acipenser fulvescens Rafinesque, 1817), flight response and substrate preference behavioural measurements were taken during critical life periods of early exogenous feeding (∼60 days post fertilization (dpf)) and pre-winter (∼160 dpf). Lake sturgeon were placed in arenas with partial cover and an exposed white plastic bottom. Chemical alarm cue (AC) was introduced to predator-naïve individuals in the presence or absence of food over the exposed portion of the arena to simulate risk-sensitive foraging over diurnal and seasonal periods. The same protocol was run on predator-experienced individuals, which were classically conditioned to predator cue (PC) prior to the trials. Whole-body cortisol measures were also taken to determine the physiological response to predation experience. Results suggest a propensity to forage in spite of predation risk during the naïve ∼60 dpf trials and highlight context-specific anti-predator responses of naïve and experienced lake sturgeon. Elevated basal whole-body cortisol levels and reduced body condition (p < 0.05) were observed with increased predator experience.


2007, Vol. 28(2), pp. 304-308. Authors: Don Bradshaw, Xavier Bonnet, Fabien Aubret

Foraging behaviour is influenced by an animal's level of hunger, and may reflect a trade-off between optimizing food acquisition and avoiding predation. Young tiger snakes were raised either on a high or low food diet and exposed to a predation threat while foraging. Under these circumstances, lower-condition snakes (low food diet) were prone to take additional feeding/foraging risks: food was accepted at a much higher rate compared with the higher-condition animals (high food diet) that were less inclined to risk feeding under a predation threat. This study provides the first direct example of predation risk-associated foraging decisions in snakes.


2015, Vol. 282(1810), p. 20150124. Authors: Tracey Hollings, Hamish McCallum, Kaely Kreger, Nick Mooney, Menna Jones

Apex predators structure ecosystems through lethal and non-lethal interactions with prey, and their global decline is causing loss of ecological function. Behavioural changes of prey are some of the most rapid responses to predator decline and may act as an early indicator of cascading effects. The Tasmanian devil (Sarcophilus harrisii), an apex predator, is undergoing progressive and extensive population decline, of more than 90% in long-diseased areas, caused by a novel disease. Time since local disease outbreak correlates with devil population declines and thus predation risk. We used hair traps and giving-up densities (GUDs) in food patches to test whether a major prey species of devils, the arboreal common brushtail possum (Trichosurus vulpecula), is responsive to the changing risk of predation when foraging on the ground. Possums spend more time on the ground, discover food patches faster and deplete patches to a lower GUD with increasing years since disease outbreak and greater devil population decline. Loss of top–down effects of devils with respect to predation risk was evident at 90% devil population decline, with possum behaviour indistinguishable from that on a devil-free island. Alternative predators may help to maintain risk-sensitive anti-predator behaviours in possums while devil populations remain low.


2019, Vol. 286(1907), p. 20190826. Authors: Jesse Balaban-Feld, William A. Mitchell, Burt P. Kotler, Sundararaj Vijayan, Lotan T. Tov Elem, et al.

Refuges offer prey animals protection from predation, but increased time spent hiding can reduce foraging opportunities. Within social groups, individuals vary in their refuge use and willingness to forage in the presence of a predator. Here, we examine the relative foraging benefits and mortality costs associated with individual refuge use and foraging behaviour within groups of goldfish (Carassius auratus) under predation risk from an avian predator (little egret, Egretta garzetta). We assessed individual order of emergence from the refuge and participation over 15 group foraging outings, and assigned each fish a daily outing index score. The individual fish that emerged from the refuge earlier than the other group members and that participated in more outings received high outing index scores and consumed more food compared with fish that tended to emerge in posterior positions and participate in fewer outings. However, individual fish that attained high outing index scores suffered a higher risk of predation. Furthermore, the amount of time the egret spent at the pool affected group foraging behaviour: as predation risk increased, groups of fish consumed significantly less food. Our results exemplify the trade-off between foraging success and safety from predation that prey species regularly experience.


2017, Vol. 284(1858), p. 20170757. Authors: Philip D. DeWitt, Matthew S. Schuler, Darcy R. Visscher, Richard P. Thiel

Animal populations are regulated by the combined effects of top-down, bottom-up and abiotic processes. Ecologists have struggled to isolate these mechanisms because their effects on prey behaviour, nutrition, security and fitness are often interrelated. We monitored how forage, non-consumptive effects (NCEs), consumptive predation and climatic conditions influenced the demography and nutritional state of a wild prey population during predator recolonization. Combined measures of nutrition, survival and population growth reveal that predators imposed strong effects on the prey population through interacting non-consumptive, consumptive and forage mechanisms. Predation was directly responsible for declines in adult survival, while declining recruitment was attributed to predation risk-sensitive foraging, manifested in poor female nutrition and reduced juvenile recruitment. Substituting nutritional state into the recruitment model through a shared term reveals that predation risk-sensitive foraging was nearly twice as influential as summer forage conditions. Our findings provide a novel, mechanistic insight into the complex means by which predators and forage conditions affect prey populations, and point to a need for more ecological studies that integrate behaviour, nutrition and demography. This line of inquiry can provide further insight into how NCEs interactively contribute to the dynamics of terrestrial prey populations; particularly, how predation risk-sensitive foraging has the potential to stabilize predator–prey coexistence.


2016, Vol. 73(6), pp. 869-876. Authors: Melissa Pink, Mark V. Abrahams

Metabolic rates of fish and their activity levels have thermal optima. When environmental temperatures are below these optima, increasing temperature will increase rates of energy consumption, resulting in a corresponding increase in the risk of starvation. We therefore predicted that within this temperature range, food is of greater value at higher temperatures, so fish should be willing to incur greater costs to obtain it. To test this hypothesis, we measured how the activity and foraging rates of the fathead minnow (Pimephales promelas) changed with temperature at 4, 15, and 24 °C. As expected, fish activity and foraging were greater at higher temperatures. We then measured the impact of predation risk on foraging decisions at 5, 15, and 23 °C. At 5 and 15 °C, the risk of predation had a significant effect on foraging decisions, but there was no effect at 23 °C. These results demonstrate that, below the thermal optimum, increasing temperature diminishes the impact of predation risk on foraging behaviour. This may mean that the direct consumptive effect of predators on aquatic communities is greater at warmer temperatures while the risk of predation becomes a less important factor, with the reverse at cooler temperatures.


2020, Vol. 32(2), pp. 91-101. Authors: Steven E. Kaplan, Danny Lanier, Kelly R. Pope, Janet A. Samuels

Whistleblowing reports, if properly investigated, facilitate the early detection of fraud. Although critical, investigation-related decisions represent a relatively underexplored component of the whistleblowing process. Investigators are responsible for initially deciding whether to follow up on reports alleging fraud. We report the results of an experimental study examining the follow-up intentions of highly experienced healthcare investigators. Participants, in the role of an insurance investigator, are asked to review a whistleblowing report alleging billing fraud occurring at a medical provider. Thus, participants are serving as external investigators. In a between-participant design, we manipulate the report type and whether the caller previously confronted the wrongdoer. We find that compared with an anonymous report, a non-anonymous report is perceived as more credible and follow-up intentions are stronger. We also find that perceived credibility fully mediates the relationship between report type and follow-up intentions. Previous confrontation is not significantly associated with either perceived credibility or follow-up intentions. Data Availability: Data are available upon request.

