Testing Whether Camera Presence Influences Habitat-Specific Predation Pressure on Artificial Shorebird Nests in the Arctic

ARCTIC ◽  
2021 ◽  
Vol 74 (1) ◽  
pp. 22-29
Author(s):  
Kevin G. Young ◽  
Lisa V. Kennedy ◽  
Paul A. Smith ◽  
Erica Nol

When monitoring the breeding ecology of birds, the causes and times of nest failure can be difficult to determine. Cameras placed near nests allow for accurate monitoring of nest fate, but their presence may increase the risk of predation by attracting predators, leading to biased results. The relative influence of cameras on nest predation risk may also depend on habitat, because predator numbers or behaviour can change in response to the availability or accessibility of nests. We evaluated the impact of camera presence on the predation rate of artificial nests placed within mesic tundra habitats used by Arctic-breeding shorebirds. We deployed 94 artificial nests, half with cameras and half without, during the shorebird-nesting season of 2015 in the East Bay Migratory Bird Sanctuary, Nunavut. Artificial nests were distributed evenly across sedge meadow and supratidal habitats typically used by nesting shorebirds. We used the Cox proportional hazards model to assess differential nest survival in relation to camera presence, habitat type, placement date, and all potential interactions. Artificial nests with cameras did not experience higher predation risk than those without cameras. Predation risk of artificial nests was related to an interaction between habitat type and placement date. Nests deployed in sedge meadows, and in supratidal habitats later in the season, were subject to a higher risk of predation than those deployed in supratidal habitats early in the season. These differences in predation risk are likely driven by the foraging behaviour of the Arctic fox (Vulpes lagopus), a species that accounted for 81% of observed predation events in this study. Arctic foxes prey primarily on arvicoline rodents and goose eggs at this site and take shorebird nests opportunistically, perhaps more often later in the season when their preferred prey become scarcer. This study demonstrates that, at this site, cameras used for nest monitoring do not influence predation risk.
Evaluating the impact of cameras on predation risk is critical prior to their use, as individual study areas may differ in terms of predator species and behaviour.
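The nest-survival analysis above rests on the Cox partial likelihood. As an illustration only (not the authors' code; synthetic data, a single binary covariate such as camera presence, and no tied event times), a minimal pure-Python sketch:

```python
import math
import random

def cox_neg_loglik(beta, rows):
    """Negative Cox partial log-likelihood for one covariate.

    rows: list of (time, event, x) tuples; assumes no tied event times.
    """
    rows = sorted(rows, key=lambda r: r[0])
    ll = 0.0
    for t, event, x in rows:
        if not event:
            continue  # censored observations contribute only to risk sets
        # risk set: all subjects still under observation at time t
        denom = sum(math.exp(beta * xj) for tj, ej, xj in rows if tj >= t)
        ll += beta * x - math.log(denom)
    return -ll

def fit_cox(rows, lo=-3.0, hi=3.0, step=0.05):
    """Grid-search maximizer of the partial likelihood (illustration only)."""
    grid = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
    return min(grid, key=lambda b: cox_neg_loglik(b, rows))

if __name__ == "__main__":
    random.seed(1)
    # synthetic data: x = 1 doubles the hazard (true log HR = log 2, about 0.69)
    rows = [(random.expovariate(2.0), 1, 1) for _ in range(60)] + \
           [(random.expovariate(1.0), 1, 0) for _ in range(60)]
    print(round(fit_cox(rows), 2))
```

With the true hazard ratio set to 2, the estimated coefficient should land near log 2; exponentiating it recovers the hazard ratio reported in abstracts like the one above.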

Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06), but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed similar patterns to the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as their age at exposure.


Antibiotics ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 105
Author(s):  
Jatapat Hemapanpairoa ◽  
Dhitiwat Changpradub ◽  
Sudaluck Thunyaharn ◽  
Wichai Santimaleeworagun

The prevalence of enterococcal infection, especially E. faecium, is increasing, and the impact of vancomycin resistance on clinical outcomes remains controversial. This study aimed to investigate the clinical outcomes of infection caused by E. faecium and determine the risk factors associated with mortality. This retrospective study was performed at the Phramongkutklao Hospital during the period from 2014 to 2018. One hundred and forty-five patients with E. faecium infections were enrolled. The 30-day and 90-day mortality rates of patients infected with vancomycin-resistant (VR)-E. faecium vs. vancomycin-susceptible (VS)-E. faecium were 57.7% vs. 38.7% and 69.2% vs. 47.1%, respectively. The median length of hospitalization was significantly longer in patients with VR-E. faecium infection. In logistic regression analysis, VR-E. faecium, Sequential Organ Failure Assessment (SOFA) scores, and bone and joint infections were significant risk factors associated with both 30-day and 90-day mortality. Moreover, a Cox proportional hazards model showed that VR-E. faecium infection (HR 1.91; 95% CI 1.09–3.37), SOFA scores of 6–9 points (HR 2.69; 95% CI 1.15–6.29), SOFA scores ≥ 10 points (HR 3.71; 95% CI 1.70–8.13), and bone and joint infections (HR 0.08; 95% CI 0.01–0.62) were significant factors for mortality. In conclusion, the present study confirmed the impact of VR-E. faecium infection on mortality and hospitalization duration. Thus, an appropriate antibiotic regimen for VR-E. faecium infection, especially for severely ill patients, is an effective strategy for improving treatment outcomes.


Crustaceana ◽  
2015 ◽  
Vol 88 (7-8) ◽  
pp. 839-856 ◽  
Author(s):  
J. Hesse ◽  
J. A. Stanley ◽  
A. G. Jeffs

Kelp habitats are in decline in many temperate coastal regions of the world due to climate change and the expansion of populations of grazing urchins. The loss of kelp habitat may influence the vulnerability to predators of the juveniles of commercially important species. In this study, relative predation rates in kelp versus barren reef habitat were measured for early juvenile Australasian spiny lobster, Jasus edwardsii (Hutton, 1875), on the northeastern coast of New Zealand using tethering methods. Variation in assemblages of predators over small spatial scales appeared to be more important than habitat type in determining the relative predation of lobsters. Therefore, assessing the relative predation risk to early juvenile lobsters between kelp and barren habitats will require more extensive sampling at small spatial scales, as well as a specific focus on sampling during crepuscular and nocturnal periods, when these lobsters are most at risk of predation.


2010 ◽  
Vol 37 (4) ◽  
pp. 273 ◽  
Author(s):  
Karen Fey ◽  
Peter B. Banks ◽  
Hannu Ylönen ◽  
Erkki Korpimäki

Context. Potential mammalian prey commonly use the odours of their co-evolved predators to manage their risks of predation. But when the risk comes from an unknown source of predation, odours might not be perceived as dangerous, and anti-predator responses may fail, except possibly if the alien predator is of the same archetype as a native predator. Aims. In the present study we examined anti-predator behavioural responses of voles from the outer archipelagos of the Baltic Sea, south-western Finland, where they have had no resident mammalian predators in recent history. Methods. We investigated responses of field voles (Microtus agrestis) to odours of native least weasels (Mustela nivalis) and a recently invading alien predator, the American mink (Mustela vison), in the laboratory. We also studied the short-term responses of free-ranging field voles and bank voles (Myodes glareolus) to simulated predation risk by alien mink on small islands in the outer archipelago of the Baltic Sea. Key results. In the laboratory, voles avoided odour cues of the native weasel but not of the alien mink. It is possible that the response to mink is a context-dependent learned response that could not be induced in the laboratory, whereas the response to weasel is innate. In the field, however, voles reduced activity during their normal peak-activity times at night in response to simulated alien-mink predation risk. No other shifts in space use or activity towards safer microhabitats or denser vegetation were apparent. Conclusions. Voles appeared to recognise alien mink as predators from their odours in the wild. However, reduction in activity is likely to be only a short-term immediate response to mink presence, which is augmented by longer-term strategies of habitat shift. Because alien mink still strongly suppress vole dynamics despite these anti-predator responses, we suggest that behavioural naiveté may not be the primary factor in the impact of an alien predator on native prey. Implications. 
Prey naiveté has long been considered the root cause of the devastating impacts of alien predators, whereby native prey simply fail to recognise and respond to the novel predation risk. Our results reveal a more complex form of naiveté, whereby native prey appeared to recognise alien predators as a threat but their response was ultimately inadequate. Recognition alone is therefore unlikely to afford native prey protection from alien-predator impacts, and management strategies that, for example, train prey to recognise novel threats must also induce effective responses if they are to succeed.


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Tomonori Akasaka ◽  
Seiji Hokimoto ◽  
Noriaki Tabata ◽  
Kenji Sakamoto ◽  
Kenichi Tsujita ◽  
...  

Background: The 2011 ACCF/AHA/SCAI PCI guideline recommends that PCI be performed at hospitals with onsite cardiac surgery. However, recent data suggest that there is no significant difference in clinical outcomes following primary or elective PCI between hospitals with and without onsite cardiac surgery. PCI centers without onsite cardiac surgery comprise more than half of all PCI centers in Japan. We examined the impact of onsite cardiac surgery availability on clinical outcomes following PCI for ACS. Methods: From August 2008 to March 2011, subjects (n=2288) were enrolled from the Kumamoto Intervention Conference Study (KICS), a multicenter registry enrolling consecutive patients undergoing PCI in 15 centers in Japan. Patients were assigned to two groups treated in hospitals with (n=1954) or without (n=334) onsite cardiac surgery. Clinical events were followed up for 12 months. The primary endpoint was a composite of in-hospital death, cardiovascular death, myocardial infarction, and stroke. We also monitored other events: non-cardiovascular death, bleeding complications, revascularization, and emergent CABG. Results: There was no significant difference in the primary endpoint between hospitals with and without onsite cardiac surgery (9.6% vs 9.5%; P=0.737), nor when the components of the primary endpoint were considered separately. Among the other events, only revascularization was more frequent in hospitals with onsite cardiac surgery (22.1% vs 12.9%; P<0.001). Kaplan-Meier analysis for the primary endpoint showed no significant difference between the two groups (log-rank P=0.943). In a Cox proportional hazards model, the absence of onsite cardiac surgery was not a predictive factor for the primary endpoint (HR 0.969, 95% CI 0.704-1.333; P=0.845). 
We performed propensity score matching to correct for the disparate patient numbers between the two groups, and again found no significant difference in the primary endpoint (6.9% vs 8.0%; P=0.544). Conclusions: There is no significant difference in clinical outcomes following PCI for ACS between hospitals with and without onsite cardiac surgery backup in Japan.
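Propensity score matching, as used in the study above, pairs each patient in one group with the most similar patient in the other on the estimated probability of group membership. A hedged, self-contained sketch (synthetic data and plain gradient-descent logistic regression, not the study's actual procedure):

```python
import math
import random

def fit_logistic(X, y, lr=0.5, iters=300):
    """Plain gradient-descent logistic regression; returns [intercept, coefs...]."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)  # w[0] is the intercept
    for _ in range(iters):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def propensity(xi, w):
    """Estimated probability of treatment given covariates xi."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

def match(scores, treated, controls):
    """Greedy 1:1 nearest-neighbour matching on the score, without replacement."""
    pairs, used = [], set()
    for i in treated:
        candidates = [c for c in controls if c not in used]
        if not candidates:
            break
        j = min(candidates, key=lambda c: abs(scores[i] - scores[c]))
        used.add(j)
        pairs.append((i, j))
    return pairs
```

After matching, covariate means in the two matched groups should be closer than in the raw groups; that balance check, not the score model itself, is the usual acceptance criterion.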


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. 4562-4562
Author(s):  
Thomas E. Hutson ◽  
Toni K. Choueiri ◽  
Robert J. Motzer ◽  
Sun Young Rha ◽  
Anna Alyasova ◽  
...  

4562 Background: The multicenter, open-label, randomized, phase 3 CLEAR study showed that LEN + EVE had a significant PFS benefit (HR 0.65, 95% CI 0.53-0.80, P<0.001) and improved objective response rate (relative risk 1.48, 95% CI 1.26-1.74) vs SUN in the first-line treatment of patients (pts) with advanced RCC. The difference in overall survival (OS) for LEN + EVE vs SUN was not statistically significant (HR 1.15, 95% CI 0.88-1.50) (Motzer R et al. NEJM. 2021). Post hoc subgroup analyses were performed to assess the impact of subsequent therapy on OS. Methods: Pts in the CLEAR study were randomly assigned (1:1:1) to 1 of 3 treatment arms, including LEN 18 mg + EVE 5 mg once daily (QD) and SUN 50 mg QD (4 weeks on then 2 weeks off). These post hoc analyses examined OS by subsequent systemic anticancer medication in the LEN + EVE and SUN arms. Hazard ratios (HR; LEN + EVE vs SUN) were based on a Cox proportional hazards model stratified by geographic region and MSKCC prognostic risk group. Results: Among 1069 pts with advanced RCC randomized in the CLEAR study, 714 pts were randomly assigned to the LEN + EVE and SUN arms (N=357 each). The median duration of survival follow-up was 27 months in the LEN + EVE arm and 26 months in the SUN arm. Given the shorter median duration of study treatment with SUN (7.8 months) vs LEN + EVE (11.0 months), more pts in the SUN arm received subsequent anticancer therapy during survival follow-up (LEN + EVE, n=167; SUN, n=206). Among pts who received subsequent therapy, pts in the LEN + EVE arm had a longer median time from randomization to initiation of subsequent therapy vs those in the SUN arm (8.0 vs 6.6 months, respectively). OS for the overall population, for pts with no subsequent anticancer therapy, and for pts with no subsequent immunotherapy is shown in the table. 
In the US population subgroup (LEN + EVE, n=62; SUN, n=61) of the CLEAR study, in which a similar number of pts received subsequent systemic anticancer therapies in the LEN + EVE vs SUN arms (62.9% vs 65.6%, respectively), OS was comparable among the 2 arms (HR 0.95, 95% CI 0.51-1.76). Overall, the safety profile was consistent with the known safety profiles of LEN + EVE and SUN. In both arms, most treatment-emergent deaths were due to progressive disease; there were few treatment-related deaths (<1%, per arm) and no clustering of events. Conclusions: In the CLEAR study, LEN + EVE met the primary endpoint of a significant benefit in PFS vs SUN. The results of these exploratory analyses suggest that subsequent systemic anticancer therapy affected the OS outcome results for LEN + EVE vs SUN in the CLEAR study. Clinical trial information: NCT02811861. [Table: see text]


2019 ◽  
Vol 50 (2) ◽  
pp. 237-255 ◽  
Author(s):  
Joshua Meyer-Gutbrod

Abstract The U.S. Supreme Court’s decision to grant states the authority to reject Medicaid expansion under the Affordable Care Act without penalty threatened the implementation of this polarized health policy. While many Republican-controlled states followed their national allies and rejected Medicaid expansion, others engaged in bipartisan implementation. Why were some Republican states willing to reject the national partisan agenda and cooperate with Democrats in Washington? I focus on the role of electoral competition within states. I conclude that although electoral competition has been shown to encourage partisan polarization within the states, the combination of intergovernmental implementation and Medicaid expansion’s association with public welfare reverses this dynamic. I employ a Cox proportional-hazards model to examine the impact of state partisan ideology and competition on the likelihood of state Medicaid expansion. I find that strong inter-party competition mitigates the impact of more extreme partisan ideologies, encouraging potentially bipartisan negotiation with the federal administration.


2020 ◽  
Vol 90 (7) ◽  
pp. 1057-1086 ◽  
Author(s):  
Marcelo Cajias ◽  
Philipp Freudenreich ◽  
Anna Freudenreich

Abstract In this paper, the liquidity (inverse of time on market) of rental dwellings and its determinants for different liquidity quantiles are examined for the seven largest German cities. The determinants are estimated using censored quantile regressions in order to investigate the impact on very liquid to very illiquid dwellings. As market heterogeneity is observed not only between cities but also within a city, each of the seven cities is considered individually. Micro data for almost 500,000 observations from 2013 to 2017 is used to examine the time on market. Substantial differences in the magnitude and direction of the regression coefficients for the different liquidity quantiles are found. Furthermore, both the magnitude and direction of the impact of an explanatory variable on liquidity differ between the cities. To the best of the authors' knowledge, this is the first paper to apply censored quantile regressions to a liquidity analysis of the real estate rental market. The model reveals that the proportionality assumption underlying the Cox proportional hazards model cannot be confirmed for all variables across all cities, but holds for most of them.
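Time on market is a right-censored duration: dwellings still listed at the end of the sample have an unknown final time on market. The standard non-parametric starting point for such data is the Kaplan-Meier product-limit estimator (an illustration only; the paper itself uses censored quantile regressions):

```python
def kaplan_meier(durations, observed):
    """Product-limit estimate of the survival function S(t).

    durations: time on market for each listing;
    observed:  1 if the letting was observed, 0 if the listing was censored.
    Returns (t, S(t)) pairs, one per distinct event time.
    """
    data = sorted(zip(durations, observed))
    n = len(data)
    at_risk, surv, curve = n, 1.0, []
    i = 0
    while i < n:
        t = data[i][0]
        leaving = sum(1 for tj, ej in data if tj == t)       # all exits at t
        events = sum(1 for tj, ej in data if tj == t and ej)  # observed lettings at t
        if events:
            surv *= 1.0 - events / at_risk
            curve.append((t, surv))
        at_risk -= leaving
        i += leaving
    return curve
```

The median time on market can then be read off as the first t where S(t) falls to 0.5 or below; censored listings shrink the risk set without triggering a downward step.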


2018 ◽  
Vol 28 (2) ◽  
pp. 151-156
Author(s):  
Yefei Zhang ◽  
Maha R. Boktour

Introduction: The United Network for Organ Sharing (UNOS) instituted the Share 35 policy in June 2013 in order to reduce deaths on the liver transplant waitlist. The effect of this policy on survival among patients with gender- and race-mismatched donors has not been examined. Research Question: To assess the impact of the Share 35 policy on posttransplantation patient survival among patients with end-stage liver disease (ESLD) transplanted with gender- and race-mismatched donors. Design: A total of 16,467 adult patients with ESLD who underwent liver transplantation between 2012 and 2015 were identified from UNOS. An overall Cox proportional hazards model adjusting for demographic, clinical, and geographic factors, and separate models with a dummy variable for the pre- and post-Share 35 periods as well as its interactions with other factors, were used to model the effect of gender and race mismatch on posttransplantation patient survival and to compare survival during the first 18 months of the Share 35 policy with an equivalent period before it. Results: Comparison of the pre- and post-Share 35 periods did not show significant changes in the numbers of gender- and race-mismatched transplants, or in the risk of death for gender-mismatched recipients. However, black recipients with Hispanic donors (hazard ratio: 0.51, 95% confidence interval, 0.29-0.90) had significantly improved patient survival after the Share 35 policy took effect. Conclusion: The Share 35 policy had a moderate impact on posttransplantation patient survival among recipients with racially mismatched donors according to the first 18-month experience. Future research is recommended to explore long-term transplantation outcomes.


2014 ◽  
Vol 34 (3) ◽  
pp. 289-298 ◽  
Author(s):  
Jernej Pajek ◽  
Alastair J. Hutchison ◽  
Shiv Bhutani ◽  
Paul E.C. Brenchley ◽  
Helen Hurst ◽  
...  

Background: We performed a review of a large incident peritoneal dialysis cohort to establish the impact of current practice and that of switching to hemodialysis. Methods: Patients starting peritoneal dialysis between 2004 and 2010 were included and clinical data at the start of dialysis recorded. Competing risk analysis and a Cox proportional hazards model with a time-varying covariate (technique failure) were used. Results: Of 286 patients (median age 57 years) followed for a median of 24.2 months, 76 were transplanted and 102 died. Outcome probabilities at 3 and 5 years, respectively, were 0.69 and 0.53 for patient survival (or transplantation) and 0.33 and 0.42 for technique failure. Peritonitis caused technique failure in 42% of cases, but ultrafiltration failure accounted for only 6.3%. Davies comorbidity grade, creatinine, and obesity (but not residual renal function or age) predicted technique failure. Due to peritonitis deaths, technique failure was an independent predictor of the death hazard. When successful switch to hemodialysis (surviving more than 60 days after technique failure) and its timing were analyzed, no adverse impact on survival was found in adjusted analysis. However, hemodialysis via a central venous line was associated with an elevated death hazard compared with staying on peritoneal dialysis or hemodialysis through a fistula (adjusted hazard ratio 1.97 (1.02–3.80)). Conclusions: Once patients survive the first 60 days after technique failure, the switch to hemodialysis does not adversely affect patient outcomes. The nature of vascular access has a significant impact on outcome after peritoneal dialysis failure.
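A Cox model with a time-varying covariate, as used above, is typically fitted on data in counting-process format, where each patient contributes one (start, stop] row per covariate state. A small sketch of how a switch from peritoneal dialysis to hemodialysis might be encoded (hypothetical field names and layout, not the authors' actual data structure):

```python
def counting_process_rows(follow_up, died, switch_time=None):
    """Split one patient's follow-up at the PD-to-HD switch.

    follow_up:   total observed time (months);
    died:        1 if the patient died at end of follow-up, 0 if censored;
    switch_time: time of technique failure / switch to HD, or None.
    Returns (start, stop, event, on_hd) rows: the time-varying indicator
    on_hd is 0 before the switch and 1 afterwards.
    """
    if switch_time is None or switch_time >= follow_up:
        return [(0.0, follow_up, died, 0)]
    return [(0.0, switch_time, 0, 0),          # PD interval: no event yet
            (switch_time, follow_up, died, 1)]  # HD interval carries the outcome
```

For example, a patient who switched at month 10 and died at month 24 contributes (0, 10, 0, 0) and (10, 24, 1, 1); the switch itself is never counted as an event, only the terminal row can carry one.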

