Effects of natural weathering conditions on faecal cortisol metabolite measurements in the greater bilby (Macrotis lagotis)

2013 ◽  
Vol 61 (5) ◽  
pp. 351 ◽  
Author(s):  
Nicole Evans ◽  
Edward J. Narayan ◽  
Jean-Marc Hero

Natural weathering conditions can influence faecal cortisol metabolite (FCM) measurements in wildlife if fresh faeces cannot be collected immediately after defaecation. In this study, we evaluated this issue in a threatened Australian marsupial, the greater bilby (Macrotis lagotis). Fresh (<12 h since defaecation) faecal samples (n = 19 pellets per bilby) were collected one morning from seven adult bilbies kept in captivity. One control faecal pellet (Day 1) from each bilby was immediately frozen; the remaining pellets were randomly positioned outdoors, and one pellet per bilby was frozen every 24 h for 19 days. FCM levels in bilby faeces were quantified using an enzyme immunoassay. Mean FCM levels varied over the 19 days, with daily mean coefficients of variation (CV%) of 56.83–171.65%. Overall, FCM levels were affected by exposure time; however, multiple comparisons showed that no significant change in FCM occurred after environmental exposure (no significant difference in mean FCM between the control (Day 1) and any of the exposure days (Days 2–19)). Individual identity and sex also affected FCM levels. We found no correlation between mean daily CVs and daily minimum and maximum temperatures or rainfall. Our results indicate that FCM in bilby faeces is fairly stable under long-term environmental exposure (19 days). In future, freshly excreted bilby faeces (samples maintain a distinct odour for 9–13 days) should be used to study FCM levels in wild bilbies.
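The daily coefficient of variation used above is the standard definition (sample SD divided by the mean, times 100, computed across individuals for each exposure day). A minimal sketch of that calculation, using hypothetical FCM readings rather than the study's data:

```python
import statistics

def daily_cv_percent(fcm_values):
    """Coefficient of variation (%) for one exposure day:
    sample standard deviation / mean * 100."""
    mean = statistics.mean(fcm_values)
    return statistics.stdev(fcm_values) / mean * 100

# Hypothetical FCM readings (arbitrary units) for seven bilbies on one day:
day_readings = [12.1, 30.5, 8.7, 55.2, 14.0, 9.3, 41.8]
cv = daily_cv_percent(day_readings)
```

A CV near 0% would mean near-identical readings across individuals; the 56–172% range reported above reflects large between-individual spread.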

Parasitology ◽  
1948 ◽  
Vol 39 (1-2) ◽  
pp. 26-38 ◽  
Author(s):  
H. D. Crofton

1. Eggs and larvae of Trichostrongylus retortaeformis were used.
2. The rate of hatching of eggs was shown to be mainly related to temperature. From November to March, when maximum temperatures were below 50°F, there was no hatching. When maximum temperatures of 50–55°F occurred, eggs hatched on or before the fifteenth day, but never during the first 8 days. Eggs hatched in 8 days or less when maximum temperatures of 60–80°F occurred.
3. When the rate of evaporation in the air was high, eggs still hatched and reached the infective stage, the grass blades reducing the rate of moisture loss from the faecal pellet. Laboratory experiments show that eggs may not develop to the infective stage if the faecal pellets are on a grassless portion of the pasture. This is most likely to occur when the rate of evaporation is high and the temperature low.
4. Hatching may be delayed by cold conditions, but some eggs remain viable for long periods and hatch when the temperature rises. Eggs passed by the host in the autumn can survive a cold winter and hatch in the spring, but eggs passed during the coldest period die.
5. During periods when the maximum temperature never exceeded 55°F, little or no migration of larvae occurred. When temperatures rose above 55°F the number of migrating larvae increased, but the rise in temperature was associated with an increase in the rate of evaporation. High rates of evaporation reduced the number of larvae migrating on the grass blades.
6. Some infective larvae died soon after exposure on grass plots, but a small number survived for long periods. In cold weather some larvae were still alive after 20 weeks. A high death-rate occurred in warm weather. A large proportion of the larvae died during periods in which the rate of evaporation was high; in one such period, 95% of the larvae were dead at the end of 4 weeks' exposure.
7. The number of larvae on the grass blades of a pasture was shown to depend, at any time, on the climate at that time and on past conditions that had influenced hatching and survival.


Antibiotics ◽  
2020 ◽  
Vol 9 (12) ◽  
pp. 910
Author(s):  
Håkon Kaspersen ◽  
Anne Margrete Urdahl ◽  
Carl Andreas Grøntvedt ◽  
Stine Margrethe Gulliksen ◽  
Bereket Tesfamichael ◽  
...  

Norway has a favourable situation with regard to health status and antimicrobial usage in the pig production sector. However, one of the major disease-causing agents in the commercial pig population is Actinobacillus pleuropneumoniae (APP). In some herds, APP eradication has been performed by using enrofloxacin in combination with a partial herd depopulation. The aim of this study was to investigate the long-term effects of a single treatment event with enrofloxacin on the occurrence of quinolone-resistant Escherichia coli (QREC). The study was designed as a retrospective case/control study, with herds selected on the basis of treatment history. Faecal samples were taken from sows, gilts, fattening pigs and weaners in each herd, where available. A semi-quantitative culturing method was used to determine the relative quantity of QREC in the faecal samples. A significant difference in overall occurrence and relative quantity of QREC was identified between the case and control herds, as well as between each animal age group within the case/control groups. The results indicate that a single treatment event with enrofloxacin significantly increased the occurrence of QREC in the herd, even years after treatment and with no subsequent exposure to quinolones.


1981 ◽  
Vol 8 (2) ◽  
pp. 237 ◽  
Author(s):  
GJE Hill

During May and July 1978, two faecal pellet surveys were conducted to obtain indices of abundance for a stable population of grey kangaroos within a 33-km² block of state forest in southern Queensland. The study area was divided into 25-ha cells, of which approximately one-fifth were selected at random for survey. Each cell was sampled by two parallel transects 100 m apart. Along each transect, 25 regularly spaced 0.001-ha circular plots were searched for faecal pellets within particular age ranges. The results were 4634 ± 19% and 5071 ± 19% pellets km⁻² per day, respectively (mean ± s.e.). The two surveys showed no significant difference in estimates of mean density. Preliminary surveys showed no significant difference in estimates of faecal pellet density between plots of 0.001 and 0.0003 ha; this held for counts of both pellet totals and pellet-group totals. Sampling efficiency was superior for the larger plot.
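Scaling raw plot counts to a per-km² deposition index is simple area arithmetic (1 km² = 100 ha, so each 0.001-ha plot is 1/100 000 of a km²). The helper below is an illustrative sketch, not the survey's actual method; the `days` divisor stands in for the assumed age window of countable pellets, which is a hypothetical parameter here:

```python
def pellets_per_km2_per_day(total_pellets, n_plots, plot_area_ha=0.001, days=1.0):
    """Convert pellet counts from small circular plots to pellets km^-2 day^-1.

    total_pellets : pellets counted over all plots combined
    n_plots       : number of plots searched
    plot_area_ha  : area of each plot in hectares (0.001 ha in the survey)
    days          : assumed deposition window in days (hypothetical value)
    """
    pellets_per_ha = total_pellets / (n_plots * plot_area_ha)
    return pellets_per_ha * 100 / days  # 100 ha per km^2

# E.g. 500 pellets over 50 plots of 0.001 ha, assuming a 10-day window:
density = pellets_per_km2_per_day(500, 50, days=10)
```

The standard error of such an index comes from the between-plot variation in counts, which is why the abstract reports the density as a mean ± s.e. percentage.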


Holzforschung ◽  
2008 ◽  
Vol 62 (1) ◽  
pp. 99-111 ◽  
Author(s):  
Luisa M.S. Borges ◽  
Simon M. Cragg ◽  
Julien Bergot ◽  
John R. Williams ◽  
Ben Shayler ◽  
...  

Abstract The marine borer Limnoria ingests essential wood components, including the extractives the wood contains. Some extractives may confer borer resistance on certain timbers. Feeding by Limnoria correlates with the rate of production of faecal pellets. The faecal pellet production rate and mortality on over 40 test timbers and non-resistant Pinus sylvestris sapwood were measured over 15 days. By placing animals in leachate from wood, and with wood in flowing seawater, the effects of leaching-resistant and water-soluble compounds were measured. Some previously untested timbers affected Limnoria as strongly as timbers reputed for durability in marine construction. Wood of Minquartia guianensis, Nectandra rubra and Bruguiera gymnorhiza caused high mortality, and pellet production on them was less than 10% of production on P. sylvestris. Suppressed feeding rates, but no heavy mortality, were observed on known durable species such as Chlorocardium rodiei, Dicorynia guianensis, Lophira alata and Nauclea trillesii, but also on Cynometra ananta, Distemonanthus benthamianus, Enterolobium schomburgkii, Goupia glabra, Hymenaea courbaril, Mammea africana, Shorea sp. and Sacoglottis guianensis. Leachate from B. gymnorhiza, G. glabra, H. courbaril, N. rubra and Shorea sp. caused high mortality. These short-term bioassays thus detected clear differences between wood species in their resistance to Limnoria that matched findings from long-term marine trials, while indicating new species worthy of detailed testing.


Problems arising when reinforced concrete structures are calculated from the diagram of concrete deformation under compression, as presented in both Russian and foreign regulatory documents on the design of concrete and reinforced concrete structures, are considered. The compliance of these diagrams for all classes of concrete remains very approximate; a particularly significant difference occurs when the Euronorms are used, owing to the different shapes and sizes of the test samples. At present there are no methodical recommendations for determining the ultimate relative deformations of concrete under axial compression or for constructing curvilinear deformation diagrams, which leads to limited experimental data and, as a result, prevents more detailed ultimate strain values from being entered into domestic standards. The results of experimental studies to determine the ultimate relative deformations of concrete under compression for different classes of concrete are presented; these allowed analytical dependences to be derived for evaluating the ultimate relative deformations and for describing curvilinear deformation diagrams. The article discusses various options for using the deformation model to assess the stress-strain state of a structure and concludes that it is necessary to use not only the final values of the ultimate deformations but also their intermediate values. This requires reliable "σ–ε" diagrams for all classes of concrete. The difficulties of measuring deformations in concrete at peak load, corresponding to the prismatic strength, and after the appearance of main cracks under long-term step loading are highlighted, and variants of more accurate measurements are proposed. The development and implementation of a new standard, GOST "Concretes. Methods for determination of complete diagrams", based on the developed method for obtaining complete diagrams of concrete deformation under compression, is necessary for evaluating the ultimate deformability of concrete under compression.


2020 ◽  
Vol 132 (5) ◽  
pp. 1405-1413 ◽  
Author(s):  
Michael D. Staudt ◽  
Holger Joswig ◽  
Gwynedd E. Pickett ◽  
Keith W. MacDougall ◽  
Andrew G. Parrent

OBJECTIVE The prevalence of trigeminal neuralgia (TN) in patients with multiple sclerosis (MS-TN) is higher than in the general population (idiopathic TN [ITN]). Glycerol rhizotomy (GR) is a percutaneous lesioning surgery commonly performed for the treatment of medically refractory TN. While treatment for acute pain relief is excellent, long-term pain relief is poorer. The object of this study was to assess the efficacy of percutaneous retrogasserian GR for the treatment of MS-TN versus ITN.

METHODS A retrospective chart review was performed, identifying 219 patients who had undergone 401 GR procedures from 1983 to 2018 at a single academic institution. All patients were diagnosed with medically refractory MS-TN (182 procedures) or ITN (219 procedures). The primary outcome measures of interest were immediate pain relief and time to pain recurrence following initial and repeat GR procedures. Secondary outcomes included medication usage and the presence of periprocedural hypesthesia.

RESULTS The initial pain-free response rate was similar between groups (p = 0.726): MS-TN initial GR 89.6%; MS-TN repeat GR 91.9%; ITN initial GR 89.6%; ITN repeat GR 87.0%. The median time to recurrence after initial GR was similar between MS-TN (2.7 ± 1.3 years) and ITN (2.1 ± 0.6 years) patients (p = 0.87). However, there was a statistically significant difference in the time to recurrence after repeat GR between MS-TN (2.3 ± 0.5 years) and ITN patients (1.2 ± 0.2 years; p < 0.05). The presence of periprocedural hypesthesia was highly predictive of pain-free survival (p < 0.01).

CONCLUSIONS Patients with MS-TN achieve meaningful pain relief following GR, with an efficacy comparable to that following GR in patients with ITN. Initial and subsequent GR procedures are equally efficacious.


2021 ◽  
Vol 4 (Supplement_1) ◽  
pp. 234-236
Author(s):  
P Willems ◽  
J Hercun ◽  
C Vincent ◽  
F Alvarez

Abstract

Background The natural history of primary sclerosing cholangitis (PSC) in children seems to differ from PSC in adults. However, studies on this matter have been limited by short follow-up periods and inconsistent classification of patients with autoimmune cholangitis (AIC) (or overlap syndrome). Consequently, it remains unclear whether long-term outcomes are affected by the clinical phenotype.

Aims The aims of this study are to describe the long-term evolution of PSC and AIC in a pediatric cohort with extension of follow-up into adulthood, and to evaluate the influence of phenotype on clinical outcomes.

Methods This is a retrospective study of patients with AIC or PSC followed at CHU-Sainte-Justine, a pediatric referral center in Montreal. All charts between January 1998 and December 2019 were reviewed. Patients were classified as either AIC (duct disease on cholangiography with histological features of autoimmune hepatitis) or PSC (large- or small-duct disease on cholangiography and/or histology). Extension of follow-up after the age of 18 was done for patients followed at the Centre hospitalier de l'Université de Montréal. Clinical features at diagnosis, response to treatment at one year and liver-related outcomes were compared.

Results 40 patients (27 PSC and 13 AIC) were followed for a median time of 71 months (range 2 to 347), with 52.5% followed into adulthood. 70% (28/40) had associated inflammatory bowel disease (IBD) (78% PSC vs 54% AIC; p = 0.15). A similar proportion of patients had biopsy-proven significant fibrosis at diagnosis (45% PSC vs 67% AIC; p = 0.23). Baseline liver tests were similar in both groups. At diagnosis, all patients were treated with ursodeoxycholic acid. Significantly more patients with AIC (77% AIC vs 30% PSC; p = 0.005) were initially treated with immunosuppressive drugs, without a significant difference in the use of anti-TNF agents (0% AIC vs 15% PSC; p = 0.12). At one year, 55% (15/27) of patients in the PSC group had normal liver tests versus only 15% (2/13) in the AIC group (p = 0.02). During follow-up, more liver-related events (cholangitis, liver transplant and cirrhosis) were reported in the AIC group (HR = 3.7 (95% CI: 1.4–10), p = 0.01). Abnormal liver tests at one year were a strong predictor of liver-related events during follow-up (HR = 8.9 (95% CI: 1.2–67.4), p = 0.03), while having IBD was not (HR = 0.48 (95% CI: 0.15–1.5), p = 0.22). 5 patients required liver transplantation, with no difference between the groups (8% AIC vs 15% PSC; p = 0.53).

Conclusions Pediatric patients with AIC and PSC show, at onset, a similar stage of liver disease with comparable clinical and biochemical characteristics. However, patients with AIC more often receive immunosuppressive therapy, and treatment response is less frequent. AIC is associated with more liver-related events, and abnormal liver tests at one year are a predictor of poor outcomes.

Funding Agencies None


2021 ◽  
Vol 108 (Supplement_2) ◽  
Author(s):  
A Fung ◽  
A Ward ◽  
K Patel ◽  
M Krkovic

Abstract

Introduction Infection is a major complication of open fractures. Antibiotic-impregnated calcium sulfate (AICS) beads are widely used as an adjuvant to systemic antibiotics. Whilst their efficacy in the secondary prevention of infection is established, we present the first retrospective study evaluating AICS beads in the primary prevention of infection in open fractures.

Method 214 open femur and tibia fractures in 207 patients were reviewed over a seven-year period. 148 fractures received only systemic antibiotic prophylaxis; 66 fractures also received AICS beads. The occurrence of acute infection (wound infection and acute osteomyelitis) was recorded, as well as that of long-term complications (chronic osteomyelitis, non-union and death).

Results Fractures that received AICS with systemic antibiotics had an overall acute infection rate of 42% (28/66), compared to 43% (63/148) in fractures that received only systemic antibiotics (p > 0.05). There was no significant difference in infection rate even when fractures were stratified by Gustilo-Anderson grade. There was also no significant difference in the rate of long-term complications.

Conclusions Our results indicate that the adjuvant use of AICS beads is not effective for the primary prevention of acute infection or long-term complications in open leg fractures. Further research is needed to elucidate the factors influencing the outcomes of AICS use.


2021 ◽  
Vol 7 (5) ◽  
pp. 369
Author(s):  
Joseph Cherabie ◽  
Patrick Mazi ◽  
Adriana M. Rauseo ◽  
Chapelle Ayres ◽  
Lindsey Larson ◽  
...  

Histoplasmosis is a common opportunistic infection in people with HIV (PWH); however, no study has examined factors associated with long-term mortality from histoplasmosis in PWH. We conducted a single-center retrospective study on the long-term mortality of PWH diagnosed with histoplasmosis between 2002 and 2017. Patients were categorized into three groups based on length of survival after diagnosis: early mortality (death < 90 days), late mortality (death ≥ 90 days), and long-term survivors. Patients diagnosed during or after 2008 were considered part of the modern antiretroviral therapy (ART) era. Insurance type (private vs. public) was used as a surrogate indicator of socioeconomic status. Among 54 PWH with histoplasmosis, overall mortality was 37%: 14.8% early mortality and 22.2% late mortality. There was no statistically significant difference in survival based on the availability of modern ART (p = 0.60). Insurance status reached statistical significance: 38% of long-term survivors had private insurance, versus only 8% in the late mortality group (p = 0.05). High mortality persists despite the advent of modern ART, implicating a contribution from social determinants of health, for which private insurance is a proxy. Larger studies are needed to elucidate the role of these factors in the mortality of PWH.


Cancers ◽  
2021 ◽  
Vol 13 (14) ◽  
pp. 3390
Author(s):  
Mats Enlund

Retrospective studies indicate that cancer survival may be affected by the anaesthetic technique. Propofol seems to be a better choice than volatile anaesthetics such as sevoflurane. The first two retrospective studies suggested better long-term survival with propofol, but not for breast cancer. Subsequent retrospective studies from Asia indicated the same. When data from seven Swedish hospitals were analysed, including 6305 breast cancer patients, different analyses gave different results, ranging from a non-significant difference in survival to a remarkably large difference in favour of propofol, an illustration of the innate weakness of the retrospective design. The largest randomised clinical trial registered on ClinicalTrials.gov with survival as an outcome is the Cancer and Anesthesia study, in which patients are randomised to propofol or sevoflurane. The inclusion of patients with breast cancer was completed in autumn 2017. Delayed by the pandemic, one-year survival data for the cohort were presented in November 2020. Because short-term survival for breast cancer is extremely good, one-year survival is of limited interest for this disease; however, as the inclusions took almost five years, there was also a trend to observe. Unsurprisingly, no difference was found in one-year survival between the two groups, and the trend indicated no difference either.

