Room Decontamination with UV Radiation

2010 ◽  
Vol 31 (10) ◽  
pp. 1025-1029 ◽  
Author(s):  
William A. Rutala ◽  
Maria F. Gergen ◽  
David J. Weber

Objective. To determine the effectiveness of a UV-C-emitting device to eliminate clinically important nosocomial pathogens in a contaminated hospital room. Methods. This study was carried out in a standard but empty hospital room (phase 1) and in a room previously occupied by a patient with methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant Enterococcus (VRE) infection (phase 2) in an acute care tertiary hospital in North Carolina from January 21 through September 21, 2009. During phase 1, 8 × 8 cm Formica sheets contaminated with approximately 10^4 to 10^5 organisms of MRSA, VRE, multidrug-resistant (MDR) Acinetobacter baumannii, or Clostridium difficile spores were placed in a hospital room, both in direct line of sight of the UV-C device and behind objects. After timed exposure, the presence of the microbes was assessed. During phase 2, specific sites in rooms that had housed patients with MRSA or VRE infection were sampled before and after UV-C irradiation. After timed exposure, the presence of MRSA and VRE and total colony counts were assessed. Results. In our test room, the effectiveness of UV-C radiation in reducing the counts of vegetative bacteria on surfaces was more than 99.9% within 15 minutes, and the reduction in C. difficile spores was 99.8% within 50 minutes. In rooms occupied by patients with MRSA, UV-C irradiation of approximately 15 minutes' duration resulted in a decrease in total CFUs per plate (mean, 384 CFUs vs 19 CFUs; P < .001), in the number of samples positive for MRSA (81 [20.3%] of 400 plates vs 2 [0.5%] of 400 plates; P < .001), and in MRSA counts per MRSA-positive plate (mean, 37 CFUs vs 2 CFUs; P < .001). Conclusions. This UV-C device was effective in eliminating vegetative bacteria on contaminated surfaces both in the line of sight and behind objects within approximately 15 minutes and in eliminating C. difficile spores within 50 minutes.
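Kill rates like those above can be expressed either as a percentage or as a log10 reduction; the two forms are interchangeable. A minimal sketch of the conversion (the CFU counts below are illustrative, not taken from the study):

```python
import math

def log10_reduction(cfu_before: float, cfu_after: float) -> float:
    """Log10 reduction in colony-forming units after treatment."""
    return math.log10(cfu_before / cfu_after)

def percent_reduction(cfu_before: float, cfu_after: float) -> float:
    """Percent reduction in colony-forming units after treatment."""
    return 100.0 * (1.0 - cfu_after / cfu_before)

# A >99.9% reduction is the same statement as a >3-log10 reduction:
print(round(log10_reduction(100_000, 100), 2))    # 3.0
print(round(percent_reduction(100_000, 100), 1))  # 99.9
```

This is why disinfection literature often quotes "3-log" and "99.9%" interchangeably.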

2018 ◽  
Vol 5 (suppl_1) ◽  
pp. S342-S343
Author(s):  
Hajime Kanamori ◽  
William Rutala ◽  
Maria Gergen ◽  
Emily Sickbert-Bennett ◽  
Deverick J Anderson ◽  
...  

Abstract Background Hospital room environmental surfaces can be contaminated with healthcare-associated pathogens even when terminal room cleaning/disinfection is performed. We examined the microbiological burden on hospital room environmental sites after standard or enhanced terminal room disinfection. Methods Microbial data from the Benefits of Enhanced Terminal Room Disinfection Study were utilized. All patient rooms were randomly assigned to standard disinfection (quaternary ammonium [Quat]) or an enhanced disinfection (Quat/ultraviolet light [UV-C], Bleach, or Bleach/UV-C). Microbiological samples were obtained using Rodac plates (25 cm2/plate) from 8 of 10 hospital room sites, including bed rail, over-bed table, supply/medicine cart, chair, side counter, linen hamper lid, sink, toilet seat, shower floor, and bathroom floor. The number of colony-forming units (CFU) of four target epidemiologically important pathogens (EIP), including multidrug-resistant Acinetobacter, Clostridium difficile, methicillin-resistant Staphylococcus aureus, and vancomycin-resistant enterococci, was counted. A total of 3,680 samples from 736 environmental sites in all 92 patient rooms (21 standard rooms and 71 enhanced rooms) were analyzed. Results Overall, the frequency of environmental sites positive for EIP was 11% (84/736) in all rooms, 21% (36/168) in standard rooms, and 8% (48/568) in enhanced rooms (P < 0.001) (Figure 1). Environmental sites in standard rooms, other than the toilet seat, were more frequently contaminated with EIP than those in enhanced rooms (P = 0.013 for over-bed table, P = 0.010 for bed rail, and P > 0.05 for each of the other sites). Mean CFU of EIP per room was 19.2 in all rooms, 60.8 in standard rooms, and 6.9 in enhanced rooms (P = 0.006) (Figure 2). All sites in standard rooms tended to have higher mean counts than in enhanced rooms (P = 0.001 for over-bed table, P = 0.001 for bed rail, P = 0.012 for side counter, and P > 0.05 for each of the other sites). Conclusion Our results demonstrate that enhanced terminal room disinfection reduced the microbial burden of healthcare-associated pathogens on environmental sites more effectively than standard room disinfection. Environmental hygiene of touchable surfaces after terminal room cleaning with Quat needs to be improved. Disclosures W. Rutala, PDI: Consultant and Speaker's Bureau, Consulting fee and Speaker honorarium. D. Weber, PDI: Consultant, Consulting fee.
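Contamination frequencies like 36/168 (standard rooms) versus 48/568 (enhanced rooms) can be compared with a two-proportion z-test. A standard-library sketch of that calculation, as a normal-approximation illustration rather than necessarily the exact test the authors used:

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    """z statistic and two-sided p-value for H0: p1 == p2,
    using the pooled-proportion normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

z, p = two_proportion_z(36, 168, 48, 568)  # 21% vs 8% EIP-positive sites
```

For these counts z is well above 4, so the p-value is far below 0.001, consistent with the reported comparison.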


2010 ◽  
Vol 31 (05) ◽  
pp. 491-497 ◽  
Author(s):  
Gonzalo Bearman ◽  
Adriana E. Rosato ◽  
Therese M. Duane ◽  
Kara Elam ◽  
Kakotan Sanogo ◽  
...  

Objective. To compare the efficacy of universal gloving with emollient-impregnated gloves with standard contact precautions for the control of multidrug-resistant organisms (MDROs) and to measure the effect on healthcare workers' (HCWs') hand skin health. Design. Prospective before-after trial. Setting. An 18-bed surgical intensive care unit. Methods. During phase 1 (September 2007 through March 2008), standard contact precautions were used. During phase 2 (March 2008 through September 2008), universal gloving with emollient-impregnated gloves was used, with no contact precautions. Patients were screened for vancomycin-resistant Enterococcus (VRE) and methicillin-resistant Staphylococcus aureus (MRSA). HCW hand hygiene compliance, hand skin health, and microbial contamination were assessed. The incidences of device-associated infection and Clostridium difficile infection (CDI) were determined. Results. The rate of compliance with contact precautions (phase 1) was 67%, and the rate of compliance with universal gloving (phase 2) was 78% (P = .01). Hand hygiene compliance was higher during phase 2 than during phase 1 (before patient care, 40% vs 35% of encounters; P = .001; after patient care, 63% vs 51% of encounters; P < .001). No difference was observed in MDRO acquisition. During phases 1 and 2, incidences of device-related infections, in number of infections per 1,000 device-days, were, respectively, 3.7 and 2.6 for bloodstream infection (P = .10), 8.9 and 7.8 for urinary tract infection (P = .10), and 1.0 and 1.1 for ventilator-associated pneumonia (P = .09). The CDI incidence in phase 1 and in phase 2 was, respectively, 2.0 and 1.4 cases per 1,000 patient-days (P = .53). During phase 1, 29% of HCW hand cultures were MRSA positive, compared with 13% during phase 2 (P = .17); during phase 1, 2% of hand cultures were VRE positive, compared with 0 during phase 2 (P = .16). Hand skin health improved during phase 2. Conclusions. Compared with contact precautions, universal gloving with emollient-impregnated gloves was associated with improved hand hygiene compliance and skin health. No statistically significant change in the rates of device-associated infection, CDI, or patient MDRO acquisition was observed. Universal gloving may be an alternative to contact precautions.


2013 ◽  
Vol 34 (5) ◽  
pp. 466-471 ◽  
Author(s):  
Deverick J. Anderson ◽  
Maria F. Gergen ◽  
Emily Smathers ◽  
Daniel J. Sexton ◽  
Luke F. Chen ◽  
...  

Objective. To determine the effectiveness of an automated ultraviolet-C (UV-C) emitter against vancomycin-resistant enterococci (VRE), Clostridium difficile, and Acinetobacter spp. in patient rooms. Design. Prospective cohort study. Setting. Two tertiary care hospitals. Participants. Convenience sample of 39 patient rooms from which a patient infected or colonized with 1 of the 3 targeted pathogens had been discharged. Intervention. Environmental sites were cultured before and after use of an automated UV-C-emitting device in targeted rooms but before standard terminal room disinfection by environmental services. Results. In total, 142 samples were obtained from 27 rooms of patients who were colonized or infected with VRE, 77 samples were obtained from 10 rooms of patients with C. difficile infection, and 10 samples were obtained from 2 rooms of patients with infections due to Acinetobacter. Use of an automated UV-C-emitting device led to a significant decrease in the total number of colony-forming units (CFUs) of any type of organism (1.07 log10 reduction; P < .0001), CFUs of target pathogens (1.35 log10 reduction; P < .0001), VRE CFUs (1.68 log10 reduction; P < .0001), and C. difficile CFUs (1.16 log10 reduction; P < .0001). CFUs of Acinetobacter also decreased (1.71 log10 reduction), but the trend was not statistically significant (P = .25). CFUs were reduced at all 9 of the environmental sites tested. Reductions similarly occurred in direct and indirect line of sight. Conclusions. Our data confirm that automated UV-C-emitting devices can decrease the bioburden of important pathogens in real-world settings such as hospital rooms.
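For intuition, a log10 reduction can be converted back to the fraction of organisms removed. A small sketch of that conversion (the mapping is plain arithmetic, independent of this study's raw data):

```python
def log10_to_percent_removed(log10_reduction: float) -> float:
    """Percent of organisms removed for a given log10 reduction."""
    return 100.0 * (1.0 - 10.0 ** (-log10_reduction))

# e.g. a 1.35-log10 reduction removes roughly 95.5% of organisms,
# and a 1.68-log10 reduction roughly 97.9%.
print(round(log10_to_percent_removed(1.35), 1))  # 95.5
print(round(log10_to_percent_removed(1.68), 1))  # 97.9
```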


2016 ◽  
Vol 69 (3) ◽  
Author(s):  
Heather Neville ◽  
Larry Broadfield ◽  
Claudia Harding ◽  
Shelley Heukshorst ◽  
Jennifer Sweetapple ◽  
...  

ABSTRACT

Background: Pharmacy technicians are expanding their scope of practice, often in partnership with pharmacists. In oncology, such a shift in responsibilities may lead to workflow efficiencies, but may also raise concerns about patient risk and medication errors.

Objectives: The primary objective was to compare the time spent on order entry and order-entry checking before and after training of a clinical support pharmacy technician (CSPT) to perform chemotherapy order entry. The secondary objectives were to document workflow interruptions and to assess medication errors.

Methods: This before-and-after observational study investigated chemotherapy order entry for ambulatory oncology patients. Order entry was performed by pharmacists before the process change (phase 1) and by 1 CSPT after the change (phase 2); order-entry checking was performed by a pharmacist during both phases. The tasks were timed by an independent observer using a personal digital assistant. A convenience sample of 125 orders was targeted for each phase. Data were exported to Microsoft Excel software, and timing differences for each task were tested with an unpaired t test.

Results: Totals of 143 and 128 individual orders were timed for order entry during phase 1 (pharmacist) and phase 2 (CSPT), respectively. The mean total time to perform order entry was greater during phase 1 (1:37 min versus 1:20 min; p = 0.044). Totals of 144 and 122 individual orders were timed for order-entry checking (by a pharmacist) in phases 1 and 2, respectively, and there was no difference in mean total time for order-entry checking (1:21 min versus 1:20 min; p = 0.69). There were 33 interruptions not related to order entry (totalling 39:38 min) during phase 1 and 25 interruptions (totalling 30:08 min) during phase 2. Three errors were observed during order entry in phase 1 and one error during order-entry checking in phase 2; the errors were rated as having no effect on patient care.

Conclusions: Chemotherapy order entry by a trained CSPT appeared to be just as safe and efficient as order entry by a pharmacist. Changes in pharmacy technicians' scope of practice could increase the amount of time available for pharmacists to provide direct patient care in the oncology setting.
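The unpaired t test used for the timing comparison can be sketched from summary statistics; Welch's unequal-variance form is shown below. The means come from the abstract (1:37 min = 97 s, 1:20 min = 80 s), but the standard deviations are hypothetical placeholders, since the abstract reports only means:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic and approximate degrees of freedom
    for two independent samples with unequal variances."""
    v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2
    t = (mean1 - mean2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# 97 s vs 80 s with assumed SDs of 70 s and 65 s (illustrative only)
t, df = welch_t(97, 70, 143, 80, 65, 128)
```

With these placeholder spreads, t is near 2, which is about where a two-sided p-value of 0.044 would fall for large df.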


2016 ◽  
Vol 1 (2) ◽  
Author(s):  
Felix Michael Duerr ◽  
Ana Luisa Bascuñán ◽  
Nina Kieves ◽  
Clara Goh ◽  
Juliette Hart ◽  
...  

Objective: To evaluate inter- and intra-observer variability and the influence of hair clipping and laser guidance on canine thigh circumference (TC) measurements amongst observers.

Background: Our goal was to further study the reliability of canine TC measurements as currently performed. For this purpose we designed a cadaveric model that allows for controlled inflation of the thigh, resembling an increase in muscle mass. We also investigated the impact of a novel technology (laser guidance) and of hair clipping on TC measurements in this model.

Evidentiary value: Phase 1, cadaveric study: five long-haired, large-breed canine cadavers; Phase 2, clinical study: eight clinically healthy Golden Retrievers. This study should impact clinical research and practice.

Methods: Phase 1: canine cadaveric thigh girth was manually expanded to three different levels using a custom, submuscular inflation system, before and after hair clipping. Phase 2: TC of Golden Retrievers was measured with and without laser guidance. TC measurements for both phases were performed by four observers in triplicate, resulting in a total of 552 measurements.

Results: Phase 1: TC measurements before and after hair clipping were significantly different (3.44 cm difference, p < 0.001). Overall inter-observer and intra-observer variability were 2.26 ± 1.18 cm and 0.90 ± 0.61 cm, respectively. Phase 2: laser guidance nominally improved inter-observer variability (3.34 ± 1.09 cm versus 4.78 ± 2.60 cm) but did not affect intra-observer variability (1.14 ± 0.66 cm versus 1.13 ± 0.77 cm).

Conclusion: TC measurement is a low-fidelity outcome measure with large inter- and intra-observer variability, even under controlled conditions in a cadaveric setting. Current methods of canine TC measurement may not produce a valid outcome measurement. If utilised, hair coat clipping status should be considered and an intra-observer variability of at least 1 cm should be assumed when comparing repeated TC measurements. Laser guidance may help to nominally reduce inter-observer variability in settings with multiple observers. Further investigation of alternative methods for canine TC measurement should be pursued.

Application: This information should be considered by everyone utilizing TC measurements as an outcome assessment for clinical or research purposes.


PLoS ONE ◽  
2020 ◽  
Vol 15 (11) ◽  
pp. e0241804
Author(s):  
Dong Eun Lee ◽  
Hyun Wook Ryoo ◽  
Sungbae Moon ◽  
Jeong Ho Park ◽  
Sang Do Shin

Improving outcomes after out-of-hospital cardiac arrest (OHCA) requires an integrated approach that strengthens the chain of survival and emergency care systems. This study aimed to identify the change in outcomes over a decade and the effect of a citywide intervention on good neurologic outcomes after OHCA in Daegu. This was a before-and-after intervention study examining the association between the citywide intervention to improve the chain of survival and outcomes after OHCA. The primary outcome was a good neurologic outcome, defined as a cerebral performance category score of 1 or 2. After dividing the study period into 3 phases according to the citywide intervention, trends in outcomes after OHCA by primary electrocardiogram rhythm were assessed. Logistic regression analysis was used to analyze the association between the phases and outcomes. Overall, 6,203 patients with OHCA were eligible. Over the 10 years (2008-2017), the rates of survival to discharge and of good neurologic outcomes increased from 2.6% to 8.7% and from 1.5% to 6.6%, respectively. These changes were especially pronounced in patients with an initial shockable rhythm (survival to discharge: 23.3% in 2008 to 55.0% in 2017; good neurologic outcomes: 13.3% to 46.0%). Compared with phase 1, the adjusted odds ratio (AOR) for good neurologic outcomes was 1.20 (95% confidence interval [CI]: 0.78-1.85) for phase 2 and 1.64 (95% CI: 1.09-2.46) for phase 3. For patients with an initial shockable rhythm, the AOR for good neurologic outcomes was 3.76 (95% CI: 1.88-7.52) for phase 2 and 5.51 (95% CI: 2.77-10.98) for phase 3. Citywide improvement was observed in good neurologic outcomes after OHCAs of medical origin, and the citywide intervention was significantly associated with better outcomes, particularly in patients with an initial shockable rhythm.
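An unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2x2 table as sketched below. The counts here are hypothetical, and the study's AORs additionally adjust for covariates via logistic regression:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = good outcome / other outcome in the later phase,
    c/d = good outcome / other outcome in the reference phase."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 20/80 good outcomes in one phase vs. 10/90 in the reference
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)  # OR = 2.25
```

When the CI excludes 1.0, the association is statistically significant at the 5% level, which is how the phase 3 AOR of 1.64 (1.09-2.46) reads as significant while the phase 2 AOR of 1.20 (0.78-1.85) does not.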


2020 ◽  
Author(s):  
Changju Liao ◽  
Linghong Guo ◽  
Han Wang ◽  
Tengyong Wang ◽  
Yuyang Zhang ◽  
...  

Abstract Background: Falls are a serious public health problem associated with irreversible health consequences and a substantial economic burden. To effectively reduce the incidence of falls and mitigate fall-related injuries, we designed and verified a multifactorial fall intervention model. Methods: The current study was a longitudinal before-and-after controlled investigation comprising 3 phases: clinical characteristics of fall patients were retrospectively identified in phase 1, and a multifactorial fall intervention model was designed in phase 2 and prospectively evaluated in phase 3. Phases 1 and 2 were based on 153,601 hospitalized patients between January 2015 and December 2016. Phase 3 was based on 171,776 hospitalized patients between January 2017 and December 2018. The Pearson chi-squared test was used to compare categorical variables, and the Mann-Whitney non-parametric test was used for one-way ordered data. Results: In phase 1, baseline characteristics of 491 fall patients revealed that inpatient falls were strongly associated with age, medication, and disease. In phase 2, a new multifactorial fall intervention model covering measures for fall prevention, fall-onset management, and continuous improvement was developed. Phase 3 recorded a total of 396 falls and demonstrated a markedly lower fall rate (reduction in falls by 0.09%, p < 0.001) and fall rate per 1,000 patient-days (reduction in falls per 1,000 patient-days by 0.07‰, p < 0.001) compared with phase 1. The adjusted incidence rate ratio of falls was 1.443 (95% CI: 1.263-1.647) (phase 1 vs. phase 3). Furthermore, the occurrence and severity of fall injuries in phase 3 were significantly lower than in phase 1 (Z = -4.426, p < 0.001). More specifically, uninjured falls accounted for 42.42% of falls in phase 3, compared with 32.99% in phase 1. Conclusions: This multifactorial fall intervention model had a favorable effect in reducing the occurrence of falls and fall injuries.
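The reported 0.09% reduction follows from simple fall rates per hospitalized patient; a sketch using the counts given in the abstract:

```python
def fall_rate_percent(falls: int, patients: int) -> float:
    """Falls as a percentage of hospitalized patients."""
    return 100.0 * falls / patients

phase1_rate = fall_rate_percent(491, 153_601)  # about 0.32%
phase3_rate = fall_rate_percent(396, 171_776)  # about 0.23%
reduction = phase1_rate - phase3_rate          # about 0.09 percentage points
```

The per-1,000-patient-days rate reported alongside it uses patient-days rather than patient counts as the denominator, which the abstract does not give.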


2020 ◽  
Vol 25 (Supplement_2) ◽  
pp. e3-e3
Author(s):  
Michael Chang ◽  
Alicia Fernandes ◽  
Alexandra Frankel

Abstract Background Patients undergoing procedures at hospitals may experience anxiety, and such anxiety can be heightened in pediatric populations. Anxiety can invoke a physical and mental stress response leading to poorer health outcomes; in children these outcomes include resistance to treatment, nightmares, longer recovery periods, lowered pain thresholds, and separation anxiety (Biddis, 2014; Manyande, 2015; Aydın, 2017). Objectives This study aimed to test whether a virtual reality intervention is feasible, beneficial, and effective in reducing anxiety prior to surgery in the pediatric population of Scarborough Health Network. Design/Methods Virtual reality (VR) is a computer technology that simulates a user's physical presence in a virtual or imaginary environment. 'Bubble Bloom', an underwater fishing game in which participants launch bubbles to catch colourful fish, is the VR game administered to the children in a two-phase research design exploring whether the VR intervention is a beneficial tool for reducing anxiety in our pediatric population. Phase 1 was a trial phase in which participants (n = 20) were administered the condensed version of the State Trait Anxiety Scale before and after the intervention to determine whether anxiety levels had been reduced. Participants were also administered an experience survey to explore patient satisfaction, headset comfort, and virtual reality satisfaction. Phase 2, a randomized controlled trial, is currently ongoing with the same measures and VR intervention. In Phase 2, participants are randomized to the control group (regular play activities) or the intervention group (virtual reality game). Results In Phase 1, all participants indicated they enjoyed the virtual reality intervention. Sixteen of the 20 participants (80%) had pre-intervention scores in the mild to moderate anxiety range. Of these 16 participants, 10 (63%) had post-intervention scores that decreased to the normal or no-anxiety range. Additionally, 80% of participants demonstrated a reduction in anxiety after the virtual reality intervention. Conclusion Phase 1 results were encouraging, with 80% of participants experiencing a reduction in anxiety and all participants enjoying the virtual reality experience.


2016 ◽  
Vol 54 (6) ◽  
pp. 1624-1630 ◽  
Author(s):  
Ruvandhi R. Nathavitharana ◽  
Doris Hillemann ◽  
Samuel G. Schumacher ◽  
Birte Schlueter ◽  
Nazir Ismail ◽  
...  

Less than 30% of multidrug-resistant tuberculosis (MDR-TB) patients are currently diagnosed, due to laboratory constraints. Molecular diagnostics enable rapid and simplified diagnosis. Newer-version line probe assays have not been evaluated against the WHO-endorsed Hain GenoType MTBDRplus (referred to as Hain version 1 [V1]) for the rapid detection of rifampin (RIF) and isoniazid (INH) resistance. A two-phase noninferiority study was conducted in two supranational reference laboratories to allow head-to-head comparisons of two new tests, Hain GenoType MTBDRplus version 2 (referred to as Hain version 2 [V2]) and the Nipro NTM+MDRTB detection kit 2 (referred to as Nipro), to Hain V1. In phase 1, the results for 379 test strains were compared to a composite reference standard that used phenotypic drug susceptibility testing (DST) and targeted sequencing. In phase 2, the results for 644 sputum samples were compared to a phenotypic DST reference standard alone. Using a challenging set of strains in phase 1, the values for sensitivity and specificity for Hain V1, Hain V2, and Nipro, respectively, were 90.3%/98.5%, 90.3%/98.5%, and 92.0%/98.5% for RIF resistance detection and 89.1%/99.4%, 89.1%/99.4%, and 89.6%/100.0% for INH resistance detection. Testing of sputa in phase 2 yielded values for sensitivity and specificity of 97.1%/97.1%, 98.2%/97.8%, and 96.5%/97.5% for RIF and 94.4%/96.4%, 95.4%/98.8%, and 94.9%/97.6% for INH. Overall, the rates of indeterminate results were low, but the rate was higher with Nipro than with Hain V1 and V2 in samples with low smear grades. Noninferiority of Hain V2 and Nipro to Hain V1 was demonstrated for RIF and INH resistance detection in isolates and sputum specimens. These results serve as evidence for WHO policy recommendations on the use of line probe assays, including the Hain V2 and Nipro assays, for MDR-TB detection.
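The sensitivity/specificity pairs quoted above derive from standard confusion-matrix counts against the reference standard. A minimal sketch (the counts below are illustrative, not the study's raw data):

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of truly resistant samples the assay flags as resistant."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of truly susceptible samples the assay calls susceptible."""
    return true_neg / (true_neg + false_pos)

# Illustrative: 90 of 93 resistant strains detected; 197 of 200 susceptible called correctly
print(round(100 * sensitivity(90, 3), 1))   # 96.8
print(round(100 * specificity(197, 3), 1))  # 98.5
```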


2010 ◽  
Vol 108 (5) ◽  
pp. 1321-1335 ◽  
Author(s):  
J. M. Bonis ◽  
S. E. Neumueller ◽  
K. L. Krause ◽  
T. Kiner ◽  
A. Smith ◽  
...  

The objective of the present study was to test the hypothesis that, in the in vivo awake goat model, perturbation/lesion of the pontine respiratory group (PRG) would decrease the sensitivity to hypercapnia and hypoxia. The study reported herein was part of two larger studies in which cholinergic modulation in the PRG was attenuated by microdialysis of atropine, and the PRG was subsequently lesioned neurotoxically with ibotenic acid injections. In 14 goats, cannulae were bilaterally implanted into either the lateral (n = 4) or medial (n = 4) parabrachial nuclei or the Kölliker-Fuse nucleus (KFN, n = 6). Before and after cannula implantation, microdialysis of atropine, and injection of ibotenic acid, hypercapnic and hypoxic ventilatory sensitivities were assessed. Hypercapnic sensitivity was assessed by three 5-min periods at 3, 5, and 7% inspired CO2. In all groups of goats, CO2 sensitivity was unaffected (P > 0.05) by any PRG perturbation/lesion. Hypoxic sensitivity was assessed with a 30-min period at 10.8% inspired O2. The response to hypoxia was typically triphasic, with a phase 1 increase in pulmonary ventilation, a phase 2 roll-off, and a phase 3 prolonged increase associated with shivering and increased metabolic rate and body temperature. In all groups of goats, phase 1 of the hypoxic ventilatory response was unaffected by any PRG perturbation/lesion, and there were no consistent effects on the phase 2 responses. However, in the KFN group of goats, the phase 3 ventilatory, shivering, metabolic rate, and temperature responses were markedly attenuated after the atropine dialysis studies, and the attenuation persisted after the ibotenic acid studies. These findings support an integrative or modulatory role for the KFN in the phase 3 responses to hypoxia.

