Implementation of Two-Step Clostridioides difficile Testing Algorithm and Management of Possible Carriers

2020 ◽  
Vol 41 (S1) ◽  
pp. s269-s270
Author(s):  
J. Daniel Markley ◽  
Daniel Tassone ◽  
Melanie Christian ◽  
Leroy Vaughan ◽  
Michael P. Stevens ◽  
...  

Background: Updated IDSA-SHEA guidelines recommend different diagnostic approaches to C. difficile depending on whether there are pre-agreed institutional criteria for patient stool submission. If stool submission criteria are in place, nucleic acid amplification testing (NAAT) alone may be used. If not, a multistep algorithm is suggested, incorporating various combinations of toxin enzyme immunoassay (EIA), glutamate dehydrogenase (GDH), and NAAT, with discordant results adjudicated by NAAT. At our institution, we developed a multistep algorithm leading with NAAT, with reflex to EIA toxin testing if the NAAT is positive. This algorithm resulted in a significant proportion of patients with discordant results (NAAT positive and toxin EIA negative), whom some experts have categorized as possible carriers or C. difficile colonized. In this study, we describe the impact of a multistep algorithm on hospital-onset, community-onset, and healthcare-facility–associated C. difficile infection (HO-CDI, CO-CDI, and HCFA-CDI, respectively) rates and on the management of possible carriers. Methods: The study setting was a 399-bed, tertiary-care VA Medical Center in Richmond, Virginia. A retrospective chart review was conducted. The multistep C. difficile testing algorithm was implemented June 4, 2019 (Fig. 1). C. difficile testing results and possible carriers were reviewed for the 5 months before and 4 months after implementation (January 2019 to September 2019). Results: In total, 587 NAATs were performed in the inpatient and outpatient settings (mean, 58.7 per month). Overall, 123 NAATs (21%) were positive: 59 in the preintervention period and 63 in the postintervention period. In the postintervention period, 23 positive NAATs (26%) had a positive toxin EIA. Based on LabID events, the mean rate of HO+CO+HCFA CDI cases per 10,000 bed days of care (BDOC) decreased significantly, from 9.49 in the preintervention period to 1.15 in the postintervention period (P = .019) (Fig. 2).
Also, 9 of the possible carriers (22%) were treated for CDI based on high clinical suspicion, and 6 of the possible carriers (14%) had a previous history of CDI; of these, 5 (83%) were treated for CDI. In addition, 1 patient (2%) converted from possible carrier to positive toxin EIA within 14 days. The infectious diseases team was consulted for 11 possible carriers (27%). Conclusions: Implementation of a 2-step C. difficile algorithm leading with NAAT was associated with a lower rate of HO+CO+HCFA CDI per 10,000 BDOC. A considerable proportion (22%) of possible carriers were treated for CDI but did not count as LabID events. Only 2% of the possible carriers in our study converted to a positive toxin EIA. Funding: None. Disclosures: None.
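The LabID rates above are expressed per 10,000 bed days of care. As a minimal arithmetic sketch of that conversion (the case and bed-day counts below are hypothetical, since the abstract reports only the final rates):

```python
def cdi_rate_per_10k_bdoc(cases: int, bed_days: int) -> float:
    """Rate of CDI LabID events per 10,000 bed days of care (BDOC)."""
    return cases / bed_days * 10_000

# Hypothetical counts for illustration only (not reported in the abstract):
# 19 LabID events over 20,000 bed days -> 9.5 per 10,000 BDOC.
print(cdi_rate_per_10k_bdoc(19, 20_000))  # 9.5
```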

2020 ◽  
Vol 41 (S1) ◽  
pp. s84-s84
Author(s):  
Lorinda Sheeler ◽  
Mary Kukla ◽  
Oluchi Abosi ◽  
Holly Meacham ◽  
Stephanie Holley ◽  
...  

Background: In December of 2019, the World Health Organization reported a novel coronavirus (severe acute respiratory syndrome coronavirus 2 [SARS-CoV-2]) causing severe respiratory illness originating in Wuhan, China. Since then, an increasing number of cases and the confirmation of human-to-human transmission have led to the need to develop a communication campaign at our institution. We describe the impact of the communication campaign on the number of calls received and describe patterns of calls during the early stages of our response to this emerging infection. Methods: The University of Iowa Hospitals & Clinics is an 811-bed academic medical center with >200 outpatient clinics. In response to the coronavirus disease 2019 (COVID-19) outbreak, we launched a communications campaign on January 17, 2020. Initial communications included email updates to staff and a dedicated COVID-19 webpage with up-to-date information. Subsequently, we developed an electronic screening tool to guide a risk assessment during patient check-in. The screening tool identifies travel to China in the past 14 days and the presence of symptoms, defined as fever >37.7°C plus cough or difficulty breathing. The screening tool was activated on January 24, 2020. In addition, university staff contacted each student whose primary residence record included Hubei Province, China. Students were provided with medical contact information, signs and symptoms to monitor for, and a thermometer. Results: During the first 5 days of the campaign, 3 calls were related to COVID-19. The number of calls increased to 18 in the 5 days following implementation of the electronic screening tool.
Of the 21 calls received to date, 8 (38%) were generated by the electronic travel screen, 4 (19%) were due to a positive coronavirus result on a multiplex respiratory panel, 4 (19%) were related to provider assessment only (without an electronic screening trigger), and 2 (10%) sought additional information after viewing the web-based communication campaign. Moreover, 3 calls (14%) concerned people without travel history but with respiratory symptoms and contact with a person who had recently traveled to China. Among those reporting symptoms after travel to China, the mean time since arrival in the United States was 2.7 days (range, 0–11 days). Conclusion: The COVID-19 outbreak is evolving, and providing up-to-date information is challenging. Implementing an electronic screening tool helped providers assess patients and direct questions to infection prevention professionals. Analyzing the types of calls received helped tailor messaging to frontline staff. Funding: None. Disclosures: None.
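The screening criteria described above (travel to China within 14 days; fever >37.7°C plus cough or difficulty breathing) can be sketched as a simple rule. The function and parameter names are hypothetical, and combining the travel and symptom criteria with AND is an assumption; the abstract does not state exactly how the tool triggered:

```python
def flags_covid_screen(days_since_china_travel, temp_c, cough, dyspnea):
    """Hedged sketch of the check-in risk screen described in the abstract.

    Flags a patient who reports travel to China in the past 14 days AND
    has fever (>37.7 C) plus cough or difficulty breathing.
    """
    recent_travel = (days_since_china_travel is not None
                     and days_since_china_travel <= 14)
    symptomatic = temp_c > 37.7 and (cough or dyspnea)
    return recent_travel and symptomatic

print(flags_covid_screen(5, 38.2, cough=True, dyspnea=False))   # True
print(flags_covid_screen(20, 38.2, cough=True, dyspnea=False))  # False: travel too remote
```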


Author(s):  
Ghamar Bitar ◽  
Anthony Sciscione

Objective Despite lack of evidence to support its efficacy, activity restriction is one of the most commonly prescribed interventions for the prevention of preterm birth. We have a departmental policy against the use of activity restriction, but many practitioners still prescribe it in an effort to prevent preterm birth. We sought to evaluate the rate at which women are prescribed activity restriction during pregnancy to prevent preterm birth, and their compliance with it. Study Design This was a single-site retrospective questionnaire study at a tertiary-care, academically affiliated medical center. Women with a history of preterm delivery or short cervix were included. Once patients were identified, each was contacted and administered a questionnaire. We assessed the rates of activity restriction prescription and compliance. Secondary outcomes included details regarding activity restriction and treatment in pregnancy. Continuous variables were compared with the t-test and categorical variables with the chi-square test; p < 0.05 was considered significant. Results Among the 52 women who responded to the questionnaire, 18 reported being placed on activity restriction by a physician and 1 self-prescribed activity restriction, giving a rate for our primary outcome of 19 of 52 (36.5%). All women reported compliance with prescribed activity restriction (100%). Gestational age at delivery was not different in women placed on activity restriction. Conclusion This questionnaire suggests that approximately one in three high-risk women were placed on activity restriction during pregnancy despite a departmental policy against its use. The 100% compliance rate among patients placed on activity restriction is a strong reminder of the impact physicians' prescribing patterns can have on patients.


Stroke ◽  
2021 ◽  
Vol 52 (Suppl_1) ◽  
Author(s):  
Jeffrey Quinn ◽  
Mohammad Hajighasemi ◽  
Laurie Paletz ◽  
Sonia Figueroa ◽  
Konrad Schlick

Introduction: Recrudescence of symptoms from remote central nervous system lesions (primarily prior ischemic or hemorrhagic stroke) is a specific stroke mimic that is commonly in the differential diagnosis for patients presenting for emergent stroke evaluation. To date, best practices for ensuring accurate diagnosis have yet to be established, and the relative rates of causative systemic illnesses are not well described. We seek to better delineate the etiologies of recrudescent stroke symptoms seen during emergency stroke evaluations ("Code Brain" [CB]) at a tertiary-care medical center, as a first step toward clarifying diagnostic criteria for this entity. Methods: Data were obtained via retrospective chart review of consecutive patients from a departmental database listing all CB consults seen at a tertiary-care comprehensive stroke center in Los Angeles, California, between January 2018 and June 2020. Diagnoses for each case were adjudicated by faculty vascular neurologists in collaboration with vascular neurology fellows and neurology residents. Cases with a diagnosis of stroke recrudescence were reviewed in detail for the extent of neuroimaging performed and for identified causes of recrudescence. Results: Records of 3,998 consecutive CB activations were reviewed; 2.1% (n=85) were found after screening to have a clinical diagnosis of recrudescence of a chronic stroke. Of these 85 patients, 29.4% (n=25) had no identified causative etiology for the recrudescent neurologic deficit. Of these 25 patients, 36.0% (n=9) did not undergo MRI to evaluate for an interval ischemic lesion, compared with 46.6% of those in whom a causative etiology was identified. This difference (10.6%; 95% CI, -12.30% to 30.67%; p=0.3719) was not significant. Discussion: At our comprehensive stroke center, recrudescent stroke is an uncommon diagnosis among CB evaluations, despite being commonly considered.
Even with a diagnosis of recrudescence, brain MRI is not always performed to rule out acute ischemic stroke. Standardized neuroimaging protocols should be considered in making the diagnosis of stroke recrudescence.
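The 10.6% difference with its wide confidence interval is a comparison of two proportions. A minimal unpooled Wald sketch follows; the 9/25 count comes from the abstract, while 28/60 is back-calculated from the reported 46.6% (an assumption), and a Wald interval will not exactly reproduce the reported bounds, which were presumably computed by a different method:

```python
import math

def diff_ci(x1, n1, x2, n2, z=1.96):
    """Unpooled Wald 95% CI for the difference of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# 28/60 (etiology identified, no MRI; back-calculated) vs 9/25 (no etiology, no MRI).
diff, lo, hi = diff_ci(28, 60, 9, 25)
print(f"{diff:.1%} ({lo:.1%} to {hi:.1%})")  # difference ~10.7%, CI spanning zero
```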


2006 ◽  
Vol 27 (9) ◽  
pp. 893-900 ◽  
Author(s):  
Ebbing Lautenbach ◽  
Mark G. Weiner ◽  
Irving Nachamkin ◽  
Warren B. Bilker ◽  
Angela Sheridan ◽  
...  

Objectives. To identify risk factors for infection with imipenem-resistant Pseudomonas aeruginosa and to determine the impact of imipenem resistance on clinical and economic outcomes among patients infected with P. aeruginosa. Design. An ecologic study, a case-control study, and a retrospective cohort study. Setting. A 625-bed tertiary-care medical center. Patients. All patients who had an inpatient clinical culture positive for P. aeruginosa between January 1, 1999, and December 31, 2000. Results. From 1991 through 2000, the annual prevalence of imipenem resistance among P. aeruginosa isolates increased significantly (P < .001 by the χ2 test for trend). Among 879 patients infected with P. aeruginosa during 1999-2000, a total of 142 had imipenem-resistant P. aeruginosa infection (the case group), whereas 737 had imipenem-susceptible P. aeruginosa infection (the control group). The only independent risk factor for imipenem-resistant P. aeruginosa infection was prior fluoroquinolone use (adjusted odds ratio, 2.52 [95% confidence interval {CI}, 1.61-3.92]; P < .001). Compared with patients infected with imipenem-susceptible P. aeruginosa, patients infected with imipenem-resistant P. aeruginosa had longer subsequent hospitalizations (15.5 vs 9 days; P = .02) and greater hospital costs ($81,330 vs $48,381; P < .001). The mortality rate among patients infected with imipenem-resistant P. aeruginosa was 31.1%, compared with 16.7% for patients infected with imipenem-susceptible P. aeruginosa (relative risk, 1.86 [95% CI, 1.38-2.51]; P < .001). In multivariable analyses, there remained an independent association between infection with imipenem-resistant P. aeruginosa and mortality. Conclusions. The prevalence of imipenem resistance among P. aeruginosa strains has increased markedly in recent years and has had a significant impact on both clinical and economic outcomes.
Our results suggest that curtailing use of other antibiotics (particularly fluoroquinolones) may be important in attempts to curb further emergence of imipenem resistance.
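The mortality comparison above (31.1% vs 16.7%; RR 1.86) can be approximately reproduced with a standard log-scale relative-risk interval. The event counts below are back-calculated from the reported percentages and group sizes, not taken from the paper, so the reconstructed CI matches the reported one only approximately:

```python
import math

def relative_risk(a, n1, c, n2, z=1.96):
    """Relative risk with a 95% CI on the log scale (standard Katz method)."""
    rr = (a / n1) / (c / n2)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = rr * math.exp(-z * se_log)
    hi = rr * math.exp(z * se_log)
    return rr, lo, hi

# Back-calculated: ~44/142 deaths (imipenem-resistant) vs ~123/737 (susceptible).
rr, lo, hi = relative_risk(44, 142, 123, 737)
print(f"RR {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")  # close to the reported 1.86 (1.38-2.51)
```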


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S806-S806
Author(s):  
Susan Nichols ◽  
Michelle D Jordan ◽  
Michael Coogan ◽  
Jackie Opera ◽  
Paul P Cook

Abstract Background Previous data at our facility indicated that 37% of patients with Clostridium difficile infection (CDI) were receiving at least one laxative at the time of testing, suggesting the possibility of false-positive results. Nucleic acid amplification testing (NAAT) does not distinguish between colonization and infection with C. difficile. We implemented two interventions to address these issues and evaluated our rates of nosocomial CDI before and after these changes. Methods This was a retrospective study of all positive test results for adult patients with nosocomial C. difficile from October 1, 2017, through March 31, 2019, at Vidant Medical Center, a 911-bed hospital. In June 2018, we implemented a best-practice advisory (BPA) in our electronic health record to recommend against testing for CDI in patients receiving laxatives. We reviewed the number of C. difficile tests ordered before and after initiating the BPA. In December 2018, we removed NAAT and replaced it with a cell cytotoxicity assay (CCA) for specimens that were enzyme immunoassay (EIA) negative and glutamate dehydrogenase (GDH) positive. Antimicrobial use was measured in days of therapy (DOT) per 10,000 patient-days (PD). The Mann–Whitney U test was used for continuous variables. Linear regression was used to monitor antimicrobial use. Results The number of C. difficile tests ordered per month decreased 19.5% after implementing the BPA (P < 0.0001). There was a 44% reduction in the number of EIA+/GDH+ specimens per month after the BPA intervention (P = 0.003). Following substitution of CCA for NAAT for EIA-/GDH+ specimens, there was a 61% reduction in the rate of nosocomial CDI (8.6 cases/10,000 PD to 3.3 cases/10,000 PD; P = 0.005). Total antimicrobial use was unchanged over the course of the study (673 to 677 DOT/10,000 PD). Carbapenem use decreased 56% (P = 0.009); cefepime use increased 85% (P = 0.002); quinolone and clindamycin use were unchanged.
Conclusion Laxative use in hospitalized patients is common and likely contributes to a false elevation in the CDI rate by identifying carriers in addition to those with true infection. Implementing a BPA to reduce inappropriate testing and changing our testing algorithm for Clostridium difficile by substituting CCA for NAAT have resulted in a lower rate of nosocomial CDI. Disclosures All authors: No reported disclosures.


2019 ◽  
Vol 40 (3) ◽  
pp. 281-286 ◽  
Author(s):  
Satish Munigala ◽  
Rebecca Rojek ◽  
Helen Wood ◽  
Melanie L. Yarbrough ◽  
Ronald R. Jackups ◽  
...  

Abstract Objective: To evaluate the impact of changes to urine-testing orderables in a computerized physician order entry (CPOE) system on urine culturing practices. Design: Retrospective before-and-after study. Setting: A 1,250-bed academic tertiary-care referral center. Patients: Hospitalized adults who had ≥1 urine culture performed during their stay. Intervention: The intervention (implemented in April 2016) consisted of notifications to providers, changes to order sets, and inclusion of the new urine culture reflex tests in commonly used order sets. We compared urine culture rates before the intervention (January 2015 to April 2016) and after the intervention (May 2016 to August 2017), adjusting for temporal trends. Results: During the study period, 18,954 inpatients (median age, 62 years; 68.8% white and 52.3% female) had 24,569 urine cultures ordered. Overall, 6,662 urine cultures (27%) were positive. The urine culturing rate decreased significantly in the postintervention period for any specimen type (38.1 per 1,000 patient days preintervention vs 20.9 per 1,000 patient days postintervention; P < .001), clean catch (30.0 vs 18.7; P < .001), and catheterized urine (7.8 vs 1.9; P < .001). Using an interrupted time series model, urine culture rates decreased for all specimen types (P < .05). Conclusions: Our intervention of changes to order sets and inclusion of the new urine culture reflex tests resulted in a 45% reduction in urine cultures ordered. CPOE system format plays a vital role in reducing the burden of unnecessary urine cultures and should be implemented in combination with other efforts.
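The 45% figure in the conclusion follows directly from the reported before/after rates for any specimen type; a quick arithmetic check:

```python
def percent_reduction(before: float, after: float) -> float:
    """Percent reduction from a baseline rate to a follow-up rate."""
    return (before - after) / before * 100

# Any-specimen urine culture rate per 1,000 patient days, pre vs post intervention.
print(round(percent_reduction(38.1, 20.9)))  # 45
```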


2020 ◽  
Vol 42 (1) ◽  
pp. 37-42
Author(s):  
Yana Shpunt ◽  
Inna Estrin ◽  
Yossef Levi ◽  
Hodaya Saadon ◽  
Galit Ben-Yossef ◽  
...  

Abstract Objective: Administration of antimicrobials to patients with asymptomatic bacteriuria (ASB) is a common error that can lead to worse outcomes; however, controlled analyses quantifying how common this practice is and what impact it has are lacking. We analyzed the independent predictors of antimicrobial misuse in ASB and quantified the impact of this practice on clinical outcomes. Design: Retrospective case-control and cohort analyses for calendar year 2017. Setting: Tertiary-care, university-affiliated medical center. Patients: The study included adult (>18 years) patients with a positive urine culture. Pregnant women, renal transplant recipients, and patients who underwent urologic procedures were excluded. Methods: ASB was determined according to US Centers for Disease Control and Prevention (CDC) criteria. Multivariable logistic regression models were constructed to analyze predictors and outcomes associated with antimicrobial use in patients with ASB. Results: The study included 1,530 patient-unique positive urine cultures. Among these patients, 610 (40%) were determined to have ASB. Of the 696 isolates, 219 (36%) were multidrug-resistant organisms (MDROs). Also, 178 patients (29%) received antimicrobials specifically for the ASB. Independent predictors of improper administration of antimicrobials were dependent functional status (adjusted odds ratio [aOR], 2.3; 95% CI, 1.4–3.6) and male sex (aOR, 2.0; 95% CI, 1.25–2.6). Use of antimicrobials was independently associated with rehospitalization (aOR, 1.7; 95% CI, 1.1–2.6) and with acute Clostridioides difficile infection (CDI) in the following 90 days (aOR, 4.5; 95% CI, 2.0–10.6). Conclusions: ASB is a common condition, frequently caused by an MDRO. Male sex and poor functional status were independent predictors of mistreatment, and this practice was independently associated with rehospitalization and CDI in the following 90 days.


2002 ◽  
Vol 23 (5) ◽  
pp. 254-260 ◽  
Author(s):  
Gregory Bisson ◽  
Neil O. Fishman ◽  
Jean Baldus Patel ◽  
Paul H. Edelstein ◽  
Ebbing Lautenbach

Objective: The incidence of extended-spectrum β-lactamase (ESβL)–mediated resistance has increased markedly during the past decade. Risk factors for colonization with ESβL-producing Escherichia coli and Klebsiella species (ESβL-EK) remain unclear, as do methods to control their further emergence. Design: Case–control study. Setting: Two hospitals within a large academic health system: a 725-bed academic tertiary-care medical center and a 344-bed urban community hospital. Patients: Thirteen patients with ESβL-EK fecal colonization were compared with 46 randomly selected noncolonized controls. Results: Duration of hospitalization was the only independent risk factor for ESβL-EK colonization (odds ratio, 1.11; 95% confidence interval, 1.02 to 1.21). Of note, 8 (62%) of the patients had been admitted from another healthcare facility. In addition, there was evidence for dissemination of a single K. oxytoca clone. Finally, the prevalence of ESβL-EK colonization decreased from 7.9% to 5.7% following restriction of third-generation cephalosporins (P = .51). Conclusions: ESβL-EK colonization was associated only with duration of hospitalization, and there was no significant reduction following antimicrobial formulary interventions. The evidence for nosocomial spread and the high percentage of patients with ESβL-EK admitted from other sites suggest that greater emphasis must be placed on controlling the spread of such organisms within and between institutions.


2018 ◽  
Vol 54 (2) ◽  
pp. 119-124
Author(s):  
Melissa Heim ◽  
Ryan Draheim ◽  
Anna Krupp ◽  
Paula Breihan ◽  
Ann O’Rourke ◽  
...  

Background: A multidisciplinary team updated an institution-specific pain, agitation, and delirium (PAD) guideline based on recommendations from the Society of Critical Care Medicine (SCCM) PAD guidelines. The institution-specific guideline emphasized protocolized sedation with increased as-needed boluses, nonbenzodiazepine infusions, daily sedation interruption, and pairing of spontaneous awakening trials (SATs) with spontaneous breathing trials (SBTs). Objective: The purpose of this study was to evaluate the impact of implementing the PAD guideline on clinical outcomes and medication utilization in an academic medical center intensive care unit (ICU). It was hypothesized that implementation of the updated guideline would improve clinical outcomes and decrease use of benzodiazepine infusions. Methods: Pre-post retrospective chart review of 2417 (1147 pre, 1270 post) critically ill, mechanically ventilated adults in a medical/surgical ICU over a 2-year period (1 year before and 1 year after guideline implementation). Results: After guideline implementation, average ventilator days were reduced (3.98 vs 3.43 days, P = .0021), as were ICU and hospital length of stay (LOS) (4.79 vs 4.34 days, P = .048, and 13.96 vs 12.97 days, P = .045, respectively). Hospital mortality (19% vs 19%, P = .96) and Acute Physiology and Chronic Health Evaluation (APACHE) IV scores (77.28 vs 78.75, P = .27) were similar. After guideline implementation, the percentage of patients receiving midazolam infusions decreased (422/1147 [37%] vs 363/1270 [29%], P = .0001), while the percentages receiving continuous-infusion propofol (679/1147 [59%] vs 896/1270 [70%], P = .0001) and dexmedetomidine (78/1147 [7%] vs 147/1270 [12%], P = .0001) increased. Conclusions: Implementing a multidisciplinary PAD guideline utilizing protocolized sedation and daily sedation interruption decreased ventilator days and ICU and hospital LOS while decreasing midazolam infusion use.
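The midazolam comparison (422/1147 vs 363/1270, P = .0001) is a two-proportion comparison. The abstract does not state which test the authors used, so the stdlib-only pooled z-test below is an assumption; it yields a result consistent with the reported P value:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
    return z, p

# Midazolam infusions: 422/1147 pre vs 363/1270 post guideline implementation.
z, p = two_proportion_z(422, 1147, 363, 1270)
print(f"z = {z:.2f}, p = {p:.1e}")  # p well below .001, consistent with the abstract
```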


2008 ◽  
Vol 139 (2_suppl) ◽  
pp. P148-P148
Author(s):  
John P Leonetti ◽  
Sam Marzo ◽  
Ryan G Porter

Objectives To present the clinical and radiographic findings in 8 patients with unusual primary tumors of the temporomandibular joint (TMJ). Methods This was a retrospective chart review of all patients with neurotologic manifestations caused by primary TMJ tumors seen at our tertiary-care academic medical center between July 1988 and July 2007. Results Eight patients were identified with primary TMJ tumors causing a variety of neurotologic manifestations, including trismus, otalgia, tinnitus, hearing loss, aural fullness, headache, facial pain, and otorrhea. Tumor histology included chondroma, chondroblastoma, osteoma, giant cell tumor, synovial chondromatosis, and osteosarcoma. Anterior external auditory canal involvement required partial petrosectomy in all 8 patients. Despite aggressive surgical resection and radiotherapy, only 5 of 8 patients are disease-free, with a mean follow-up of 7.2 years. Conclusions Primary tumors of the TMJ, although rare, can cause a variety of common neurotologic manifestations. A complete physical examination with appropriate radiographic assessment will guide the proper treatment plan. Earlier diagnosis may lead to improved overall control rates.

