Implementation of the Targeted Assessment for Prevention Strategy in a healthcare system to reduce Clostridioides difficile infection rates

2020
Vol 41 (3)
pp. 295-301
Author(s):
Katelyn A. White
Minn M. Soe
Amy Osborn
Christie Walling
Lucy V. Fike
...  

Abstract Background: Prevention of Clostridioides difficile infection (CDI) is a national priority and may be facilitated by deployment of the Targeted Assessment for Prevention (TAP) Strategy, a quality improvement framework providing a focused approach to infection prevention. This article describes the process and outcomes of TAP Strategy implementation for CDI prevention in a healthcare system. Methods: Hospital A was identified based on CDI surveillance data indicating an excess burden of infections above the national goal; hospitals B and C participated as part of systemwide deployment. TAP facility assessments were administered to staff to identify infection control gaps and inform CDI prevention interventions. Retrospective analysis was performed using negative-binomial, interrupted time series (ITS) regression to assess the overall effect of targeted CDI prevention efforts. The analysis included hospital-onset, laboratory-identified C. difficile event data for 18 months before and after implementation of the TAP facility assessments. Results: The systemwide monthly CDI rate significantly decreased at the intervention (β2, −44%; P = .017), and the postintervention CDI rate trend showed a sustained decrease (β1 + β3, −12% per month; P = .008). At the individual hospital level, the CDI rate trend significantly decreased in the postintervention period at hospital A only (β1 + β3, −26% per month; P = .003). Conclusions: This project demonstrates TAP Strategy implementation in a healthcare system, yielding a significant decrease in the laboratory-identified C. difficile rate trend in the postintervention period at the system level and in hospital A. It highlights the potential benefit of directing prevention efforts to facilities with the highest burden of excess infections to more efficiently reduce CDI rates.
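The segmented ITS regression behind coefficients like β2 (level change) and β1 + β3 (postintervention trend) can be sketched as follows. This is a minimal illustration on simulated monthly counts, not the authors' actual model or data: all numbers are invented, and ordinary least squares on log rates stands in for the negative-binomial fit used in the study.

```python
import numpy as np

# Simulated 36 months of event counts: 18 pre-intervention, 18 post.
rng = np.random.default_rng(0)
months = np.arange(36)
post = (months >= 18).astype(float)          # 1 from the intervention month on
time_after = post * (months - 18)            # months elapsed since intervention

# Simulated truth: slight upward pre-trend, a level drop at the intervention,
# and a steeper downward trend afterwards (all assumed values).
true_log_rate = 3.0 + 0.01 * months - 0.45 * post - 0.06 * time_after
counts = rng.poisson(np.exp(true_log_rate))

# Design matrix: intercept (b0), baseline trend (b1), level change (b2),
# trend change (b3) -- the same parameterization as the betas in the abstract.
X = np.column_stack([np.ones(36), months, post, time_after])
y = np.log(counts + 0.5)                     # +0.5 guards against log(0)
b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]

# exp() converts log-scale coefficients to multiplicative (percent) changes.
level_change = (np.exp(b2) - 1) * 100        # immediate % change at intervention
post_trend = (np.exp(b1 + b3) - 1) * 100     # % change per month afterwards
print(f"level change at intervention: {level_change:.1f}%")
print(f"post-intervention trend: {post_trend:.1f}% per month")
```

The β2 term captures the immediate drop at the intervention month, while the sum β1 + β3 gives the net monthly slope after it, which is why the abstracts report that combination rather than β3 alone.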

Author(s):  
Nicole C. Vissichelli
Christine M. Orndahl
Jane A. Cecil
Emily M. Hill
Matthew M. Hitchcock
...  

Abstract Objective: To determine whether cascade reporting is associated with a change in meropenem and fluoroquinolone consumption. Design: A quasi-experimental study was conducted using an interrupted time series to compare antimicrobial consumption before and after the implementation of cascade reporting. Setting: A 399-bed, tertiary-care, Veterans Affairs medical center. Participants: Antimicrobial consumption data across 8 inpatient units were extracted from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) antimicrobial use (AU) module from April 2017 through March 2019, reported as antimicrobial days of therapy (DOT) per 1,000 days present (DP). Intervention: Cascade reporting is a strategy of reporting antimicrobial susceptibility test results in which secondary agents are only reported if an organism is resistant to primary, narrow-spectrum agents. A multidisciplinary team developed cascade reporting algorithms for gram-negative bacteria based on the local antibiogram and infectious diseases practice guidelines, aimed at restricting the use of fluoroquinolones and carbapenems. The algorithms were implemented in March 2018. Results: Following the implementation of cascade reporting, mean monthly meropenem (P = .005) and piperacillin/tazobactam (P = .002) consumption decreased, and cefepime consumption increased (P < .001). Ciprofloxacin consumption decreased by 2.16 DOT per 1,000 DP per month (SE, 0.25; P < .001). Clostridioides difficile rates did not significantly change. Conclusion: Ciprofloxacin consumption significantly decreased after the implementation of cascade reporting. Mean meropenem consumption decreased after cascade reporting was implemented, but we observed no significant change in the slope of consumption. Cascade reporting may be a useful strategy to optimize antimicrobial prescribing.


Author(s):  
Amy L Pakyz
Christine M Orndahl
Alicia Johns
David W Harless
Daniel J Morgan
...  

Abstract Background The Centers for Medicare and Medicaid Services (CMS) implemented a core measure sepsis (SEP-1) bundle in 2015. One element was initiation of broad-spectrum antibiotics within 3 hours of diagnosis. The policy has the potential to increase antibiotic use and Clostridioides difficile infection (CDI). We evaluated the impact of SEP-1 implementation on broad-spectrum antibiotic use and CDI occurrence rates. Methods Monthly adult antibiotic data for 4 antibiotic categories (surgical prophylaxis, broad-spectrum for community-acquired infections, broad-spectrum for hospital-onset/multidrug-resistant [MDR] organisms, and anti–methicillin-resistant Staphylococcus aureus [MRSA]) from 111 hospitals participating in the Clinical Data Base Resource Manager were evaluated in periods before (October 2014–September 2015) and after (October 2015–June 2017) policy implementation. Interrupted time series analyses, using negative binomial regression, evaluated changes in antibiotic category use and CDI rates. Results At the hospital level, there was an immediate increase in the level of broad-spectrum agents for hospital-onset/MDR organisms (+2.3%, P = .0375) as well as a long-term increase in trend (+0.4% per month, P = .0273). There was also an immediate increase in level of overall antibiotic use (+1.4%, P = .0293). CDI rates unexpectedly decreased at the time of SEP-1 implementation. When analyses were limited to patients with sepsis, there was a significant level increase in use of all antibiotic categories at the time of SEP-1 implementation. Conclusions SEP-1 implementation was associated with immediate and long-term increases in broad-spectrum hospital-onset/MDR organism antibiotics. Antimicrobial stewardship programs should evaluate sepsis treatment for opportunities to de-escalate broad therapy as indicated.


Thorax
2017
Vol 73 (3)
pp. 262-269
Author(s):
Omar Okasha
Hanna Rinta-Kokko
Arto A Palmu
Esa Ruokokoski
Jukka Jokinen
...  

Introduction: Limited data are available on population-level herd effects of infant 10-valent pneumococcal conjugate vaccine (PCV10) programmes on pneumonia. We assessed national trends in pneumococcal and all-cause pneumonia hospitalisations in adults aged ≥18 years before and after infant PCV10 introduction in 2010. Methods: Monthly hospitalisation rates of International Statistical Classification of Diseases, 10th revision (ICD-10)-coded primary discharge diagnoses compatible with pneumonia from 2004–2005 to 2014–2015 were calculated with population denominators from the population register. Trends in pneumonia before and after PCV10 introduction were assessed with interrupted time-series analysis. Rates during the PCV10 period were estimated from an adjusted negative binomial regression model and compared with those projected as a continuation of the pre-PCV10 trend. All-cause hospitalisations were assessed for control purposes. Results: Before PCV10, the all-cause pneumonia rate in adults aged ≥18 years increased annually by 2.4%, followed by a 4.7% annual decline during the PCV10 period. In 2014–2015, the overall all-cause pneumonia hospitalisation rate was 109.3/100 000 (95% CI 96.5 to 121.9), or 15.4% lower than the expected rate. A significant 6.7% decline was seen in persons aged ≥65 years (131.5/100 000), which translates to 1456 fewer pneumonia hospitalisations annually. In comparison, hospitalisations other than pneumonia decreased by 3.5% annually throughout the entire study period. Conclusion: These national data suggest that herd protection from the infant PCV10 programme has reversed the increasing trend and substantially decreased all-cause pneumonia hospitalisations in adults, particularly the elderly.
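The observed-versus-projected comparison in this kind of analysis reduces to simple arithmetic. In the sketch below, only the 131.5/100 000 monthly rate and the 6.7% decline come from the abstract; the population denominator is an assumption for illustration, not the study's figure, so the averted-count output is approximate.

```python
# Back-of-the-envelope version of the observed-vs-projected comparison.
observed_rate = 131.5           # hospitalisations per 100,000 per month, age >=65 (reported)
pct_below_projection = 6.7      # reported % decline vs the projected pre-PCV10 trend
projected_rate = observed_rate / (1 - pct_below_projection / 100)

population_65plus = 1_200_000   # ASSUMED size of the >=65 age group
monthly_averted = (projected_rate - observed_rate) * population_65plus / 100_000
annual_averted = 12 * monthly_averted
print(f"projected rate: {projected_rate:.1f} per 100,000 per month")
print(f"hospitalisations averted per year: {annual_averted:.0f}")
```

With these assumptions the result lands in the same range as the 1456 averted hospitalisations reported above; the exact figure depends on the true age-group denominator.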


2020
Vol 41 (S1)
pp. s105-s107
Author(s):
Tracy Louis
Sandi Hyde

Background: Evidence-based best practices are available for the reduction and prevention of Clostridioides difficile infection (CDI), but these practices are often not consistently followed in inpatient care settings. A learning collaborative model resulted in a cost-neutral, rapid, sustainable, statistically significant reduction in CDI events across an 88-hospital-campus system without requiring hospitals to standardize laboratory methods, increase spending, or increase staffing. Methods: In March 2018, a healthcare system with 88 critical access and community hospital campuses across 29 states participated in a harms-reduction learning collaborative. The collaborative format included educational webinars, gap analyses, action plans, and coaching calls facilitated by subject matter experts (SMEs). A collaborative cohort of 11 hospitals (55% rural*) was identified as having significant opportunity for improvement. These facilities participated in 3 monthly coaching calls, which supported peer-to-peer sharing of practices and discussion of challenges and successes; educational materials and presentations were provided by SMEs in pharmacy and infection prevention. Results: Using a 2-proportion, 2-tailed z test, statistically significant changes were found for the 88-hospital system as a whole: (1) 2018 compared to 2017, P < .001; (2) 1H2018 compared to 2H2018 (before and after the collaborative), P = .001; (3) 2019 compared to 2018, P < .001. Statistically significant changes for the collaborative cohort: (1) 2018 compared to 2017, P < .001; (2) 1H2018 compared to 2H2018 (before and after the collaborative), P = .002; (3) 2019 compared to 2018, P < .001.
Conclusions: Using a learning collaborative model that included webinars, gap analyses, and interactive coaching calls, a cohort of 11 hospitals induced rapid improvements in adherence to evidence-based practices, resulting in a rapid, sustained, statistically significant improvement for both the cohort hospitals and the healthcare system. *2018 American Community Survey, US Census. Funding: None. Disclosures: None.
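The 2-proportion, 2-tailed z test used in the analysis above can be sketched in a few lines. The event counts and denominators below are invented for illustration; the abstract does not report the study's actual counts.

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z test with a two-tailed p-value."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                     # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-tailed p-value from the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2)))/2.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical CDI events per patient-days in two comparison periods:
z, p = two_proportion_z(120, 100_000, 80, 100_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up counts the drop from 120 to 80 events per 100,000 is significant at the .01 level, mirroring the style of comparison reported for the yearly and half-yearly periods above.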


Author(s):  
María Cristina Hoyos
Doracelly Hincapié-Palacio
Jesus Ochoa
Alba León

Background: In Latin America, there are few studies of the impact of vaccination against diphtheria, tetanus, and pertussis. We estimated the impact of infant and maternal vaccination on the incidence of these diseases in Colombia. Design and methods: An interrupted time series study analyzing incidence before and after childhood vaccination with DwPT (1975-2018) and Tdap vaccination of pregnant women (2008-2018). A segmented regression model with a negative binomial distribution estimated the change in level and trend of the predicted incidence ratio after vaccination relative to the incidence had vaccination not been started (IRR), using Prais-Winsten regression. Results: The pertussis IRR decreased immediately after the start of childhood vaccination (0.91, p=0.51), but the change became significant (1.01, p<0.001), along with the trend per year, only after the start of maternal vaccination (0.98, p<0.001). In the absence of vaccination, the incidence would not have been reduced. Neonatal tetanus had the highest rate of change, with a significant reduction (-1.69; 95% CI: -2.91, -0.48). Its post-vaccination trend was the largest, with an annual reduction of 19% (0.81, p=0.001). The change in incidence of diphtheria was significant, although slow (-0.02; 95% CI: -0.04, -0.004). The sustained effect in the post-vaccination period was smaller (0.95, p=0.79). Conclusions: Childhood and maternal vaccination markedly reduced the incidence of pertussis and neonatal tetanus. It is necessary to maintain optimal vaccination coverage and surveillance within an integrated elimination plan to prevent the resurgence of these diseases.


2020
Vol 7 (Supplement_1)
pp. S110-S110
Author(s):
Christina Maguire
Dusten T Rose
Theresa Jaso

Abstract Background: Automatic antimicrobial stop orders (ASOs) are a stewardship initiative used to decrease days of therapy, prevent resistance, and reduce drug costs. Limited evidence outside of the perioperative setting exists on the effects of ASOs on broad-spectrum antimicrobial use, discharge prescription duration, and effects of missed doses. This study aimed to evaluate the impact of an ASO policy across a health system of adult academic and community hospitals for treatment of intra-abdominal infections (IAI) and urinary tract infections (UTI). Methods: This multicenter retrospective cohort study compared patients with IAI and UTI treated before and after implementation of an ASO. Patients over the age of 18 with a diagnosis of UTI or IAI and 48 hours of intravenous (IV) antimicrobial administration were included. Patients unable to achieve IAI source control within 48 hours and those with a concomitant infection were excluded. The primary outcome was the difference in sum length of antimicrobial therapy (LOT). Secondary endpoints included length and days of antimicrobial therapy (DOT) at multiple timepoints; all-cause in-hospital mortality and readmission; and adverse events such as rates of Clostridioides difficile infection. Outcomes were also evaluated by type of infection, hospital site, and presence of an infectious diseases (ID) pharmacist on site. Results: This study included 119 patients in the pre-ASO group and 121 patients in the post-ASO group. The ASO shortened sum LOT (12 days vs 11 days, respectively; p=0.0364) and sum DOT (15 days vs 12 days, respectively; p=0.022). This finding appears to be driven by decreases in outpatient LOT (p=0.0017) and outpatient DOT (p=0.0034). Conversely, the ASO extended empiric IV LOT (p=0.005). All other secondary outcomes were not significant. Ten patients missed doses of antimicrobials due to the ASO.
Subgroup analyses suggested that one hospital may have influenced outcomes, and the reduction in LOT was observed primarily at sites without an ID pharmacist (p=0.018). Conclusion: While implementation of an ASO decreased the sum length of inpatient and outpatient therapy, it may not influence inpatient length of therapy alone. Moreover, ASOs prolonged use of empiric intravenous therapy. Hospitals without an ID pharmacist may benefit most from ASO protocols. Disclosures: All authors: No reported disclosures.


2020
Vol 41 (S1)
pp. s401-s401
Author(s):
Cindy Hou
Shannon Davila
Mary Miller
Ashlee Hiester
Katherine Hosmer
...  

Background: Infection preventionists (IPs) are the backbone of the quality and safety matrix of their organizations. Tools that help locate potential gaps can provide unique viewpoints from frontline staff. The CDC provides a Targeted Assessment for Prevention (TAP) strategy that identifies vulnerabilities in the prevention of healthcare-associated infections (HAIs). Methods: A statewide quality improvement organization, partnering with the CDC TAP team, administered TAP facility assessments for catheter-associated urinary tract infection (CAUTI), central-line–associated bloodstream infection (CLABSI), and Clostridioides difficile infection (CDI) to a collaborative of 15 acute-care and 2 long-term acute-care hospitals. More than 800 respondents completed surveys based on their individual perceptions of infection prevention practices. Results: The survey results yielded the following lagging indicators: lack of awareness of nursing and physician champions, need for competency-based training on clinical equipment, and need for feedback on device utilization. At the hospital-system level, one improvement team focused on CDI uncovered leading and lagging areas in general infrastructure, antibiotic stewardship, early detection and appropriate testing, contact precautions, and environmental cleaning. To culminate the TAP collaborative, the cohort of organizations, supported by interdisciplinary teams, participated in a full-day TAP workshop in which they reviewed detailed analyses of their HAI data and assessment results, shared best practices for infection prevention, and planned specific improvement projects using the plan-do-study-act model. Conclusions: Results of a statewide analysis of HAI prevention data and opportunities at a local level were reviewed. The TAP strategy can be used to target opportunities for improvement, to assess gaps in practice, and to develop and implement interventions for improving outcomes.
Healthcare facilities and quality improvement organizations can drive infection prevention actions. Funding: None. Disclosures: None.


2020
Vol 41 (S1)
pp. s439-s439
Author(s):  
Valerie Beck

Background: It is well known that contaminated surfaces contribute to the transmission of pathogens in healthcare settings, creating a need for antimicrobial strategies beyond routine cleaning with momentary disinfectants. A recent publication demonstrated that application of a novel, continuously active antimicrobial surface coating in ICUs resulted in a reduction of healthcare-associated infections. Objective: We determined the general microbial bioburden and the incidence of relevant pathogens present in patient rooms at 2 metropolitan hospitals before and after application of a continuously active antimicrobial surface coating. Methods: A continuously active antimicrobial surface coating was applied to patient rooms in intensive care units (ICUs) twice over an 18-month period and in non-ICUs twice over a 6-month study period. The environmental bioburden was assessed 8–16 weeks after each treatment. A 100-cm2 area was swabbed from frequently touched areas in patient rooms: patient chair arm rest, bed rail, TV remote, and backsplash behind the sink. The total aerobic bacteria count was determined for each location by enumeration on tryptic soy agar (TSA); the geometric mean was used to compare bioburden before and after treatment. Each sample was also plated on selective agar for carbapenem-resistant Enterobacteriaceae (CRE), vancomycin-resistant enterococci (VRE), methicillin-resistant Staphylococcus aureus (MRSA), and Clostridioides difficile to determine whether pathogens were present. Pathogen incidence was calculated as the percentage of total sites positive for at least 1 of the 4 target organisms. Results: Before application of the antimicrobial coating, total aerobic bacteria counts in ICUs were >1,500 CFU/100 cm2, and at least 30% of the sites were positive for a target pathogen (ie, CRE, VRE, MRSA, or C. difficile). In non-ICUs, the bioburden before treatment was at least 500 CFU/100 cm2, with >50% of sites contaminated with a pathogen.
After successive applications of the surface coating, total aerobic bacteria were reduced by >80% in ICUs and >40% in non-ICUs. Similarly, the incidence of pathogen-positive sites was reduced by at least 50% in both ICUs and non-ICUs. Conclusions: The use of a continuously active antimicrobial surface coating provided a significant (P < .01) and sustained reduction in aerobic bacteria while also reducing the occurrence of epidemiologically important pathogens on frequently touched surfaces in patient rooms. These findings support the use of novel antimicrobial technologies as an additional layer of protection against the transmission of potentially harmful bacteria from contaminated surfaces to patients. Funding: Allied BioScience provided funding for this study. Disclosures: Valerie Beck reports salary from Allied BioScience.


SLEEP
2021
Vol 44 (Supplement_2)
pp. A141-A141
Author(s):
Hrishikesh Kale
Rezaul Khandker
Ruchit Shah
Marc Botteman
Weilin Meng
...  

Abstract Introduction: Use of benzodiazepines to treat insomnia has been associated with serious side effects and abuse potential. Insomnia patients are at high risk of opioid abuse, and better sleep patterns may help to reduce opioid use. This study examined trends in the use of benzodiazepines and prescription opioids before and after initiation of suvorexant in insomnia patients. Methods: The study analyzed 2015–2019 data from the Optum Clinformatics Data Mart. Insomnia patients identified using ICD-9/10 codes and prescribed suvorexant were included. The study included incident (newly diagnosed) and prevalent cohorts of insomnia patients. The proportion of patients on benzodiazepines or prescription opioids was calculated for 12 monthly intervals before (pre-period) and after (post-period) initiation of suvorexant. Interrupted time series (ITS) analysis was conducted to assess trends in benzodiazepine and prescription opioid use over time. Results: A total of 5,939 patients from the incident insomnia cohort and 18,920 from the prevalent cohort were included. For the incident cohort, mean age was 64.47 (SD: 15.48), 63% were female, 71% had Medicare Advantage coverage, 59% had a Charlson comorbidity index (CCI) score ≥ 1, 27% had an anxiety disorder, and 16% had a substance abuse disorder. The prevalent insomnia cohort was similar but had a higher CCI. ITS results suggested that at the beginning of the pre-period, 28% of incident insomnia patients used either opioids or benzodiazepines, with the rate of use increasing by 0.11% per month during the pre-period. In the post-period, the rate of use decreased by 0.33% per month. About 26% of patients used benzodiazepines or opioids at 12 months after suvorexant initiation; in the absence of suvorexant, this proportion would have been 31%. Similar findings were observed for the prevalent insomnia cohort. A larger decrease was observed for opioid use than for benzodiazepine use.
Conclusion: The rate of benzodiazepine or prescription opioid use decreased over time after the initiation of suvorexant. Suvorexant has the potential to reduce the use of opioids and benzodiazepines among insomnia patients. Further research is needed to confirm these findings. Support (if any): This study was sponsored by Merck Sharp & Dohme Corp., a subsidiary of Merck & Co., Inc., Kenilworth, NJ, USA.


2011
Vol 56 (2)
pp. 989-994
Author(s):
C. Plüss-Suard
A. Pannatier
C. Ruffieux
A. Kronenberg
K. Mühlemann
...  

ABSTRACT The original cefepime product was withdrawn from the Swiss market in January 2007 and replaced by a generic 10 months later. The goals of the study were to assess the impact of this cefepime shortage on the use and costs of alternative broad-spectrum antibiotics, on antibiotic policy, and on resistance of Pseudomonas aeruginosa toward carbapenems, ceftazidime, and piperacillin-tazobactam. A generalized regression-based interrupted time series model assessed how much the shortage changed the monthly use and costs of cefepime and of selected alternative broad-spectrum antibiotics (ceftazidime, imipenem-cilastatin, meropenem, piperacillin-tazobactam) in 15 Swiss acute care hospitals from January 2005 to December 2008. Resistance of P. aeruginosa was compared before and after the cefepime shortage. There was a statistically significant increase in the consumption of piperacillin-tazobactam in hospitals with definitive interruption of cefepime supply and of meropenem in hospitals with transient interruption of cefepime supply. Consumption of each alternative antibiotic tended to increase during the cefepime shortage and to decrease when the cefepime generic was released. These shifts were associated with significantly higher overall costs. There was no significant change in hospitals with uninterrupted cefepime supply. The alternative antibiotics for which an increase in consumption showed the strongest association with a progression of resistance were the carbapenems. The use of alternative antibiotics after cefepime withdrawal was associated with a significant increase in piperacillin-tazobactam and meropenem use and in overall costs, and with a decrease in susceptibility of P. aeruginosa in hospitals. This warrants caution with regard to shortages and withdrawals of antibiotics.

