Antimicrobial Stewardship Standards and Patient Safety: A Case Study in Blood Culture Contamination

2021 ◽  
Vol 1 (S1) ◽  
pp. s36-s36
Author(s):  
Connie Schaefer

Background: Blood culture is a crucial diagnostic tool for healthcare systems, but false-positive results drain clinical resources, imperil patients with an increased length of stay (and associated hospital-acquired infection risk), and undermine global health initiatives when broad-spectrum antibiotics are administered unnecessarily. Given emerging technologies that mitigate human error, we questioned historically acceptable rates of blood culture contamination, which prompted a need to promote and trial these technologies further. In a 3-month trial, 3 emergency departments in a midwestern healthcare system utilized an initial specimen diversion device (ISDD) to draw blood cultures, aiming to bring their blood culture contamination rate (4.4% prior to intervention) below the 3% benchmark recommended by the Clinical & Laboratory Standards Institute. Methods: All emergency department nursing staff received operational training on the ISDD for blood culture sample acquisition. From June through August 2019, 1,847 blood cultures were drawn via the ISDD and 862 via the standard method. Results: In total, 16 contamination events occurred with the ISDD (0.9%) and 37 with the standard method (4.3%). ISDD utilization resulted in an 80% reduction in blood culture contamination from the 4.4% rate observed prior to intervention. Conclusions: A midwestern healthcare system experienced a dramatic reduction in blood culture contamination across 3 emergency departments while pilot testing an ISDD, conserving laboratory and therapeutic resources while minimizing patient exposure to unnecessary risks and procedures. If the results obtained here were sustained and the ISDD utilized for all blood culture draws, nearly 400 contamination events could be avoided annually in this system.
Reducing unnecessary antibiotic use in this manner will lower rates of associated adverse events such as acute kidney injury and allergic reaction, which are possible topics for further investigation. The COVID-19 pandemic has recently highlighted both the importance of keeping hospital beds available and the rampant carelessness with which broad-spectrum antibiotics are administered (escalating the threat posed by multidrug-resistant organisms). As more ambitious healthcare benchmarks become attainable, promoting and adhering to higher standards for patient care will be critical to furthering an antimicrobial stewardship agenda and to reducing treatment inequity in the field. Funding: None. Disclosures: None.
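The contamination rates and the 80% relative reduction reported above follow directly from the stated counts; a minimal check in Python (counts taken from the abstract, rounding as reported):

```python
# Reproduce the contamination rates reported in the abstract.
# Counts (16/1,847 ISDD draws, 37/862 standard draws) and the 4.4%
# baseline come from the text; nothing else is assumed.

def contamination_rate(contaminated: int, total: int) -> float:
    """Contamination rate as a percentage."""
    return 100 * contaminated / total

isdd_rate = contamination_rate(16, 1847)      # ISDD draws
standard_rate = contamination_rate(37, 862)   # standard-method draws

# Relative reduction versus the 4.4% pre-intervention baseline
baseline = 4.4
reduction = 100 * (baseline - isdd_rate) / baseline

print(f"ISDD: {isdd_rate:.1f}%")          # 0.9%
print(f"Standard: {standard_rate:.1f}%")  # 4.3%
print(f"Reduction: {reduction:.0f}%")     # 80%
```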

2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 9535-9535
Author(s):  
B. Kanathezhath ◽  
J. Feusner

9535 Background: Infections continue to be a major cause of morbidity and mortality in pediatric oncology patients (pts) with febrile neutropenia (FN). The proportion of such pts who have bacteremia documented after 72 hours (hrs) of broad-spectrum antibiotics, in the absence of local or systemic signs of infection, has not been previously reported. Methods: We conducted a retrospective analysis of all FN oncology pts admitted to our hospital during the period of August 1999 to October 2006. Blood cultures (BCs) from pts who were persistently febrile more than 3 days after initiation of empiric broad-spectrum antibiotics (ceftazidime and tobramycin) were analyzed. Medical records of pts with positive late blood cultures (LBCs) after 72 hrs were reviewed for onset of new signs and symptoms of infection. Hematopoietic stem cell transplant and HIV pts were excluded. Results: Ninety-seven episodes of persistent fever occurred in 71 FN pts. The total number of positive BCs in the first 72 hours was 24 (33.8%). Three (4.2%) of the persistently febrile pts had positive LBCs. Of these 3 pts, one had preceding new signs and symptoms. Another had a probable contaminant (only 1 positive BC for coagulase-negative staphylococcus). Only one pt (1.4%) had a positive LBC without any new local or systemic signs of infection. The observed frequency of positive LBCs was 4.2% for pts and 0.8% (3/391) for total cultures obtained after 72 hours. There were no changes made in the antibiotic regimen of pts with positive LBCs, and none of them suffered sepsis-related mortality. Conclusions: This is the first report of late blood culture results in FN pediatric oncology pts. The practice of obtaining daily blood cultures in such pts who are stable after 72 hrs of broad-spectrum antibiotics has a low yield (<5%), and even lower (<2%) if pts with new signs or symptoms at the LBC are excluded.
This observation, if confirmed by larger studies from other centers, could lead to a more efficient, risk-based strategy for following these pts. No significant financial relationships to disclose.


2021 ◽  
Vol 73 (6) ◽  
pp. 406-412
Author(s):  
Tharntip Sangsuwan ◽  
Rungtip Darayon ◽  
Silom Jamulitrat

Objective: To determine blood culture contamination rates and display them with a g-chart. Materials and Methods: The medical records of patients from whom blood cultures were obtained in a university hospital between January and December 2019 were retrieved and reviewed for contamination. The Centers for Disease Control and Prevention (CDC) criteria were used to classify the blood culture results. The contamination rates were illustrated with a g-chart. Results: We identified 331 false-positive blood cultures among 32,961 cultured specimens, yielding a contamination rate of 1.0% (95% CI, 0.9%–1.1%). The most contamination events occurred in the emergency department (49.2%), pediatric ICU (5.2%), and neonatal ICU (4.8%), respectively. The most common commensal bacterial genera were coagulase-negative staphylococci (67.1%), Bacillus spp. (10.2%), and Corynebacterium spp. (7.6%), respectively. The g-charts identified 14 abnormal variations across 41 locations. Conclusion: The contamination rates found were within the ranges of other reports. G-charts are simple to construct, easy to interpret, and sensitive for real-time detection of epidemics.
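A g-chart plots the count of specimens between successive rare events (here, contaminations), with control limits derived from the geometric distribution. A minimal sketch, assuming the standard centerline-at-the-mean-gap formulation with upper limit ḡ + 3·√(ḡ(ḡ+1)); the gap data are hypothetical, not the study's:

```python
# Minimal g-chart construction for counts between contamination events.
# The gaps below are hypothetical illustrations; the control-limit
# formula is the standard geometric-distribution version for
# rare-event charts.
import math

def g_chart_limits(gaps):
    """Centerline and upper control limit for a g-chart."""
    g_bar = sum(gaps) / len(gaps)  # mean specimens between events
    ucl = g_bar + 3 * math.sqrt(g_bar * (g_bar + 1))
    return g_bar, ucl

# Hypothetical gaps: cultures drawn between consecutive contaminations
gaps = [120, 85, 260, 40, 150, 95, 210, 60]
center, ucl = g_chart_limits(gaps)

# A gap above the UCL signals unusually *good* performance; a run of
# short gaps flags a possible contamination cluster worth investigating.
signals = [g for g in gaps if g > ucl]
print(f"centerline={center:.0f}, UCL={ucl:.0f}, signals={signals}")
```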


2020 ◽  
Vol 41 (S1) ◽  
pp. s321-s321
Author(s):  
Stephanie Shealy ◽  
Joseph Kohn ◽  
Emily Yongue ◽  
Casey Troficanto ◽  
Brandon Bookstaver ◽  
...  

Background: Hospitals in the United States have been encouraged to report antimicrobial use (AU) to the CDC NHSN since 2011. Through the NHSN Antimicrobial Use Option module, health systems may compare standardized antimicrobial administration ratios (SAARs) across specific facilities, patient care locations, time periods, and antimicrobial categories. To date, participation in the NHSN Antimicrobial Use Option remains voluntary, and the value to multihospital healthcare systems of reporting antimicrobial use and receiving monthly SAARs has not been clearly demonstrated. In this cohort study, we examined potential applications of SAARs within a healthcare system comprising multiple local hospitals. Methods: Three hospitals within Prisma Health-Midlands (hospitals A, B, and C) became participants in the NHSN Antimicrobial Use Option in July 2017. SAAR reports were presented initially in October 2017 and regularly (every 3–4 months) thereafter during interprofessional antimicrobial stewardship system-wide meetings until the end of the study in June 2019. Through interfacility comparisons and by analyzing SAAR categories in specific patient-care locations, primary healthcare providers and pharmacists were advised to incorporate results into focused antimicrobial stewardship initiatives within their facility. Specific alerts were designed to promote early de-escalation of antipseudomonal β-lactams and vancomycin. The Student t test was used to compare mean SAARs in the preintervention period (July through October 2017) to the postintervention period (November 2017 through June 2019) for all antimicrobials and for specific categories and locations within each hospital. Results: During the preintervention period, mean SAARs for all antimicrobials in hospitals A, B, and C were 0.69, 1.09, and 0.60, respectively.
Notably, mean SAARs at hospitals A, B, and C in intensive care units (ICUs) during the preintervention period were 0.67, 1.36, and 0.83, respectively, for broad-spectrum agents used for hospital-onset infections and 0.59, 1.27, and 0.68, respectively, for agents used for resistant gram-positive infections. After antimicrobial stewardship interventions, the mean SAAR for all antimicrobials in hospital B decreased from 1.09 to 0.83 in the postintervention period (P < .001). Mean SAARs decreased from 1.36 to 0.81 for broad-spectrum agents used for hospital-onset infections and from 1.27 to 0.72 for agents used for resistant gram-positive infections in the ICU at hospital B (P = .03 and P = .01, respectively). No significant changes were noted in hospitals A and C. Conclusions: Reporting AU to the CDC NHSN and assessing SAARs across hospitals in a healthcare system had motivational effects on antimicrobial stewardship practices. Enhancement and customization of antimicrobial stewardship interventions were associated with significant and sustained reductions in SAARs for all antimicrobials and for specific antimicrobial categories at those locations. Funding: None. Disclosures: None.
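The SAAR is simply observed antimicrobial days of therapy divided by the days predicted by the NHSN risk model, and the pre/post comparison above is a two-sample t test on monthly SAARs. A sketch under stated assumptions: the monthly values below are hypothetical (chosen so the means match hospital B's reported 1.09 and 0.83), and the t statistic is Welch's, computed directly rather than via a statistics package:

```python
# SAAR = observed days of therapy / NHSN-predicted days of therapy.
# Monthly values below are hypothetical illustrations, not study data.
from statistics import mean, stdev
import math

def saar(observed_dot: float, predicted_dot: float) -> float:
    """Standardized antimicrobial administration ratio."""
    return observed_dot / predicted_dot

def welch_t(a, b):
    """Welch's two-sample t statistic (no p-value lookup here)."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# Hypothetical monthly SAARs for hospital B, pre vs post intervention
pre = [1.05, 1.12, 1.08, 1.11]
post = [0.85, 0.80, 0.84, 0.82, 0.81, 0.86]

t = welch_t(pre, post)
print(f"pre mean={mean(pre):.2f}, post mean={mean(post):.2f}, t={t:.1f}")
```

A large t on these illustrative numbers mirrors the abstract's P < .001; the real analysis would use the hospitals' actual monthly SAARs.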


2021 ◽  
Vol 30 (1) ◽  
pp. 87-91
Author(s):  
Tamer Mohamed ◽  
Ashraf A Askar ◽  
Jamila Chahed

Background: Bloodstream infections are major causes of morbidity and mortality in hospitalized patients. Increasing the awareness of clinicians and nurses about the proper blood culture protocol is very important in reducing the contamination rate and the unnecessary requesting of blood cultures. Objectives: To reduce the contamination rate and the unnecessary requesting of blood cultures from different departments through implementation of a hospital-wide Quality Improvement Project (QIP). Methodology: Blood cultures were tested in the Microbiology Laboratory of Najran Armed Forces Hospital, Saudi Arabia, in the period from June 2019 to July 2020, and their results were compared before and after the implementation of the QIP. Results: The comparison of blood culture results before and after QIP implementation showed a statistically significant 19.6% reduction in the contamination rate, a 14% reduction in the total number of blood culture requests, and an 11.6% reduction in the negative-results rate. Conclusion: The reductions in the total number of requests, negative results, and contamination rate of blood cultures after QIP implementation serve as performance indicators that the QIP recommendations were effective and strictly implemented.


2019 ◽  
Vol 152 (Supplement_1) ◽  
pp. S133-S133
Author(s):  
Kemin Xu ◽  
Sarwat Gilani ◽  
Hank Wang ◽  
John Fallon

Abstract Objectives Blood culture is one of the most important tests performed in clinical microbiology laboratories. However, blood culture contamination remains a problematic cause of diagnostic errors for laboratory diagnosis and patient management. The aim of this study was to determine blood culture contamination rates and trends at Westchester Medical Center (WMC), a tertiary teaching hospital in suburban New York City. Methods All blood culture tests at WMC received from January 2017 to December 2018, as well as some historical data from 2007 to 2014, were retrospectively retrieved. Blood culture contamination rates were determined according to the laboratory’s predefined criteria. Results From 2007 to 2014, a total of 209,750 blood cultures were performed, with an average contamination rate of 1.6% (ranging from 0.4% to 3.5% monthly). The total numbers of blood cultures performed in 2017 and 2018 were 27,863 and 28,047, respectively. The overall positivity rate of blood culture was 6.8% in 2017 and 7.6% in 2018. The contamination rate of blood culture was 0.6% in 2017 and 0.9% in 2018, with little month-to-month variation, significantly lower than the national benchmark (~2.5%) for blood culture contamination. The majority of contaminants were Staphylococcus epidermidis, accounting for 87% of source contamination, followed by Corynebacterium species (5.5%), Bacillus species and Micrococcus species (3.5% each), and Propionibacterium species (0.5%). Conclusion Adherence to current guidelines on appropriate blood collection techniques, along with monthly monitoring and timely feedback to phlebotomists, should be continued to keep the blood culture contamination rate low, not only from the perspective of individual patient care but also from the standpoint of hospital epidemiology and public health.


2020 ◽  
Vol 36 (1) ◽  
pp. 51-57
Author(s):  
Kevin G. Buell ◽  
Jonathan D. Casey ◽  
Michael J. Noto ◽  
Todd W. Rice ◽  
Matthew W. Semler ◽  
...  

Background: The optimal timing for the de-escalation of broad-spectrum antibiotics with activity against Pseudomonas aeruginosa and resistant Gram-negative rods (GNRs) in critically ill adults remains unknown. Research Question: We tested the hypothesis that cultures will identify GNRs that ultimately demonstrate resistance to ceftriaxone within 48 hours, potentially allowing safe de-escalation at this time point. Study Design and Methods: We conducted a secondary analysis of data from the Isotonic Solutions and Major Adverse Renal Events Trial: a pragmatic, cluster-randomized, multiple-crossover trial comparing balanced crystalloids versus saline for intravenous fluid administration in 15,802 critically ill adults at 5 intensive care units (ICUs) at Vanderbilt University Medical Center in Nashville, TN, USA. The primary endpoint was the time-to-positivity of respiratory and blood cultures that ultimately demonstrated growth of GNRs resistant to ceftriaxone. Multivariable logistic regression modeling was used to examine risk factors for the growth of cultures after 48 hours. Results: A total of 524 respiratory cultures had growth of GNRs, of which 284 (54.2%) had resistance to ceftriaxone. A total of 376 blood cultures grew GNRs, of which 70 (18.6%) had resistance to ceftriaxone. At 48 hours, 87% of respiratory cultures and 85% of blood cultures that ultimately grew GNRs resistant to ceftriaxone had demonstrated growth. Age, gender, predicted risk of inpatient mortality and prior use of antibiotics did not predict the growth of cultures after 48 hours. Interpretation: Among a cohort of critically ill adults, 13% of respiratory cultures and 15% of blood cultures that ultimately grew GNRs resistant to ceftriaxone did not demonstrate growth until at least 48 hours after collection. 
Further work is needed to determine the ideal time for critically ill adults to de-escalate from broad-spectrum antibiotics targeting Pseudomonas aeruginosa and extended-spectrum β-lactamase-producing gram-negative pathogens.
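The 48-hour yield above is simply the share of ultimately ceftriaxone-resistant isolates whose cultures had flagged positive by that cutoff. A minimal sketch with hypothetical times-to-positivity (the study's individual culture times are not in the abstract):

```python
# Fraction of resistant cultures detected by a given cutoff.
# Times-to-positivity below are hypothetical, in hours.

def detected_by(times_h, cutoff_h=48):
    """Proportion of cultures that flagged positive by cutoff_h hours."""
    return sum(t <= cutoff_h for t in times_h) / len(times_h)

# Hypothetical times for ten ceftriaxone-resistant isolates
times = [12, 18, 22, 30, 36, 40, 44, 47, 60, 72]

frac = detected_by(times)
print(f"{frac:.0%} detected by 48 h; {1 - frac:.0%} would be missed "
      "if antibiotics were de-escalated at that point")
```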


2020 ◽  
Vol 105 (9) ◽  
pp. e23.1-e23
Author(s):  
Orlagh McGarrity ◽  
Aliya Pabani

Introduction, Aims and Objectives In 2011 the Start Smart then Focus campaign was launched by Public Health England (PHE) to combat antimicrobial resistance.1 The ‘focus’ element refers to the antimicrobial review at 48–72 hours, when a decision about infection management should be made and documented. At this tertiary/quaternary paediatric hospital we treat immunocompromised, high-risk patients. A recent audit identified that 80% of antimicrobial use is IV; this may be due to several factors, including good central access, centrally prepared IV therapy, and oral agents being challenging to administer to children. The aim of this audit was to assess whether patients have a blood culture taken prior to starting therapy, whether they have a senior review at 48–72 hours, and whether our high proportion of intravenous antimicrobial use is justified. Method Electronic prescribing data from JAC were collected retrospectively over an 8-day period. IV antimicrobials for which there is a suitable oral alternative (defined as >80% bioavailability) were included. Patients in the ICU, cancer, and transplant settings were excluded, as were those with absorption issues and those with a high-risk infection such as endocarditis or bacteraemia. Patients were assessed against set criteria to determine whether they were eligible to switch from IV to PO therapy: afebrile, stable blood pressure, heart rate <90/min, and respiratory rate <20/min for 24 hours.
Additional criteria were a falling CRP, a falling white cell count, and blood cultures negative or sensitive to an antibiotic that can be given orally. Results 100% of patients (n=11) had blood cultures taken within 72 hours of starting therapy; 55% of patients had a positive blood culture; 82% of patients had a senior review at 48–72 hours; 46% of patients were eligible to switch from IV to PO therapy at 72 hours; and 33% of eligible patients were switched from IV to PO therapy at 72 hours. Conclusion and Recommendations This audit had a low sample size due to the complexity of the inclusion and exclusion criteria and the difficulty of reviewing patient parameters across many different hospital interfaces. It is known that each patient is reviewed at least every 24 hours on most wards, so there is a need for improved documentation of prescribing decisions. Implementation of an IV-to-oral switch guideline is recommended to support prescribing decisions and to educate and reassure clinicians on the bioavailability and benefits of PO antimicrobial therapy where appropriate. Having recently changed electronic patient management systems, strategies to explore include hard stops on IV antimicrobial therapies, although this will require careful consideration. Education of pharmacists and nurses is required to raise awareness about antimicrobial resistance and the benefits of IV-to-PO switches, despite the ease of IV therapy at our Trust. This will promote a culture in which all healthcare professionals are active antimicrobial guardians, leading to better patient outcomes, fewer service pressures, and long-term financial benefit. Reference GOV.UK. 2019. Antimicrobial stewardship: Start smart - then focus. [ONLINE] Available at: https://www.gov.uk/government/publications/antimicrobial-stewardship-start-smart-then-focus [Accessed 3 July 2019]
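The switch criteria listed above are a simple conjunction of clinical rules, so the eligibility assessment can be sketched as a rule-based check. A minimal sketch, assuming illustrative field names (not from any real system):

```python
# Rule-based check of the IV-to-oral switch criteria described above.
# Thresholds mirror the audit criteria; dictionary field names are
# hypothetical illustrations, not a real EPR schema.

def eligible_for_po_switch(pt: dict) -> bool:
    """True only if the patient meets every switch criterion."""
    return (
        pt["afebrile_24h"]                # afebrile for 24 hours
        and pt["bp_stable_24h"]           # stable blood pressure
        and pt["heart_rate"] < 90         # beats/min
        and pt["resp_rate"] < 20          # breaths/min
        and pt["crp_falling"]             # reducing CRP
        and pt["wcc_falling"]             # reducing white cell count
        and pt["culture_ok_for_oral"]     # negative or orally sensitive
    )

patient = {
    "afebrile_24h": True, "bp_stable_24h": True,
    "heart_rate": 82, "resp_rate": 16,
    "crp_falling": True, "wcc_falling": True,
    "culture_ok_for_oral": True,
}
print(eligible_for_po_switch(patient))  # True
```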


2021 ◽  
Vol 14 ◽  
pp. 73-76
Author(s):  
Blake Buzard ◽  
Patrick Evans ◽  
Todd Schroeder

Introduction: Blood cultures are the gold standard for identifying bloodstream infections. The Clinical and Laboratory Standards Institute recommends a blood culture contamination rate of <3%. Contamination can lead to misdiagnosis, increased length of stay and hospital costs, and unnecessary testing and antibiotic use. These concerns led to the development of initial specimen diversion devices (ISDDs). The purpose of this study was to evaluate the impact of an initial specimen diversion device on rates of blood culture contamination in the emergency department.  Methods: This was a retrospective, multisite study including patients who had blood cultures drawn in an emergency department. February 2018 to April 2018, when an ISDD was not utilized, was compared with June 2019 to August 2019, a period when an ISDD was being used. The primary outcome was total blood culture contamination. Secondary outcomes were total hospital cost, hospital and intensive care unit length of stay, vancomycin days of use, vancomycin serum concentrations obtained, and repeat blood cultures obtained.  Results: A statistically significant difference was found in blood culture contamination rates between the pre-ISDD group and the ISDD group (7.47% vs 2.59%, p<0.001). None of the secondary endpoints showed a statistically significant difference. Conclusions: Implementation of an ISDD reduces blood culture contamination in a statistically significant manner. However, we were unable to detect any statistically significant differences in the secondary outcomes.
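A comparison like 7.47% vs 2.59% is typically tested with a two-proportion z-test. A minimal sketch: the abstract does not give the raw counts, so the counts below are hypothetical values chosen to match the reported rates:

```python
# Two-proportion z-test, as commonly used to compare contamination
# rates between two periods. The raw counts are not in the abstract;
# those below are hypothetical, chosen to match 7.47% and 2.59%.
import math

def two_prop_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2 (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))      # two-sided normal tail
    return z, p_value

z, p = two_prop_z(97, 1298, 48, 1853)   # ~7.47% vs ~2.59%
print(f"z={z:.1f}, p={p:.2g}")
```

With counts of this magnitude the p-value is far below 0.001, consistent with the study's reported significance.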


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S432-S432
Author(s):  
Alexander G Hosse

Abstract Background Blood cultures are the gold standard for diagnosing bloodstream infections and a vital part of the work-up in systemic infections. However, contamination of blood cultures represents a significant burden on patients and the healthcare system through increased hospital length of stay, unnecessary antibiotics, and financial cost. The data discussed here offer insight into blood culture contamination rates before and through the COVID-19 pandemic at a community hospital and the processes that were affected by the pandemic. Methods Blood culture contaminations were determined using the number of sets of blood cultures with growth and the presence of an organism on the National Healthcare Safety Network (NHSN) commensal organism list. Contamination rates were evaluated by status as a standard unit or a COVID-19 isolation unit in either the emergency department (ED) or inpatient floor units. The four identified groups had different processes for drawing blood cultures, particularly in terms of staff training in the use of diversion devices. The electronic medical record was used to track contaminations and the use of diversion devices in the different units. Results The inpatient COVID units were consistently elevated above the other units and the institutional contamination goal of 2.25%, ranging from 9.6% to 13.3% from 4/2020 to 9/2020. Those units were the primary driver of the increase in overall contamination rates. COVID ED nursing staff (who had previously undergone training in the use of diversion devices) used diversion devices to draw 51 of 133 (38.3%) cultures, compared to only 15 of 84 (17.9%) on the COVID inpatient units. Figure 1. Comparison of contamination rates in the ED vs the inpatient units from all campuses from September 2019 through September 2020. The blue line represents the hospital goal of a 2.25% contamination rate.
Solid lines represent total contamination rates including COVID isolation units, whereas dotted lines represent units excluding COVID isolation units. Figure 2. Comparison of the non-COVID vs COVID isolation units in the emergency department and inpatient units. The red line represents the hospital goal of less than 2.25% for the blood culture contamination rate. Figure 3. Raw data from Figure 2: table of contaminants vs total collected blood cultures in each unit by month, compared to total blood culture collections from each unit by month. Conclusion Evaluation revealed that nursing staff with less training in blood culture collection, particularly in the use of diversion devices, were the primary staff collecting blood cultures in the inpatient COVID units. This difference in training is felt to be the primary driver of the increase in contaminants in the inpatient COVID units. The marked increase in contaminations highlights the difficulty of maintaining quality control processes during an evolving pandemic and the importance of ongoing efforts to improve the quality of care. These findings demonstrate the importance of training and of routine use of procedures to reduce contaminations, even during a pandemic. Disclosures All Authors: No reported disclosures


Author(s):  
Vinitha Alex ◽  
Trusha Nana ◽  
Vindana Chibabhai

Abstract Background: Community-onset bloodstream infection (CO-BSI) is associated with substantial morbidity and mortality. Knowledge of locally prevalent pathogens and antimicrobial susceptibility patterns can promptly guide appropriate empiric therapy and improve outcomes. Objectives: We sought to determine the epidemiology of CO-BSI, the blood culture positivity rate and the contamination rate. We also sought to establish appropriateness of current empiric antimicrobial therapy practices. Methods: We retrospectively analyzed blood cultures taken from January 2015 to December 2019 at the emergency departments (EDs) of a tertiary-care academic hospital in South Africa using extracted laboratory data. Results: The overall positivity rate of blood cultures taken at the EDs was 15% (95% confidence interval [CI], 0.15–0.16) and the contamination rate was 7% (95% CI, 0.06–0.07). Gram-positive bacteria predominated in the pediatric cohort: neonates, 52 (54%) of 96; infants, 57 (52%) of 109; older children, 63 (61%) of 103. Methicillin-susceptible Staphylococcus aureus was the predominant pathogen among older children: 30 (35%) of 85. Escherichia coli was the most common pathogen isolated among adults and the elderly: 225 (21%) of 1,060 and 62 (29%) of 214, respectively. Among neonates, the susceptibility of E. coli and Klebsiella pneumoniae to the combination of ampicillin and gentamicin was 17 (68%) of 25. Among adults, the susceptibility of the 5 most common pathogens to amoxicillin-clavulanate was 426 (78%) of 546 and their susceptibility to ceftriaxone was 481 (85%) of 565 (P = .20). The prevalence of methicillin-resistant S. aureus, extended-spectrum β-lactamase–producing and carbapenem-resistant Enterobacterales were low among all age groups. Conclusions: Review of blood culture collection techniques is warranted to reduce the contamination rate. 
High rates of resistance to currently prescribed empiric antimicrobial agents for CO-BSI warrant a re-evaluation of local guidelines.
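The proportion-with-CI reporting above (e.g., a 7% contamination rate with 95% CI 0.06–0.07) is typically produced with a binomial interval such as the Wilson score interval. A minimal sketch, assuming hypothetical counts chosen to illustrate a rate near the reported 7%:

```python
# Wilson score 95% CI for a proportion, as commonly used for blood
# culture positivity and contamination rates. The counts below are
# hypothetical illustrations, not the study's data.
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score interval for x successes out of n trials."""
    p = x / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Hypothetical: 700 contaminated cultures out of 10,000 drawn
lo, hi = wilson_ci(700, 10000)
print(f"7.0% (95% CI {lo:.1%}-{hi:.1%})")
```

Unlike the naive normal approximation, the Wilson interval behaves well for small counts and for proportions near 0 or 1, which matters for low contamination rates.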

