Management of Gram-Negative Bloodstream Infections in the Era of Rapid Diagnostic Testing: Impact With and Without Antibiotic Stewardship

2020 ◽  
Vol 7 (10) ◽  
Author(s):  
Kimberly C Claeys ◽  
Emily L Heil ◽  
Stephanie Hitchcock ◽  
J Kristie Johnson ◽  
Surbhi Leekha

Abstract Background Verigene Blood-Culture Gram-Negative is a rapid diagnostic test (RDT) that detects gram-negatives (GNs) and resistance within hours of Gram stain. The majority of the data support the use of RDTs with antimicrobial stewardship (AMS) intervention in gram-positive bloodstream infection (BSI). Less is known about GN BSI. Methods This was a retrospective quasi-experimental (nonrandomized) study of adult patients with RDT-target GN BSI comparing patients pre-RDT/AMS vs post-RDT/pre-AMS vs post-RDT/AMS. Optimal therapy was defined as appropriate coverage with the narrowest spectrum, accounting for source and co-infecting organisms. Time to optimal therapy was analyzed using Kaplan-Meier and multivariable Cox proportional hazards regression. Results Eight hundred thirty-two patients were included: 237 pre-RDT/AMS vs 308 post-RDT/pre-AMS vs 237 post-RDT/AMS, respectively. The proportion of patients on optimal antibiotic therapy increased with each intervention (66.5% vs 78.9% vs 83.2%; P < .0001). Time to optimal therapy (interquartile range) decreased with the introduction of RDT: 47 (7.9–67.7) hours vs 24.9 (12.4–55.2) hours vs 26.5 (10.3–66.5) hours (P = .09). In multivariable modeling, infectious diseases (ID) consult was an effect modifier. Within the ID consult stratum, controlling for source and ICU stay, both post-RDT/pre-AMS (adjusted hazard ratio [aHR], 1.34; 95% CI, 1.04–1.72) and post-RDT/AMS (aHR, 1.28; 95% CI, 1.01–1.64) improved time to optimal therapy compared with the pre-RDT/AMS group. This effect was not seen in the stratum without ID consult. Conclusions With the introduction of RDT and AMS, both the proportion of patients on optimal antibiotic therapy and the time to optimal therapy improved, especially among those with an existing ID consult. This study highlights the beneficial role of RDTs in GN BSI.
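Several of the abstracts collected here analyze time-to-event outcomes with the Kaplan-Meier method. As a minimal illustration of the product-limit estimator itself (a pure-Python sketch on hypothetical data, not any study's analysis code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimator.

    times  : observation time for each subject (e.g., hours to optimal therapy)
    events : 1 if the event was observed, 0 if the subject was censored
    Returns a list of (event_time, survival_probability) steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        leaving = [e for tt, e in data[i:] if tt == t]  # all subjects leaving at time t
        deaths = sum(leaving)
        if deaths:
            survival *= 1 - deaths / n_at_risk          # product-limit step
            curve.append((t, survival))
        n_at_risk -= len(leaving)
        i += len(leaving)
    return curve

# Hypothetical data: 5 subjects, times in hours; 0 marks a censored observation.
# Survival drops at t = 10, 20, and 30.
curve = kaplan_meier([10, 20, 20, 30, 40], [1, 1, 0, 1, 0])
```

In practice one would use a library implementation (e.g., lifelines in Python or survfit in R), which also provides confidence intervals and the log-rank test used in these studies.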

2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S145-S145
Author(s):  
Rajiv G Amipara ◽  
Hana R Winders ◽  
Julie Ann Justo ◽  
P B Bookstaver ◽  
Joseph Kohn ◽  
...  

Abstract Background The importance of follow-up blood cultures (FUBC) for Staphylococcus aureus bloodstream infections (BSI) is well established, but the role of FUBC in gram-negative BSI remains controversial. This retrospective cohort study examined the association between obtaining FUBC and mortality in patients with gram-negative BSI. Methods Adults with first episodes of community-onset monomicrobial BSI due to gram-negative bacilli hospitalized at Prisma Health-Midlands hospitals in Columbia, South Carolina, USA from January 1, 2010 to June 30, 2015 were identified. Patients who died or were discharged from hospital within 72 hours of collection of the index blood culture were excluded to minimize the impact of survival and selection biases on results. FUBC were defined as repeat blood cultures obtained between 24 and 96 hours after the initial positive blood culture. A Cox proportional hazards regression model was used to examine the association between obtaining FUBC and 28-day all-cause mortality. Results Among 766 patients with gram-negative BSI, 219 (28.6%) had FUBC obtained, and 15 of 219 (6.8%) FUBC were persistently positive. Overall, median age was 67 years, 438 (57%) were women, 457 (60%) had a urinary source of infection, and 426 (56%) had BSI due to Escherichia coli. Mortality was significantly lower in patients who had FUBC obtained than in those who did not (6.3% vs. 11.7%, log-rank p=0.03). Obtaining FUBC was independently associated with reduced mortality (hazard ratio [HR] 0.49, 95% CI: 0.25–0.90) after adjustment for age (HR 1.35 per decade, 95% CI: 1.13–1.61), cancer (HR 5.90, 95% CI: 3.53–9.84), Pitt bacteremia score (HR 1.38 per point, 95% CI: 1.26–1.50), and inappropriate empirical antimicrobial therapy (HR 2.37, 95% CI: 1.17–4.39). Conclusion Obtaining FUBC was associated with improved survival in hospitalized patients with gram-negative BSI.
These observations are consistent with the results of recent publications from Italy and North Carolina supporting the utilization of FUBC in the management of gram-negative BSI. Disclosures Julie Ann Justo, PharmD, MS, BCPS-AQ ID: bioMerieux (Speaker's Bureau), TRC Healthcare (Speaker's Bureau)


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S98-S99
Author(s):  
Kimberly C Claeys ◽  
Emily Heil ◽  
Nora Loughry ◽  
Sanjay Chainani ◽  
J Kristie Johnson ◽  
...  

Abstract Background Verigene Blood Culture Gram-Negative (VBC-GN) is a rapid diagnostic test (RDT) that can detect key GNs and resistance within hours of Gram stain. Numerous studies have shown that RDTs in BSIs improve clinical outcomes, particularly with active antimicrobial stewardship (AMS) intervention. Little is known regarding outcomes in GN BSI without vs. with AMS intervention. Methods A retrospective three-part quasi-experimental study of adult patients with GN BSI from December 2014 to April 2018. VBC-GN was introduced in September 2015 and AMS review was implemented in October 2017. Antibiotics were considered appropriate if active in vitro against the isolated GN. Optimal antibiotics were not overly broad and accounted for resistance, source of infection, and other infecting organisms. Comparisons were made using Chi-squared tests for nominal variables and Kaplan–Meier with log-rank for time-to-event analyses. Results In total, 772 patients met inclusion criteria. The most common source was urinary (30.1%) and E. coli was the most common GN (37.9%). Infectious Disease consults increased with each group (50.6% vs. 67.9% vs. 81.8%, P < 0.001). More patients pre-RDT (37.4%) and RDT + AMS (35.6%) were critically ill compared with RDT only (24.6%), P = 0.001. Optimal therapy was achieved in more patients in the RDT-only (79%) and RDT + AMS (86%) groups compared with pre-RDT (66%), P < 0.001. More patients in the pre-RDT group (44.7%) were appropriately de-escalated compared with RDT only (31.6%) and RDT + AMS (38.7%), P = 0.026. Appropriate escalation occurred most often in the RDT-only group (39.3%) vs. pre-RDT (15.2%) and RDT + AMS (14.2%), P = 0.019. Median post-BSI length of stay (8.2 vs. 7.1 vs. 8.5 days, P = 0.226) and inpatient mortality (10.8% vs. 14.3% vs. 11.4%, P = 0.493) were similar. Conclusion With the implementation of VBC-GN RDT there was a significantly decreased time to optimal therapy, mainly driven by necessary antibiotic escalation.
Antibiotic de-escalation remained a challenge, even with active AMS review. Disclosures All authors: No reported disclosures.


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S108-S108
Author(s):  
Napadol Siritip ◽  
Arkom Nongnuch ◽  
Thanate Dajsakdipon ◽  
Charat Thongprayoon ◽  
Wisit Cheungpasitporn ◽  
...  

Abstract Background Bloodstream infections (BSIs) are an important cause of morbidity and mortality among kidney transplant (KT) recipients, especially within the first year. We investigated the epidemiology, risk factors, and outcomes of this infection following KT. Methods We conducted a retrospective study of all adult KT recipients who developed BSI within the first year after KT from January to December 2016 at a single large referral transplant center in Bangkok, Thailand. The cumulative incidence of BSI after transplant was estimated with Kaplan–Meier methodology. Clinical characteristics, microbiological data, and outcomes were extracted. Risk factors for BSI were assessed with multivariate Cox proportional hazards models. Results A total of 26 (15.2%) episodes of BSI occurred in 171 KT recipients; 58.5% of recipients were men and the mean ± SD age was 43 ± 12 years. The majority received a deceased-donor allograft (58.5%) and induction therapy (59%). Kaplan–Meier estimates for BSI were 12.3% at 3 months, 13.5% at 6 months, and 15.2% at 12 months after KT. Gram-negative bacteria were responsible for 92% of BSI, with Escherichia coli the most common causative pathogen (65%); 71% of these isolates produced extended-spectrum β-lactamase enzymes. The genitourinary tract was the predominant source of BSIs (85%). In multivariate analysis, second kidney transplantation [HR, 4.55; 95% CI, 1.24–16.79 (P = 0.02)] and receipt of induction therapy [HR, 3.05; 95% CI, 1.15–8.10 (P < 0.03)] were associated with BSI. One patient (4%) developed acute cellular rejection and one patient (4%) died from septic shock. Conclusion One-sixth of KT recipients developed gram-negative bloodstream infection within the first year after KT, especially those who underwent a second transplantation or received induction therapy. Disclosures All authors: No reported disclosures.


2015 ◽  
Vol 2015 ◽  
pp. 1-10 ◽  
Author(s):  
Tong-min Xue ◽  
Li-de Tao ◽  
Miao Zhang ◽  
Jie Zhang ◽  
Xia Liu ◽  
...  

miRNA-20b has been shown to be aberrantly expressed in several tumor types. However, the clinical significance of miRNA-20b in the prognosis of patients with hepatocellular carcinoma (HCC) is poorly understood, and the exact role of miRNA-20b in HCC remains unclear. The aim of the present study was to investigate the association of miR-20b expression with clinicopathological characteristics and overall survival of HCC patients, analyzed by Kaplan-Meier analysis and Cox proportional hazards regression models. In addition, HIF-1α and VEGF were confirmed as targets of miR-20b. We found not only that miR-20b regulates HIF-1α and VEGF under normal conditions, but also that miR-20b itself is regulated under hypoxia. This mechanism would help tumor cells adapt to different environments, thus promoting tumor invasion and development. Overall, the study suggests that miR-20b, HIF-1α, and VEGF may serve as potential therapeutic targets for hepatocellular carcinoma.


2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
S Rosero ◽  
P Jones ◽  
I Goldenberg ◽  
W Zareba ◽  
K Stein ◽  
...  

Abstract Background The role of cardiovascular implantable electronic device (CIED)-derived activity to predict inappropriate implantable cardioverter-defibrillator (ICD) therapy is not known. The Multicenter Automatic Defibrillator Implantation Trial – Reduce Inappropriate Therapy (MADIT-RIT) enrolled 1,500 patients with a contemporary indication for an ICD or a CRT-D. We aimed to identify whether activity, as a digital biomarker, predicted inappropriate therapy. Methods In the 1,500 patients enrolled in MADIT-RIT, CIED-derived patient activity was acquired daily. CIED-derived activity was averaged over the first 30 days following randomization and used in this study to predict inappropriate therapy after day 30. Kaplan-Meier survival analysis and multivariate Cox proportional hazards regression models were used to evaluate first inappropriate therapy by 30-day CIED-derived patient activity quintiles, and by 30-day device-derived patient activity as a continuous measurement. Results Activity data were available for 1,463 patients (90%); 135 patients received at least one inappropriate therapy during the post-30-day follow-up period. Patients in the highest quintile (Q5) of CIED-derived activity (more active) were younger, more often male, and more likely to have had a prior ablation of an atrial arrhythmia. Patients in the highest quintile of 30-day CIED-derived median activity had the highest risk of receiving inappropriate therapy: 21% at 2 years, as compared with 7–11% in the other four quintiles (Figure, p<0.001 for the overall duration). Patients with the highest level of 30-day median patient activity (Q5) had a 1.75 times higher risk of any inappropriate therapy compared with lower levels of activity, Q1–Q4 (HR=1.75, 95% CI: 1.23–2.50, p<0.002). Each 10% increase in CIED-derived 30-day median patient activity was associated with a significant 73% increase in the risk of receiving inappropriate therapy (HR=1.73, 95% CI: 1.17–2.54, p=0.005).
Patients in the highest quintile for activity had a 68% increase in the risk of SVT excluding atrial fibrillation, atrial flutter, or atrial tachycardia (HR=1.69, 95% CI: 1.26–2.25, p=0.004), despite 96% receiving beta-blocker medications (Figure: Inappropriate ICD Therapies by Activity). Conclusions CIED-derived 30-day median patient activity predicted subsequent inappropriate therapy in ICD and CRT-D patients enrolled in MADIT-RIT. Patients with high levels of 30-day CIED-derived median patient activity were at a significantly higher risk of receiving inappropriate therapy. Activity, as a digital biomarker, may have utility in predicting and managing the risk of inappropriate therapy in this population. Acknowledgement/Funding Boston Scientific
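The quintile stratification of 30-day activity described above can be sketched as follows (an illustrative rank-based assignment on invented values, not the trial's code):

```python
def assign_quintiles(values, k=5):
    """Assign each value a quantile group 1..k by rank (ties broken by input order)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    groups = [0] * len(values)
    for rank, idx in enumerate(order):
        groups[idx] = rank * k // len(values) + 1  # equal-sized rank buckets
    return groups

# Hypothetical 30-day median activity values (hours of activity per day)
activity = [1.2, 3.4, 0.5, 2.2, 5.0, 4.1, 2.8, 0.9, 3.9, 1.7]
quintile = assign_quintiles(activity)  # patients in group 5 are the most active
```

Once each patient has a quintile label, a Kaplan-Meier curve per group (or a Cox model with Q5 vs. Q1-Q4 as a covariate) reproduces the kind of comparison reported here.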


Cancers ◽  
2019 ◽  
Vol 11 (6) ◽  
pp. 879 ◽  
Author(s):  
Mark Antkowiak ◽  
Ahmed Gabr ◽  
Arighno Das ◽  
Rehan Ali ◽  
Laura Kulik ◽  
...  

Introduction: We compared the efficacy of the ALBI (albumin–bilirubin) score to the established Child–Pugh (CP) grade in hepatocellular carcinoma (HCC) patients treated with yttrium-90 radioembolization (Y90). We further assessed the individual contributions of albumin and bilirubin to survival prediction. Methods: 1000 consecutive HCC patients treated with Y90 were included. Overall survival (OS) was assessed using Kaplan-Meier analysis. Sub-stratification analyses were performed using CP and ALBI and in subgroups determined by United Network for Organ Sharing (UNOS) or Barcelona Clinic Liver Cancer (BCLC) staging. The independent impact (hazard ratio (HR)) of ALBI, CP, albumin, and bilirubin on survival was assessed using Cox proportional hazards analysis. Results: Median OS for ALBI 1, 2, and 3 grades was 46.7, 19.1, and 8.8 months, respectively. The HR for death for ALBI 2 vs. ALBI 1 was 3.39 (1.75–6.57); ALBI 3 vs. ALBI 1 was 7.58 (3.89–14.79); and the c-index was 0.623. Median OS for CP A, B, and C was 21.7, 11.3, and 6.0 months, respectively. The HR for death for CP B vs. CP A was 2.04 (1.71–2.43); CP C vs. CP A was 3.27 (2.08–5.14); and the c-index was 0.616. Stratified OS showed unique prognostic groups identified by ALBI within CP-B and CP-C. Median OS for albumin grades 1, 2, and 3 was 46.0, 17.1, and 9.1 months, respectively. Median OS for bilirubin grades 1, 2, and 3 was 15.6, 21.0, and 5.8 months, respectively. The HR for death for albumin 2 vs. 1 was 2.48 (1.81–3.41); albumin 3 vs. 1 was 4.74 (3.44–6.54); and the c-index was 0.640. The HR for death for bilirubin 2 vs. 1 was 1.09 (0.82–1.44); bilirubin 3 vs. 1 was 2.37 (1.66–3.40); and the c-index was 0.533. Conclusions: ALBI outperforms CP in survival prognosis in Y90 treated patients. On sub-analyses, serum albumin (not bilirubin) appears to be the main driver of survival prediction.
Our study supports the prognostic ability of ALBI and may suggest a role of albumin alone as a biomarker for patients with HCC.
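For reference, the ALBI score compared here is computed from serum bilirubin and albumin alone. A sketch of the published formula (bilirubin in µmol/L, albumin in g/L; coefficients and grade cut-offs are from the original ALBI derivation literature, not stated in this abstract):

```python
import math

def albi_score(bilirubin_umol_per_l, albumin_g_per_l):
    """ALBI score = 0.66 * log10(bilirubin, umol/L) - 0.085 * albumin (g/L)."""
    return 0.66 * math.log10(bilirubin_umol_per_l) - 0.085 * albumin_g_per_l

def albi_grade(score):
    """Grade 1: score <= -2.60; grade 2: -2.60 < score <= -1.39; grade 3: > -1.39."""
    if score <= -2.60:
        return 1
    if score <= -1.39:
        return 2
    return 3

# Example: bilirubin 10 umol/L, albumin 42 g/L
s = albi_score(10, 42)  # 0.66 * 1 - 0.085 * 42 = -2.91, i.e., grade 1
```

Because the score is a continuous linear combination, it can separate prognostic groups within a single Child-Pugh class, which is what the stratified OS analysis above exploits.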


Author(s):  
Alain Putot ◽  
Kevin Bouiller ◽  
Caroline Laborde ◽  
Marine Gilis ◽  
Amélie Févre ◽  
...  

Abstract Background It is uncertain whether antibiotic therapy should be started in SARS-CoV-2 pneumonia. We aimed to investigate the association between early antibiotic therapy and the risk of in-hospital mortality in older patients. Methods We performed a retrospective international cohort study (ANTIBIOVID) in five COVID-19 geriatric units in France and Switzerland. Among 1,357 consecutive patients aged 75 or more hospitalised and testing positive for SARS-CoV-2, 1,072 had radiologically confirmed pneumonia, of whom 914 were still alive and hospitalized at 48 hours. To adjust for confounders, a propensity score for treatment was created, and stabilized inverse probability of treatment weighting (SIPTW) was applied. To assess the association between early antibiotic therapy and in-hospital 30-day mortality, SIPTW-adjusted Kaplan-Meier and Cox proportional hazards regression analyses were performed. Results Of the 914 patients with SARS-CoV-2 pneumonia (median age 86 years), 428 (46.8%) received antibiotics in the first 48 hours after diagnosis. Among these patients, 147 (34.3%) died in hospital within one month, vs 118 patients (24.3%) with no early antibiotic treatment. After SIPTW, early antibiotic treatment was not significantly associated with mortality (adjusted hazard ratio, 1.23; 95% CI, 0.92-1.63; P = .160). Microbiologically confirmed superinfections occurred rarely in both groups (bacterial pneumonia: 2.5% vs 1.5%, P = .220; bloodstream infection: 8.2% vs 5.2%, P = .120; Clostridioides difficile colitis: 2.4% vs 1.0%, P = .222). Conclusions In a large multicentre cohort of older inpatients with SARS-CoV-2 pneumonia, early antibiotic treatment did not appear to be associated with an improved prognosis.
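The stabilized weighting (SIPTW) used here reweights each patient by the marginal treatment probability divided by the individual propensity score. A minimal sketch, assuming the propensity scores have already been estimated (e.g., by logistic regression on the confounders); the toy numbers below are invented:

```python
def stabilized_iptw(treated, propensity):
    """Stabilized IPTW weights: P(T=t) / P(T=t | X).

    treated    : 1 if the patient received early antibiotics, else 0
    propensity : estimated P(treated | covariates) for each patient
    """
    p_marginal = sum(treated) / len(treated)  # numerator stabilizes the weights
    return [
        p_marginal / ps if t else (1 - p_marginal) / (1 - ps)
        for t, ps in zip(treated, propensity)
    ]

# Hypothetical toy cohort: two treated, two untreated patients
weights = stabilized_iptw([1, 1, 0, 0], [0.8, 0.5, 0.4, 0.2])
```

The stabilized numerator keeps the weights' mean near 1, which makes the weighted Kaplan-Meier and Cox analyses less sensitive to extreme propensity scores than unstabilized weighting.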


2021 ◽  
Author(s):  
Quanhui Liao ◽  
Shaoxin Shen ◽  
Xijing Ma ◽  
Guisen Dai ◽  
Geng Lu ◽  
...  

Abstract Background and objectives The purpose of the present study was to comprehensively analyze the prognostic value of adjuvant chemotherapy (CT) in stage IV HCC patients. Methods HCC patients were identified in the Surveillance, Epidemiology, and End Results (SEER) database. The effects of adjuvant CT on HCC patients were evaluated by Kaplan–Meier curves and multivariable Cox proportional hazards analyses. Results A total of 490 HCC patients were enrolled in this study, and the median follow-up time was 2.69 months (range: 0–102 months). Of these, 168 (34.3%) HCC patients received adjuvant CT; 287 (58.6%) received local destruction, 125 (25.5%) partial resection, and 78 (15.9%) liver transplantation. Multivariate analysis showed that chemotherapy (P < 0.001), surgery (P < 0.001), year at diagnosis (P = 0.004), grade (P < 0.001), and fibrosis score (P = 0.039) were independent factors for cancer-specific survival (CSS), and that chemotherapy (P < 0.001), surgery (P < 0.001), year at diagnosis (P = 0.005), and grade (P < 0.001) were independent factors for overall survival (OS). Survival curves confirmed that patients receiving adjuvant CT achieved increased OS and CSS (P < 0.05). Conclusions Our results suggest that, compared with surgery alone, stage IV HCC patients may benefit from adjuvant chemotherapy. High-quality prospective trials are necessary to further confirm these results.


2015 ◽  
Vol 25 (6) ◽  
pp. 1031-1036 ◽  
Author(s):  
Tolga Tasci ◽  
Alper Karalok ◽  
Salih Taskin ◽  
Isin Ureyen ◽  
Gunsu Kimyon ◽  
...  

Introduction The role of lymphadenectomy in the management of uterine leiomyosarcoma (LMS) is controversial. We aimed to identify whether lymph node dissection (LND) has any survival benefit in uterine LMS. Methods Data of 95 patients with histologically proven uterine LMS from 2 tertiary centers (1993 through 2009) were retrospectively analyzed. Kaplan-Meier and Cox proportional hazards regression models were used for analyses. Results Mean age was 51.5 years. Thirty-six (37.9%) underwent LND. The median lymph node count was 54. Eight (22.2%) patients had lymphatic metastasis. Median follow-up was 26 months. Sixty-two (65%) patients had recurrence and 48 (50.5%) died. Median disease-free survival (DFS) was 19 months for both the LND and no-LND groups, and median overall survival (OS) was 29 and 26 months, respectively (P = 0.4). Five-year DFS was 35.9% vs 26.8% (P = 0.4), and 5-year OS was 45.4% vs 43.8% (P = 0.22) for the groups. Multivariate analyses did not reveal a single independent prognostic factor with respect to DFS or OS. Conclusion The higher rate of lymph node metastasis in patients with extrauterine disease indicated the importance of LND in LMS. However, a survival benefit of lymphadenectomy could not be shown.


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
I.D Poveda Pinedo ◽  
I Marco Clement ◽  
O Gonzalez ◽  
I Ponz ◽  
A.M Iniesta ◽  
...  

Abstract Background Parameters such as peak VO2, VE/VCO2 slope, and OUES have previously been described as prognostic in heart failure (HF). The aim of this study was to identify further prognostic factors of cardiopulmonary exercise testing (CPET) in HF patients. Methods A retrospective analysis of HF patients who underwent CPET from January to November 2019 in a single centre was performed. The PETCO2 gradient was defined as the difference between final PETCO2 and baseline PETCO2. HF events were defined as decompensated HF requiring hospital admission or IV diuretics, or decompensated HF resulting in death. Results A total of 64 HF patients were assessed by CPET; HF events occurred in 8 (12.5%) patients. Baseline characteristics are shown in table 1. Patients having HF events had a negative PETCO2 gradient while patients without events showed a positive PETCO2 gradient (−1.5 [IQR −4.8, 2.3] vs 3 [IQR 1, 5] mmHg; p=0.004). A multivariate Cox proportional-hazards regression analysis revealed that the PETCO2 gradient was an independent predictor of HF events (HR 0.74, 95% CI [0.61–0.89]; p=0.002). Kaplan-Meier curves showed a significantly higher incidence of HF events in patients having negative gradients, p=0.002 (figure 1: time to first HF event). Conclusion The PETCO2 gradient was demonstrated to be a prognostic parameter of CPET in HF patients in our study. Patients with negative gradients had worse outcomes, having more HF events. Funding Acknowledgement Type of funding source: None
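As a small illustration of the definitions above: the gradient is simply final minus baseline PETCO2, and the reported HR of 0.74 can be read, assuming the usual multiplicative Cox model, as a hazard multiplier per mmHg of gradient relative to a patient with zero gradient. This is an illustrative reading of the abstract's numbers, and all input values below are hypothetical:

```python
def petco2_gradient(baseline_mmhg, final_mmhg):
    """PETCO2 gradient as defined in the study: final minus baseline PETCO2."""
    return final_mmhg - baseline_mmhg

def relative_hazard(gradient_mmhg, hr_per_mmhg=0.74):
    """Hazard relative to a zero-gradient patient, assuming the reported HR
    applies multiplicatively per mmHg (illustrative, not the study's model code)."""
    return hr_per_mmhg ** gradient_mmhg

g = petco2_gradient(33, 36)   # +3 mmHg: PETCO2 rises during exercise
rh = relative_hazard(g)       # each positive mmHg lowers the modeled hazard
```

A negative gradient (PETCO2 falling during exercise) gives a relative hazard above 1, consistent with the worse outcomes reported for those patients.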

