Puncture Injuries Due to Needles Removed From Intravenous Lines: Should the Source Patient Routinely be Tested for Bloodborne Infections?

1993 ◽  
Vol 14 (6) ◽  
pp. 325-330 ◽  
Author(s):  
Farrin A. Manian ◽  
Lynn Meyer ◽  
Joan Jenne

Abstract Objective: To better assess the risk of exposure to bloodborne pathogens following puncture injuries due to needles removed from intravenous (IV) lines. Setting: Tertiary care community medical center. Patients: A convenience sample of hospitalized patients requiring IV piggyback medications. Methods: Examination of 501 IV ports of peripheral lines, heparin locks, and central venous lines for visible blood, and testing of the residual fluid in the needles removed from these ports for occult blood using guaiac-impregnated paper. Results: The proximal ports of central venous lines and heparin locks were statistically more likely to contain visible blood than proximal and distal ports of peripheral lines (17% and 20% versus 1% and 3%, respectively; P < .05). Similarly, needles removed from proximal ports of central venous lines and heparin locks were statistically more likely to contain occult blood than those from peripheral lines (11% and 14% versus 2%, respectively; P < .05). Only two needles removed from IV lines without visible blood contained occult blood: one from the proximal port of a central line and another from a heparin lock. None of the needles from peripheral lines without visible blood contained occult blood. Estimation of the risk of transmission of hepatitis B and C and human immunodeficiency virus (HIV) following injury by needles from various IV lines revealed that injury due to needles removed from peripheral IV lines and from distal ports of central lines without visible blood was associated with "near zero" risk of transmission of these bloodborne infections at our medical center. Conclusions: Routine serological testing of source patients is not necessary at our medical center for injuries due to needles removed from peripheral IV lines and from distal ports of central lines without visible blood. Conversely, because of the relatively high rate of occult blood in needles removed from proximal ports of central venous lines and heparin locks, puncture injuries due to these needles are considered significant and managed accordingly.
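The risk estimate underlying the conclusion above can be framed as a product of probabilities. A minimal sketch follows, assuming commonly cited per-exposure transmission rates for percutaneous injuries (roughly 0.3% for HIV) and a hypothetical source-patient seroprevalence; only the occult-blood rates come from the study itself:

```python
# Sketch of the transmission-risk estimate: the chance that one puncture
# injury transmits a bloodborne infection is approximated as
#   P(needle carries blood) x P(source infected) x P(transmission per exposure).

def transmission_risk(p_blood, p_source_infected, p_transmit):
    """Expected probability that a single needlestick transmits infection."""
    return p_blood * p_source_infected * p_transmit

# Needle from a heparin lock (14% occult-blood rate reported above), assuming a
# HYPOTHETICAL 1% source-patient HIV seroprevalence and the commonly cited
# ~0.3% per-exposure HIV transmission rate for percutaneous injuries.
risk_hiv = transmission_risk(0.14, 0.01, 0.003)
print(f"heparin lock, HIV: {risk_hiv:.1e} per injury")

# Needle from a peripheral line without visible blood (0% occult blood observed):
risk_peripheral = transmission_risk(0.0, 0.01, 0.003)
print(f"peripheral line, no visible blood: {risk_peripheral:.1e} per injury")
```

With zero observed occult blood in the peripheral-line needles, the estimated risk collapses to the "near zero" figure the authors describe.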

2020 ◽  
Vol 41 (S1) ◽  
pp. s142-s143
Author(s):  
Priya Sampathkumar ◽  
Kyle Rodino ◽  
Stacy (Tram) Ung

Background: Blood cultures (BCs) are part of the evaluation of hospitalized patients with fever. Patients with central lines in place frequently have blood samples for culture drawn through those lines. We sought to assess blood-culturing practices at our institution. Methods: Retrospective review of BCs performed in hospitalized patients over a 12-month period (August 2018–July 2019) at an academic, tertiary-care center with 1,297 licensed beds and >62,000 admissions a year. A specialized phlebotomy team draws all peripheral blood samples; blood samples from central lines are obtained by the patient's nurse. Results: Overall, 35,121 BCs were performed, for an incidence rate of 106 BCs per 1,000 patient-days, or 566 BCs per 1,000 admissions. Most blood samples (67%) were collected via peripheral venipuncture. We detected significant variation in culturing rates and in the proportion of blood samples obtained through central lines among collecting units (Table 1). Overall, the BC contamination rate was 1.6%. Blood samples obtained through a central line had a higher contamination rate (2.2%) than samples obtained through peripheral venipuncture (1.3%; P < .0001). BC rates were highest in intensive care units (ICUs) compared with other types of patient care units (Table 1). The BC positivity rate was significantly lower in ICUs (8.8%) than in hematology-oncology (10%; HR, 0.88; CI, 0.80–0.96; P = .006), general medicine (10%; HR, 0.88; CI, 0.80–0.97; P = .013), and pediatrics (12%; HR, 0.74; CI, 0.59–0.92; P = .008). The ICUs had the lowest rate of BC contamination, at 1.3%. Conclusions: Blood samples obtained through central lines for culture are more likely to be contaminated than peripherally drawn samples. Despite a relatively high rate of line-drawn samples for culture, ICUs had the lowest BC contamination rate, possibly reflecting ICU nurses' familiarity with line draws. 
Line draws for culture were most frequent in pediatrics and hematology-oncology, and these units had correspondingly higher contamination rates. This information will be used to inform institutional guidelines on blood culturing and to identify ways to minimize BC contamination, which often results in additional testing and/or unnecessary antimicrobial use. Funding: None. Disclosures: Consulting fee, Merck (Priya Sampathkumar).
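The reported rates can be sanity-checked from the published totals. In the sketch below, the per-route counts are reconstructed from the abstract's percentages (67% peripheral; contamination 1.3% versus 2.2%), so they are approximations rather than the study's raw data:

```python
from math import erfc, sqrt

# Back-of-envelope check of the blood-culture rates reported above.
total_bc = 35_121
n_peripheral = round(total_bc * 0.67)             # ~67% drawn peripherally
n_line = total_bc - n_peripheral                  # remainder drawn through lines
contam_peripheral = round(n_peripheral * 0.013)   # 1.3% contamination
contam_line = round(n_line * 0.022)               # 2.2% contamination

# The reported 106 BCs per 1,000 patient-days implies the denominator.
patient_days = round(total_bc / 106 * 1000)
rate_per_1000_pd = total_bc / patient_days * 1000
print(f"{rate_per_1000_pd:.0f} BCs per 1,000 patient-days")

# Two-proportion z-test: line-draw vs peripheral contamination.
p1, p2 = contam_line / n_line, contam_peripheral / n_peripheral
p_pool = (contam_line + contam_peripheral) / total_bc
se = sqrt(p_pool * (1 - p_pool) * (1 / n_line + 1 / n_peripheral))
z = (p1 - p2) / se
p_value = erfc(z / sqrt(2))   # two-sided normal tail
print(f"z = {z:.1f}, P = {p_value:.1g}")   # consistent with the reported P < .0001
```

The reconstructed counts yield a z-statistic well beyond the P < .0001 threshold, matching the significance the abstract reports.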


2014 ◽  
Vol 35 (9) ◽  
pp. 1140-1146 ◽  
Author(s):  
Cathleen Concannon ◽  
Edwin van Wijngaarden ◽  
Vanessa Stevens ◽  
Ghinwa Dumyati

Objective. The current central line–associated bloodstream infection (CLABSI) surveillance rate calculation does not account for multiple concurrent central venous catheters (CVCs). The presence of multiple CVCs creates more points of entry into the bloodstream, potentially increasing CLABSI risk. Multiple CVCs may be used in sicker patients, making it difficult to separate the relative contributions of multiple CVCs and comorbidities to CLABSI risk. We explored the relative impact of multiple CVCs, patient comorbidities, and disease severity on the risk of CLABSI. Design. Case-control study. Setting. A total of 197 case patients and 201 control subjects with a CVC inserted during hospitalization at a tertiary care academic medical center from January 1, 2008, to December 31, 2010. Methods. Multiple CVCs was the exposure of interest; the primary outcome was CLABSI. Multivariable logistic regression was conducted to estimate odds ratios (ORs) and 95% confidence intervals (CIs) describing the association between CLABSI and multiple CVCs, with and without controlling for Acute Physiology and Chronic Health Evaluation (APACHE) II and Charlson comorbidity index (CCI) scores as measures of disease severity and patient comorbidities, respectively. Results. Patients with multiple CVCs (n = 78) had a 4.2 (95% CI, 2.2–8.4) times greater risk of CLABSI than patients with 1 CVC after adjusting for CLABSI risk factors. When APACHE II and CCI scores were included, multiple CVCs remained an independent risk factor for CLABSI (OR, 3.4 [95% CI, 1.7–6.9]). Conclusions. The presence of multiple CVCs is an independent risk factor for CLABSI even after adjusting for severity of illness. Adjustment for this risk may be necessary to accurately compare rates between hospitals. Infect Control Hosp Epidemiol 2014;35(9):1140-1146
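For illustration, the crude (unadjusted) odds ratio behind a case-control design like this one comes from a 2×2 table with a Wald confidence interval. The cell counts below are hypothetical, since the abstract reports only the margins (197 cases, 201 controls, 78 patients with multiple CVCs), and the published OR of 4.2 is a multivariable estimate that a crude table cannot reproduce:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# HYPOTHETICAL split of the 78 multiple-CVC patients between cases and controls;
# these numbers are illustrative only and do not appear in the abstract.
or_, lo, hi = odds_ratio_ci(a=58, b=139, c=20, d=181)
print(f"crude OR = {or_:.1f} (95% CI, {lo:.1f}-{hi:.1f})")
```

Multivariable logistic regression then adjusts this crude association for covariates such as APACHE II and CCI scores, which is why the adjusted OR (3.4) differs from any crude estimate.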


2015 ◽  
Vol 20 (3) ◽  
pp. 179-188 ◽  
Author(s):  
Nancy Moureau ◽  
Gordon Sigl ◽  
Margaret Hill

Abstract Introduction: Establishing an effective midline program involves more than simply learning an insertion technique for a new product. Midline catheters provide a reliable vascular access option for those patients with difficult venous access who would otherwise require multiple venipunctures or the use of higher-risk central lines to maintain access. An effective midline program establishes a protocol for device selection and includes standing orders to facilitate speed to placement. Methods: Our retrospective descriptive review evaluated the successful integration of midline programs into existing vascular access bedside insertion programs in 2 acute care hospitals. The investigator reviewed a convenience sample of hospital patients. Participants in the study included vascular access team managers and team members from the sample sites. Results: The results of this 2-hospital study demonstrate successful integration of a midline program into a bedside insertion program with 0 midline-related infections since initiation. Documentation of overall central line-associated bloodstream infection rates for hospital 1 changed from 1.7/1000 catheter-days to 0.2/1000 catheter-days, reflecting a 78% reduction in infections and a projected cost avoidance of $531,570 annually. Both hospitals demonstrated reduced rates of infection following implementation of a midline program. Conclusions: Midlines have a history of lower risk for both infection and thrombosis compared with central venous devices. Although more research is needed on the more recently developed midline catheters, available evidence suggests that midlines provide a safe and reliable form of vascular access, reducing costs and the risk of infection associated with central venous catheters, especially those placed solely for patients with difficult venous access.


2020 ◽  
Vol 41 (S1) ◽  
pp. s258-s258
Author(s):  
Madhuri Tirumandas ◽  
Theresa Madaline ◽  
Gregory David Weston ◽  
Ruchika Jain ◽  
Jamie Figueredo

Background: Although central-line–associated bloodstream infections (CLABSI) in US hospitals have improved in the last decade, ~30,100 CLABSIs occur annually.1,2 Central venous catheters (CVC) carry a high risk of infection and should be limited to appropriate clinical indications.6,7 Montefiore Medical Center, a large, urban, academic medical center in the Bronx, serves a high-risk population with multiple comorbidities.8–11 Despite this, the critical care medicine (CCM) team is often consulted to place a CVC when a peripheral intravenous line (PIV) cannot be obtained by nurses or primary providers. We evaluated the volume of CCM consultation requests for avoidable CVCs and related CLABSIs. Methods: Retrospective chart review was performed for patients with CCM consultation requests for CVC placement between July and October 2019. The indication for CVC, the type of catheter inserted or recommended, and NHSN data were used to identify CLABSIs. CVCs were considered avoidable if a PIV could have been used for the stated indication and duration of therapy, with no anatomical contraindications to PIV in nonemergencies, according to the Michigan Appropriateness Guide for Intravenous Catheters (MAGIC).6 Results: Of 229 total CCM consults, 40 (18%) were requests for CVC placement; 21 consultations (9%) were requested for avoidable CVCs. Of the 40 CVC requests, 18 (45%) resulted in CVC placement by the CCM team, 4 (10%) were deferred for nonurgent PICC placement by interventional radiology, and 18 (45%) were deferred in favor of a PIV or no IV. Indications for CVC insertion included emergent chemotherapy (n = 8, 44%), dialysis (n = 3, 16%), vasopressors (n = 3, 16%), antibiotics (n = 2, 11%), and blood transfusion (n = 2, 11%). Of the 18 CVCs placed, 9 (50%) were potentially avoidable: 2 for short-term antibiotics and the rest for nonemergent indications (2 blood transfusions, 1 dialysis, 2 chemotherapy, and 2 vasopressors). 
Between July and October 2019, 6 CLABSIs occurred in CVCs placed by the CCM team; in 3 of the 6 CLABSI events (50%), the CVC was avoidable. Conclusions: More than half of consultation requests to the CCM team for CVCs are avoidable, and avoidable CVCs disproportionately contribute to CLABSI events. Alternatives for intravenous access could potentially avoid 9% of CCM consultations and 50% of CLABSIs in CCM-inserted CVCs on medical-surgical wards. Funding: None. Disclosures: None.


2020 ◽  
Vol 41 (S1) ◽  
pp. s195-s195
Author(s):  
Josephine Fox ◽  
Robert Russell ◽  
Lydia Grimes ◽  
Heather Gasama ◽  
Carrie Sona ◽  
...  

Background: Proper care and maintenance of central lines is essential to prevent central-line–associated bloodstream infections (CLABSI). Our facility implemented a hospital-wide central-line maintenance bundle based on CLABSI prevention guidelines. The objective of this study was to determine whether maintenance bundle adherence was influenced by nursing shift or day of the week. Methods: A central-line maintenance bundle was implemented in April 2018 at a 1,266-bed academic medical center. The maintenance bundle components included alcohol-impregnated disinfection caps on all ports and infusion tubing; infusion tubing dated; dressings not damp or soiled; no oozing at the insertion site greater than the size of a quarter; dressings occlusive with all edges intact; transparent dressing change recorded within 7 days; and no gauze dressings in place for >48 hours. To monitor bundle compliance, 4 non–unit-based nurse observers were trained to audit central lines. Observations were collected between August 2018 and October 2019 and were performed during all shifts, 7 days per week. Just-in-time feedback was provided for noncompliant central lines. Nursing shifts were defined as day (7:00 a.m. to 3:00 p.m.), evening (3:00 p.m. to 11:00 p.m.), and night (11:00 p.m. to 7:00 a.m.). Central-line bundle compliance between shifts was compared using multinomial logistic regression. Bundle compliance between weekdays and weekends was compared using Mantel-Haenszel χ2 analysis. Results: Of the 25,902 observations collected, 11,135 (42.9%) occurred on day shift, 11,559 (44.6%) on evening shift, and 3,208 (12.4%) on night shift. Overall, 22,114 (85.4%) observations occurred on a weekday versus 3,788 (14.6%) on a Saturday or Sunday (median observations per day of the week, 2,570; range, 1,680–6,800). In total, 4,599 central lines (17.8%) were noncompliant with ≥1 bundle component. 
The most common reasons for noncompliance were dressing not dated (n = 1,577; 44.0%) and dressings not occlusive with all edges intact (n = 1,340; 37.4%). The noncompliance rates for central-line observations by shift were 12.8% (1,430 of 11,135) on day shift, 20.4% (2,361 of 11,559) on evening shift, and 25.2% (808 of 3,208) on night shift. Compared with day shift, evening shift (OR, 1.74; 95% CI, 1.62–1.87; P < .001) and night shift (OR, 2.29; 95% CI, 2.07–2.52; P < .001) were more likely to have a noncompliant central line. Compared with weekdays, observations on weekend days were more likely to find a noncompliant central line: 914 of 3,788 (24.4%) on weekend days versus 3,685 of 22,114 (16.7%) on weekdays (P < .001). Conclusions: Noncompliance with the central-line maintenance bundle was more likely on evening and night shifts and during weekends. Funding: None. Disclosures: None.
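The shift comparison can be closely reproduced as crude odds ratios from the published counts, which suggests the reported ORs are unadjusted or nearly so; a minimal sketch:

```python
from math import exp, log, sqrt

def odds_ratio_ci(noncomp, total, ref_noncomp, ref_total, z=1.96):
    """Crude OR (with Wald 95% CI) of noncompliance versus a reference group."""
    a, b = noncomp, total - noncomp                 # comparison shift
    c, d = ref_noncomp, ref_total - ref_noncomp     # reference (day) shift
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Noncompliant observations and total observations per shift, as published.
day = (1_430, 11_135)
results = {}
for shift, counts in {"evening": (2_361, 11_559), "night": (808, 3_208)}.items():
    or_, lo, hi = odds_ratio_ci(*counts, *day)
    results[shift] = (or_, lo, hi)
    print(f"{shift} vs day: OR {or_:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

Running this recovers ORs and confidence intervals matching the reported values to within rounding (evening 1.74, 95% CI 1.62–1.87; night ~2.28–2.29, 95% CI 2.07–2.52).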


1993 ◽  
Vol 2 (2) ◽  
pp. 161-167 ◽  
Author(s):  
EH Elpern ◽  
SB Yellen ◽  
LA Burton

BACKGROUND: Advance directives are a means of promoting patient autonomy in end-of-life decisions but are used infrequently. A recent federal law requires healthcare organizations to provide information to patients about advance directives. This study explored attitudes and behaviors related to the use of advance directives in three areas: familiarity with advance directives, reasons for completing or not completing advance directives and preferences for receiving information about advance directives. METHODS: A questionnaire was administered by personal interview to a nonrandomized convenience sample of 46 inpatients and 50 outpatients at a large, tertiary care, urban academic medical center in the summer of 1991. RESULTS: Most respondents (77%) had heard of either the living will or durable power of attorney for healthcare, but only 52% correctly understood the purpose of these documents. Twenty-nine percent of the sample had executed an advance directive. Those who had advance directives were older and considered themselves less healthy than did those without advance directives. Unfamiliarity with advance directives and procrastination were cited most often as reasons for not having an advance directive. Most subjects (65%) had spoken with someone, usually a family member or close friend, about preferences for treatment during a critical illness. Although they had rarely discussed advance directives, 83% anticipated that they would be comfortable doing so with a physician or a nurse. CONCLUSIONS: Advance directives are used infrequently to document treatment preferences. The success of programs to promote greater use of advance directives depends on a clearer understanding of the factors that influence both decision and action to execute an advance directive. Patients claim to be comfortable in discussing the topic and prefer that such discussions occur in the outpatient setting.


2007 ◽  
Vol 28 (7) ◽  
pp. 774-782 ◽  
Author(s):  
Emily M. O'Malley ◽  
R. Douglas Scott ◽  
Julie Gayle ◽  
John Dekutoski ◽  
Michael Foltzer ◽  
...  

Objective. To determine the cost of management of occupational exposures to blood and body fluids. Design. A convenience sample of 4 healthcare facilities provided information on the cost of management of occupational exposures that varied in type, severity, and exposure source infection status. Detailed information was collected on time spent reporting, managing, and following up the exposures; salaries (including benefits) for representative staff who sustained and who managed exposures; and costs (not charges) for laboratory testing of exposure sources and exposed healthcare personnel, as well as any postexposure prophylaxis taken by the exposed personnel. Resources used were stratified by the phase of exposure management: exposure reporting, initial management, and follow-up. Data for 31 exposure scenarios were analyzed, with costs given in 2003 US dollars. Setting. The 4 facilities providing data were a 600-bed public hospital, a 244-bed Veterans Affairs medical center, a 437-bed rural tertiary care hospital, and a 3,500-bed healthcare system. Results. The overall range of costs to manage reported exposures was $71-$4,838. Mean total costs varied greatly by the infection status of the source patient. The overall mean cost for exposures to human immunodeficiency virus (HIV)-infected source patients (n = 19, including those coinfected with hepatitis B or C virus) was $2,456 (range, $907-$4,838), whereas the overall mean cost for exposures to source patients with unknown or negative infection status (n = 8) was $376 (range, $71-$860). The overall mean cost of management of reported exposures to source patients infected with hepatitis C virus (n = 4) was $650 (range, $186-$856). Conclusions. Management of occupational exposures to blood and body fluids is costly; the best way to avoid these costs is to prevent exposures.


Blood ◽  
2005 ◽  
Vol 106 (11) ◽  
pp. 4186-4186
Author(s):  
Ian H. Chin-Yee ◽  
Anargyros Xenocostas ◽  
Wendy W. Cheung ◽  
Anita S. Hibbert

Abstract In oncology patients, the majority of chemotherapy and red blood cell (RBC) transfusions occur in outpatient chemotherapy units, where they compete for resources such as nursing time and "chair-time". The aim of this study was to accurately assess the chair-time consumed by transfusion patients in order to estimate the chemotherapy administration opportunities lost to RBC transfusions. Over four weeks, chair-time, defined as the time from each patient's admission into care to discharge, was prospectively evaluated in a tertiary-care outpatient cancer clinic with a referral population base of 2 million. Chair-times were then grouped into three types of care (RBC transfusions, chemotherapy administrations, and "other": phlebotomy, central line catheter care, etc.) to enable comparison. Chair-time is reported as mean (± SD). Patient demographics (age, sex, diagnosis, chemotherapy regimen, pre-transfusion hemoglobin) were also recorded. A total of 1,354 visits to the chemotherapy suite were captured over one month. Of these, 1,279 visits had evaluable data for further analysis: 1,023 (80%) chemotherapy administrations, 44 (3.4%) RBC transfusions, and 212 (16.6%) "other". Thirty-eight patients accounted for the 44 RBC transfusions. Of those, 14 were patients with hematological malignancies (ALL, AML, CLL, HD, myeloma, lymphoma), 12 were solid tumor patients, and the remaining 12 had other hematological disorders (aplastic anemia, myelodysplasia, myelofibrosis). Among the patients with malignancies, 20 were receiving chemotherapy during the study period. The mean chair-time for all accurately recorded events was 1 hr 49 min (± 1 hr 39 min). Divided into types of care, the mean chair-times were 1 hr 59 min (± 1 hr 40 min) for chemotherapy, 3 hr 51 min (± 47 min) for RBC transfusion, and 34 min (± 43 min) for "other" care. 
The average time per RBC unit transfused was 1 hr 49 min (± 19 min), and the average number of units per transfusion was 2.2. When chemotherapy chair-times were examined with patients grouped by diagnosis, patients with lymphoma (most commonly treated with R-CHOP or other rituximab-containing regimens) and gynecological cancers (most commonly treated with carboplatin-containing regimens) had the longest chair-times, at 4 hr 20 min (± 1 hr 24 min) and 3 hr 50 min (± 2 hr 11 min), respectively. Although RBC transfusions make up only 3.4% of all events in our chemotherapy suite, each occupies almost twice as much chair-time as a chemotherapy administration. Depending on the patient population, clinics with a high rate of RBC transfusions might consider transfusion alternatives, as emerging monoclonal antibody therapies increase the time required to administer chemotherapy and chair-time becomes an increasingly valuable resource.
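The closing claim can be checked from the reported counts and mean durations. Multiplying visit counts by mean chair-times only approximates total occupied chair-time (the true total is the sum of individual visit times), but it shows transfusions consuming roughly twice their share of visits:

```python
# Visit counts and mean chair-times (in minutes) as reported above.
visits = {
    "chemotherapy": (1023, 119),  # 1 hr 59 min
    "transfusion":  (44, 231),    # 3 hr 51 min
    "other":        (212, 34),    # 34 min
}

total_visits = sum(n for n, _ in visits.values())       # 1,279 evaluable visits
total_minutes = sum(n * m for n, m in visits.values())  # approximate chair-time

shares = {
    name: (n / total_visits, n * m / total_minutes)
    for name, (n, m) in visits.items()
}
for name, (visit_share, time_share) in shares.items():
    print(f"{name}: {visit_share:.1%} of visits, {time_share:.1%} of chair-time")
```

Under this approximation, RBC transfusions are about 3.4% of visits but over 7% of chair-time, consistent with the per-event comparison in the abstract.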

