Practice Variation in Validation of Device Denominator Data for National Healthcare Safety Network Reporting

2020 ◽  
Vol 41 (S1) ◽  
pp. s354-s355
Author(s):  
Douglas Challener ◽  
Priya Sampathkumar ◽  
John O O’Horo

Background: The NHSN is a widely used CDC program for tracking healthcare-associated infections (HAIs). The goal of the NHSN is to help healthcare organizations identify and track the incidence of HAIs, to prevent adverse events, and to simplify mandatory quality reporting to the CMS. Healthcare organizations provide both event data for HAIs and information about the population at risk. For device-related infections, device denominator data (eg, data related to urinary or intravascular catheters and ventilators) must be collected and reported. NHSN guidelines require that electronic reporting of device denominator numbers be validated to be within 5% of manually collected counts over a period of 3 consecutive months. Little is known about the current practical application of validation practices. Methods: We surveyed members of the SHEA Research Network (SRN) to assess awareness of and compliance with the current NHSN requirements for device denominator data validation. Results: The survey was sent to 89 member institutions of the SRN from November 20, 2018, to December 12, 2018. The response rate was 35.7%, and 90% of respondents were using an electronic system for device denominator count reporting. All except 1 institution manually validated the data. Of the facilities that had completed validation, 31% used <90 days of manual data. Moreover, 82% of these facilities found a difference of <5% between the electronic data and manual data, with no statistically significant difference between those with at least 90 days of validation data and those with <90 days. Also, 21% of facilities validated data based on a subset of units. Conclusions: Although most respondents to the survey validate electronically collected device denominator data in accordance with NHSN's requirements, nearly one-third reported using shorter validation periods than NHSN requires. However, shorter periods were not associated with worse concordance.
The NHSN should evaluate whether the burden of a 3-month validation period is justified. Funding: None. Disclosures: None.
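The 5% concordance rule described above is simple arithmetic. A minimal sketch of the check, written here as a hypothetical illustration that compares period totals (not NHSN's official validation procedure), could look like:

```python
def denominator_validated(electronic_counts, manual_counts, tolerance=0.05):
    """Check whether electronic device-day counts agree with manually
    collected counts to within a relative tolerance (NHSN uses 5%),
    comparing totals over the whole validation period."""
    e, m = sum(electronic_counts), sum(manual_counts)
    if m <= 0:
        raise ValueError("manual count total must be positive")
    return abs(e - m) / m <= tolerance

# Hypothetical 90-day validation: electronic counts run slightly high
electronic = [32] * 90
manual = [31] * 90
print(denominator_validated(electronic, manual))  # True (~3.2% difference)
```

Whether the rule should be applied to period totals or day by day is a design choice; the aggregate form shown here is the more forgiving reading.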

2020 ◽  
Vol 41 (S1) ◽  
pp. s389-s389
Author(s):  
Jeremy Goodman ◽  
Samuel Clasp ◽  
Arjun Srinivasan ◽  
Elizabeth Mothershed ◽  
Seth Kroop ◽  
...  

Background: Healthcare-associated infections (HAIs) are a serious threat to patient safety; they account for substantial morbidity, mortality, and healthcare costs. Healthcare practices, such as inappropriate use of antimicrobials, can also amplify the problem of antimicrobial resistance. Data collected to target HAI prevention and antimicrobial stewardship efforts and to measure progress are an important resource for assuring transparency and accountability in healthcare, tracking adverse outcomes, investigating healthcare practices that may spread or protect against disease, detecting and responding to the spread of resistant pathogens, preventing infections, and saving lives. Methods: We discuss 3 healthcare-associated infection and antimicrobial-resistant infection (HAI-AR) reporting types: NHSN HAI-AR reporting, reportable diseases, and nationally notifiable diseases. HAI-AR reporting requirements outline which facilities and data must be reported to NHSN and the health department to comply with state laws. Reportable diseases are those that facilities, providers, and laboratories are required to report to the health department. Nationally notifiable diseases are those reported by health departments to the CDC for nationwide surveillance and analysis, as determined by the Council of State and Territorial Epidemiologists (CSTE) and the CDC. Data presented are based on state and federal policy; NHSN data are based on CDC reporting statistics. Results: Since the 2005 launch of the CDC NHSN and the publication of federal advisory committee HAI reporting guidance, most states have established policies stipulating that healthcare facilities in their jurisdiction report HAIs and resistant infections to the NHSN so that the state can gain access to those data, increasing from 2 states in 2005, to 18 in 2010, and to 36 states, Washington, DC, and Philadelphia in 2019.
Reporting policies and NHSN participation expanded greatly following the 2011 inception of CMS HAI quality reporting requirements, with several states aligning state requirements with CMS reporting. States listing carbapenem-resistant Enterobacteriaceae (CRE) as a reportable disease increased from 7 in 2013 to 41 states and the District of Columbia in 2019. Vancomycin-intermediate and vancomycin-resistant Staphylococcus aureus (VISA/VRSA) was added as a nationally notifiable disease in 2004, carbapenemase-producing CRE (CP-CRE) was added in 2018, and Candida auris clinical infections were added in 2019. The CDC and most jurisdictions with HAI reporting mandates issue public reports based on aggregate state data and/or facility-level data. States may also use these data to alert healthcare providers and health departments to emerging threats and to assist in notifying patients of potential exposure. Conclusions: Through efforts by health departments, facilities, patient advocates, partners, the CDC, and other federal agencies, HAI-AR reporting has steadily increased. Although reporting laws and data uses vary between jurisdictions, the data provided serve as a valuable tool to inform prevention. Funding: None. Disclosures: None.


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
J Redfern ◽  
K Hyun ◽  
D Brieger ◽  
D Chew ◽  
J French ◽  
...  

Abstract Background Cardiovascular disease is the leading cause of disease burden globally. With advancements in medical and surgical care, more people are surviving an initial acute coronary syndrome (ACS) and are in need of secondary prevention and cardiac rehabilitation (CR). The increasing availability of high-quality individual-level data linkage provides robust long-term estimates of outcomes. Purpose To compare 3-year outcomes amongst ACS survivors who did and did not participate in Australian CR programs. Methods The SNAPSHOT ACS follow-up study included 1806 patients admitted to 232 hospitals who were followed up by data linkage (cross-jurisdictional morbidity, national death index, Pharmaceutical Benefit Schedule) at 6 and 36 months to compare those who did and did not attend CR. Results In total, the cohort had a mean age of 65.8 (13.4) years, 60% were male, and only 25% (461/1806) attended CR. During the index admission, attendees were more likely to have had PCI (39% v 14%, p<0.001), CABG (11% v 2%, p<0.001), and a diagnosis of STEMI (21% v 5%, p<0.001) than those who did not attend. However, there was no significant difference between CR attendees and non-attendees in risk factors (LDL-cholesterol, smoking, obesity). Only 19% of eligible women attended CR compared with 30% of men (p<0.001). At 36 months, there were fewer deaths amongst CR attendees (19/461, 4.1%) than non-attendees (116/1345, 8.6%) (p=0.001). CR attendees were more likely to have repeat ACS, PCI, and CABG at both 6 and 36 months (Table). At 36 months, CR attendees were more likely to have been prescribed antiplatelets (78% v 53%, p<0.001), statins (91% v 73%, p<0.001), beta-blockers (11% v 13%, p=0.002), and ACEI/ARBs (72% v 61%, p<0.001) than non-attendees. Conclusions Amongst Australian ACS survivors, participation in CR was associated with a lower likelihood of death and increased prescription of pharmacotherapy.
However, attendance at CR was associated with higher rates of repeat ACS and revascularisation. Funding Acknowledgement: Type of funding source: Foundation. Main funding source(s): New South Wales Cardiovascular Research Network, National Heart Foundation.


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4859
Author(s):  
Leigh Stanger ◽  
Thomas Rockett ◽  
Alistair Lyle ◽  
Matthew Davies ◽  
Magnus Anderson ◽  
...  

This article elucidates the need to consider the inherent spatial transfer function (blur) of any thermographic instrument used to measure thermal fields. Infrared thermographic data were acquired from a modified commercial laser-based powder bed fusion printer. A validated methodology was used to correct for spatial transfer function errors in the measured thermal fields. The correction made a difference of 40% to the measured signal levels and a 174 °C difference to the calculated effective temperature. The spatial gradients in the processed thermal fields increased significantly. These corrections make a significant difference to the accuracy of validation data for process and microstructure modeling. This work demonstrates the need to consider image blur when quantifying thermal fields in laser-based powder bed fusion.
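The effect the authors correct for can be illustrated with a toy 1-D model: convolving a sharp temperature profile with a Gaussian point-spread function lowers the measured peak and flattens spatial gradients. This is a hypothetical sketch (Gaussian PSF, made-up temperatures), not the paper's validated methodology:

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian point-spread function."""
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """Direct convolution with edge clamping (replicate boundary)."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)  # clamp at the edges
            acc += w * signal[idx]
        out.append(acc)
    return out

# Sharp melt-pool-like hot spot on a cooler background (hypothetical, in °C)
field = [200.0] * 20 + [1400.0] * 3 + [200.0] * 20
blurred = convolve(field, gaussian_kernel(2.0, 6))
print(max(field), round(max(blurred)))  # blurred peak falls hundreds of °C short
```

Recovering the true field requires inverting this instrument blur, which is why the paper's deblurring correction changes the calculated effective temperature so substantially.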


2016 ◽  
Vol 42 (2-3) ◽  
pp. 393-428
Author(s):  
Ann Marie Marciarille

The narrative of Ebola's arrival in the United States has been overwhelmed by our fear of a West African-style epidemic. The real story of Ebola's arrival is about our healthcare system's failure to identify, treat, and contain healthcare-associated infections. We have long been willfully ignorant of the path of fatal infectious diseases through our healthcare facilities; this paper considers why our reimbursement and quality reporting systems made it easy for this to be so. West Africa's challenges in controlling Ebola resonate with our own struggles to standardize, centralize, and enforce infection control procedures in American healthcare facilities.


2016 ◽  
Vol 155 (1) ◽  
pp. 28-32 ◽  
Author(s):  
Walter T. Lee ◽  
David L. Witsell ◽  
Kourosh Parham ◽  
Jennifer J. Shin ◽  
Nikita Chapurin ◽  
...  

Objectives (1) Compare postoperative bleeding in the CHEER network (Creating Healthcare Excellence through Education and Research) among age groups, diagnoses, and practice types. (2) Report the incidence of bleeding by individual CHEER practice site based on practice guidelines. Study Design Retrospective review of the CHEER network's data collection database based on ICD-9 and CPT codes related to tonsillectomy patients. Setting Multisite practice-based network. Subjects and Methods A total of 8347 subjects underwent tonsillectomy as determined by procedure code within the retrospective data collection database, and 107 had postoperative hemorrhage. Demographic information and related diagnoses were collected for these subjects based on the CPT and ICD-9 codes. Postoperative ICD-9 and CPT codes were used to identify patients who had a postoperative bleed. Variables included age (<12 vs ≥12 years), diagnosis (infectious vs noninfectious), and practice type (community vs academic). Statistical analysis included multivariate logistic regression to identify variables predictive of postoperative bleeding, with P < .05 considered significant. Results Thirteen sites contributed data to the study (7 academic, 6 community). Postoperative bleeding occurred in 107 of the 8347 patients, an overall bleed rate of 1.3%. Patients ≥12 years old had a significantly increased bleed rate compared with the younger group (odds ratio, 5.98; 95% confidence interval, 3.79-9.44; P < .0001). There was no significant difference in bleed rates when practices or diagnoses were compared. Conclusion A site descriptor database built to expedite clinical research can be used for practice assessment and quality improvement. These data were also useful for identifying patient risk factors for posttonsillectomy bleeding.


2018 ◽  
Vol 5 (suppl_1) ◽  
pp. S402-S402
Author(s):  
Tomislav Mestrovic ◽  
Goran Kozina ◽  
Marijana Neuberg ◽  
Rosana Ribic

Abstract Background Adequate training of health workers is pivotal in the prevention of healthcare-associated infections (HAI). Our aim was to assess the theoretical and applied knowledge about the risk factors and effective measures of HAI prevention (most notably the use of standard precautions and hand hygiene practices) in second-year undergraduate university nursing students who had already completed obligatory courses in microbiology, infectious diseases, and epidemiology. Methods This study included a whole generation of second-year undergraduate nursing students, comprising 161 female and 25 male participants (186 in total), from a public university in Croatia (University Centre Varaždin, University North). They were given an anonymous questionnaire (developed on the model used by Tavolacci et al. in 2008) covering three domains: General Knowledge of HAI, Standard Precautions (SP), and Hand Hygiene (HH). The acceptable score overall (max. 30) and for each area (max. 10) was arbitrarily set at ≥ 20 and ≥ 7, respectively (in accordance with prior research). Results The age range of surveyed students was 19–37 (mean: 21.97, median: 21, mode: 20). An accurate definition of nosocomial infections was provided by 98.92% of students (with 60.75% of them defining it as an infection occurring 48 hours after hospital admission). The overall score was 21.5, which indicates a sufficient level of applied knowledge of healthcare-associated infections. A very high level of knowledge was observed in the SP area (total score of 9.5); however, the level of knowledge in the HAI and HH domains was inadequate (5.9 and 6.1, respectively). There was no statistically significant difference in the overall or specific scores between male and female students (P > 0.05). Formal teaching during the curriculum was the students' primary source of information (60.22%), followed by practical learning in the ward during work (23.65%), formal teaching in the ward (9.68%), and self-learning (6.45%).
Conclusion Periodic checking of nursing students' knowledge of HAI and corresponding modifications of the obligatory courses tackling this topic are advised in order to fill knowledge gaps, improve training, reduce infection rates, and increase compliance with prevention measures. Disclosures All authors: No reported disclosures.


2013 ◽  
Vol 34 (11) ◽  
pp. 1174-1180 ◽  
Author(s):  
Aiysha Ansari ◽  
Padmaja Ramaiah ◽  
Lillian Collazo ◽  
Hamisu M. Salihu ◽  
Donna Haiduven

Objective. To determine whether retractable intravenous devices produce blood splatter and whether blood splatter frequency differs between visual and microscopic detection methods. Methods. In this laboratory-based experiment, 105 venipunctures were performed in a simulated brachial vein containing mock venous blood. The retraction mechanism was activated in a testing chamber with precut fabric filters, placed at 3 different locations, to capture blood splatter. Differences in filter mass, visual inspection, and microscopic analysis for the presence of blood on filters were the units of analysis. Descriptive statistics, paired Student t tests, and κ statistics were used for data analysis. Results. Blood splatter was detected visually and microscopically as follows: filter A, 70% and 71%, respectively; filter B, 12% and 9%, respectively; and filter C, 13% and 10%, respectively. A statistically significant difference was observed in the mean mass of filter A before and after activation, both when confirmed by the naked eye (P = .014) and microscopically (P = .0092). Substantial agreement between methods was observed for filter A (κ = 0.78 [95% confidence interval, 0.64-0.92]), filter B (κ = 0.73 [95% confidence interval, 0.51-0.95]), and filter C (κ = 0.75 [95% confidence interval, 0.55-0.96]). However, blood was detected by microscopy and not by the naked eye in 7 instances (7%). Conclusions. Our findings demonstrate that splatter, which can potentially expose healthcare workers (HCWs) to bloodborne pathogens, is associated with the activation of intravascular catheters with retraction mechanisms. HCWs may not detect this splatter when it occurs and may not report a splash to mucous membranes or nonintact skin. The need to wear personal protective equipment when using such devices is reinforced.
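The κ statistics above quantify agreement between the two detection methods beyond what chance would produce. For two binary raters, Cohen's κ can be computed from paired calls; a minimal sketch (the detection calls below are illustrative, not the study's data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters (e.g. visual vs. microscopic
    splatter detection), given paired 0/1 calls of equal length.
    Undefined (division by zero) if chance agreement is exactly 1."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    p_a1 = sum(a) / n          # rater A's "positive" rate
    p_b1 = sum(b) / n          # rater B's "positive" rate
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)  # chance agreement
    return (observed - expected) / (1 - expected)

visual = [1, 1, 1, 0, 0, 0, 1, 0]   # hypothetical per-filter calls
micro  = [1, 1, 0, 0, 0, 0, 1, 1]
print(cohens_kappa(visual, micro))  # 0.5: moderate agreement beyond chance
```

Values of 0.61-0.80, like the 0.73-0.78 reported for the three filters, are conventionally read as "substantial" agreement.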


2019 ◽  
Vol 2019 ◽  
pp. 1-7 ◽  
Author(s):  
Huixue Jia ◽  
Liuyi Li ◽  
Weiguang Li ◽  
Tieying Hou ◽  
Hongqiu Ma ◽  
...  

Healthcare-associated infections (HAIs) not only bring additional medical costs to patients but also prolong the length of stay (LOS). A total of 2119 HAI case-patients and 2119 matched control-patients were identified in 68 hospitals in 14 primary sampling provinces across the 7 major regions of China. On average, an HAI increased the LOS by 10.4 days. Across hospital levels, the increased LOS due to HAI ranged from 9.7 to 10.9 days, with no statistically significant difference between levels. Across regions, the increased LOS due to HAI ranged from 8.2 to 12.6 days; the increase in South China was longer than in the other regions except the Northeast. Gastrointestinal infection (GI) caused the shortest extra LOS (6.7 days), while bloodstream infection (BSI) caused the longest (12.8 days); the increased LOS for GI was significantly shorter than that for the other sites. Among the 2119 case-patients, non-multidrug-resistant pathogens were detected in 365 cases, with an average increased LOS of 12.2 days; E. coli infection caused a significantly shorter increase in LOS. The studied multidrug-resistant organisms (MDROs), namely MRSA, VRE, ESBLs-E. coli, ESBLs-KP, CR-E. coli, CR-KP, CR-AB, and CR-PA, were detected in 381 cases (18.0%), with an average increased LOS of 14 days. Comparing the different MDRO infections, the increased LOS due to HAI caused by CR-PA (26.5 days) was longer than that of the other MDRO infections (all below 19.8 days).


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 2773-2773
Author(s):  
Patrick B. Walter ◽  
John Porter ◽  
Patricia Evans ◽  
Janet L. Kwiatkowski ◽  
Ellis J. Neufeld ◽  
...  

Abstract Iron overload has been shown to increase mitochondrial dysfunction and apoptosis and may be implicated in leukocyte apoptosis. We assessed whether markers of leukocyte apoptosis and mitochondrial dysfunction are higher in iron-overloaded thalassemia patients compared with control subjects and whether improvement in the control of iron overload with deferasirox or deferoxamine (DFO) is associated with a change in the level of apoptotic markers. Methods: Thalassemia Clinical Research Network patients participating in the Novartis CICL670A0107 trial (a randomized comparison of deferasirox, an oral iron chelator, vs. DFO) were eligible, and 44 (25 male; mean age, 21.8 yrs) were enrolled in the study. Fasting blood samples were obtained after a 5-day washout of DFO prior to commencing treatment with the study drug, and 24 hours post-chelator at 1, 6, and 12 months on study. Thirty healthy controls matched for age, sex, race, and antioxidant usage (15 male; mean age, 24.5 yrs) also supplied a blood sample at baseline. After blood collection, plasma, serum, and cells were separated by centrifugation. The pro-apoptotic marker Bax (an inducer of mitochondrial dysfunction) and the anti-apoptotic marker Bcl-2 were determined by ELISA. A high Bax/Bcl-2 ratio indicates decreased stability of the mitochondrial outer membrane and increased potential for mitochondrial dysfunction and apoptosis. Activities of caspase-3 and caspase-9 (both pro-apoptotic) were determined by a luminescent enzyme activity assay and reported as the average relative light units (RLU)/μg protein. Apoptotic markers were log-transformed prior to analysis, and means and % change are reported. Results: At baseline, thalassemia patients had elevated Bax compared with controls (17.4 vs. 11.6 pg/μg protein, p = 0.006). Similarly, the activities of caspase-3 and caspase-9 were high relative to controls (for caspase-3, 1823 vs. 1041 RLU/μg protein, p = 0.01; and for caspase-9, 2482 vs. 1322 RLU/μg protein, p = 0.001).
In longitudinal analysis, liver iron concentration and ferritin declined in both treatment groups (p ≤ 0.02 for both). This paralleled a decline in the Bax/Bcl-2 ratio (average decline −27.3%/yr, p = 0.033) and in ALT (average decline −7.3%/yr, p = 0.040). There was no significant difference between chelators in the rate of change of these markers over time. Conclusions: These findings demonstrate that markers of leukocyte apoptosis and mitochondrial dysfunction are high in thalassemia compared with controls and that the Bax/Bcl-2 ratio (a marker of mitochondrial dysfunction in apoptosis) decreased with effective iron chelation. DFO and deferasirox were equally effective in decreasing iron burden, ALT, and the Bax/Bcl-2 ratio. Thus, with effective chelation, it may be possible to improve leukocyte and mitochondrial function and reduce levels of apoptosis.


2019 ◽  
Vol 37 (15_suppl) ◽  
pp. 3135-3135
Author(s):  
Takeshi Murata ◽  
Takako Yanagisawa ◽  
Toshiaki Kurihara ◽  
Miku Kaneko ◽  
Sana Ota ◽  
...  

3135 Background: Saliva is a non-invasively accessible and informative biological fluid with high potential for the early diagnosis of various diseases. The aim of this study was to develop machine learning methods and to explore new salivary biomarkers to discriminate breast cancer patients from healthy controls. Methods: We conducted a comprehensive metabolite analysis of saliva samples obtained from 101 patients with invasive carcinoma (IC), 23 patients with ductal carcinoma in situ (DCIS), and 42 healthy controls (C), using capillary electrophoresis and liquid chromatography with mass spectrometry to quantify hundreds of hydrophilic metabolites. Saliva samples were collected after 9 hours of fasting and were split into training and validation data. Conventional statistical analyses and artificial intelligence-based methods were used to assess the discrimination abilities of the quantified metabolites. A multiple logistic regression (MLR) model and an alternative decision tree (ADTree)-based machine learning method were used. The generalization abilities of these mathematical models were validated in various computational tests, such as cross-validation and resampling methods. Results: Among the 260 quantified metabolites, amino acids and polyamines were significantly elevated in saliva from breast cancer patients; e.g., spermine showed the highest area under the receiver operating characteristic curve (AUC) for discriminating IC from C: 0.766 (95% confidence interval [CI], 0.671–0.840; P < 0.0001). These metabolites showed no significant difference between C and DCIS, i.e., they were elevated only in the IC samples. The MLR yielded a higher AUC for discriminating IC from C: 0.790 (95% CI, 0.699–0.859; P < 0.0001). The ADTree with an ensemble approach showed the best AUC: 0.912 (95% CI, 0.838–0.961; P < 0.0001).
In a comparison of these metabolites by subtype, seven metabolites differed significantly between Luminal A-like and Luminal B-like tumors, but few metabolites differed significantly among the other subtypes. Conclusions: These data indicate that salivary metabolomic profiles, including polyamines, have potential for screening breast cancer in a non-invasive way.
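The AUC values reported above measure how well a marker ranks cancer samples above controls. A minimal rank-based AUC, equivalent to the normalized Mann-Whitney U statistic, can be sketched as follows; the metabolite values are hypothetical, not the study's measurements:

```python
def auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive (cancer)
    sample scores higher than a randomly chosen negative (control)
    sample, counting ties as 0.5 (Mann-Whitney U / (n_pos * n_neg))."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical salivary marker levels: cancer vs. control
cancer_levels = [5.1, 3.2, 4.8, 2.9]
control_levels = [2.0, 3.0, 1.5]
print(auc(cancer_levels, control_levels))  # 11/12 ≈ 0.917
```

An AUC of 0.5 means the marker ranks no better than chance and 1.0 means perfect separation, which is why the ensemble ADTree's 0.912 represents strong, though not perfect, discrimination.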

