Alert fatigue: A lesson relearned

2010 ◽  
Vol 67 (8) ◽  
pp. 604-604
Author(s):  
Jared J. Cash ◽  
Stuart Levine
2016 ◽  
Vol 25 (01) ◽  
pp. 70-72 ◽  
Author(s):  
A. Almerares ◽  
D. Luna ◽  
A. Marcelo ◽  
M. Househ ◽  
H. Mandirola ◽  
...  

Summary Background: Patient safety concerns every healthcare organization. Adoption of health information technology (HIT) appears to have the potential to address this issue; however, unanticipated and undesirable consequences of implementing HIT could lead to new and more complex hazards. This could be particularly problematic in developing countries, where regulations, policies and implementations are few, less standardized and in some cases almost non-existent. Methods: Based on the available information and our own experience, we conducted a review of unintended consequences of HIT implementations as they affect patient safety in developing countries. Results: We found that user dependency on the system, alert fatigue, reduced communication among healthcare actors, and workarounds are the topics that should be prioritized. Institutions should consider existing knowledge, learn from other experiences, and model their implementations to avoid known consequences. We also recommend that they monitor and communicate their own efforts to expand knowledge in the region.


2020 ◽  
Author(s):  
Paul Kengfai Wan ◽  
Abylay Satybaldy ◽  
Lizhen Huang ◽  
Halvor Holtskog ◽  
Mariusz Nowostawski

BACKGROUND Clinical decision support (CDS) is a tool that helps clinicians in decision making by generating clinical alerts to supplement their prior knowledge and experience. However, CDS generates a high volume of irrelevant alerts, resulting in alert fatigue among clinicians. Alert fatigue is the mental state in which alerts consume too much time and mental energy, which often results in relevant alerts being overridden unjustifiably along with clinically irrelevant ones. Consequently, clinicians become less responsive to important alerts, which opens the door to medication errors. OBJECTIVE This study aims to explore how a blockchain-based solution can reduce alert fatigue through collaborative alert sharing in the health sector, thus improving overall health care quality for both patients and clinicians. METHODS We designed a 4-step approach to answer this research question. First, we identified five potential challenges through a scoping review of the published literature. Second, we designed a framework that reduces alert fatigue by addressing the identified challenges with different digital components. Third, we evaluated the framework by comparing MedAlert with other proposed solutions. Finally, we discuss the limitations and future work. RESULTS Of the 341 academic papers collected, 8 were selected and analyzed. MedAlert securely distributes low-level (non-life-threatening) clinical alerts to patients, enabling a collaborative clinical decision. Among the solutions in our framework, Hyperledger (a private, permissioned blockchain) and BankID (federated digital identity management) were selected to overcome challenges such as data integrity, user identity, and privacy issues. CONCLUSIONS MedAlert can reduce alert fatigue by attracting the attention of patients and clinicians, instead of solely reducing the total number of alerts. 
MedAlert offers other advantages, such as a higher degree of patient privacy and faster transaction times compared with other frameworks. This framework may not be suitable for in-patients or for elderly patients who are not technology-savvy. Future work validating this framework in real health care scenarios is needed to provide performance evaluations of MedAlert and thus support its further development.
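The core integrity property the abstract attributes to the blockchain component — that distributed alerts cannot be silently altered after the fact — can be illustrated with a minimal hash-chained alert log. This is an illustrative sketch only, not MedAlert's actual Hyperledger implementation; the class and field names are hypothetical.

```python
import hashlib
import json

def _entry_hash(payload: dict, prev_hash: str) -> str:
    # Deterministic hash over the alert payload plus the previous entry's
    # hash, so modifying any entry invalidates every entry after it.
    blob = json.dumps(payload, sort_keys=True) + prev_hash
    return hashlib.sha256(blob.encode("utf-8")).hexdigest()

class AlertLedger:
    """Append-only, hash-chained log of low-level clinical alerts (sketch)."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []  # each entry: {"alert": ..., "prev": ..., "hash": ...}

    def append(self, alert: dict) -> str:
        prev = self.entries[-1]["hash"] if self.entries else self.GENESIS
        h = _entry_hash(alert, prev)
        self.entries.append({"alert": alert, "prev": prev, "hash": h})
        return h

    def verify(self) -> bool:
        # Recompute the whole chain; True only if no entry was altered.
        prev = self.GENESIS
        for e in self.entries:
            if e["prev"] != prev or _entry_hash(e["alert"], prev) != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

A real permissioned blockchain adds consensus and access control on top of this tamper-evidence idea; the sketch shows only why a shared, verifiable alert history supports collaborative decisions.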


2012 ◽  
Vol 143 (4) ◽  
pp. 332-333
Author(s):  
Titus Schleyer ◽  
Thankam P. Thyvalikakath

2020 ◽  
Vol 10 (4) ◽  
pp. 142
Author(s):  
Brian J. Douthit ◽  
R. Clayton Musser ◽  
Kay S. Lytle ◽  
Rachel L. Richesson

(1) Background: The five rights of clinical decision support (CDS) are a well-known framework for planning the nuances of CDS, but recent advancements have given us more options to modify the format of the alert. One-size-fits-all assessments fail to capture the nuance of different BestPractice Advisory (BPA) formats. To demonstrate a tailored evaluation methodology, we assessed a BPA after implementation of Storyboard for changes in alert fatigue, behavior influence, and task completion; (2) Methods: Data from 19 weeks before and after implementation were used to evaluate differences in each domain. Individual clinics were evaluated for task completion and compared for changes pre- and post-redesign; (3) Results: The change in format was correlated with an increase in alert fatigue, a decrease in erroneous free text answers, and worsened task completion at a system level. At a local level, however, 14% of clinics had improved task completion; (4) Conclusions: While the change in BPA format was correlated with decreased performance, the changes may have been driven primarily by the COVID-19 pandemic. The framework and metrics proposed can be used in future studies to assess the impact of new CDS formats. Although the changes in this study seemed undesirable in aggregate, some positive changes were observed at the level of individual clinics. Personalized implementations of CDS tools based on local need should be considered.
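The clinic-level task-completion comparison described above amounts to computing, per clinic, the change in completion rate between the pre- and post-redesign windows and then asking what fraction of clinics improved. A minimal sketch under assumed data shapes (the function names and tuple layout are hypothetical, not from the study):

```python
def completion_change(pre: dict, post: dict) -> dict:
    """Per-clinic change in task-completion rate (post minus pre).

    pre/post map clinic id -> (tasks_completed, tasks_triggered).
    Returns clinic id -> rate change, rounded to 3 decimal places.
    """
    changes = {}
    for clinic in pre:
        pre_done, pre_total = pre[clinic]
        post_done, post_total = post[clinic]
        changes[clinic] = round(post_done / post_total - pre_done / pre_total, 3)
    return changes

def fraction_improved(changes: dict) -> float:
    # Share of clinics whose completion rate rose after the redesign,
    # mirroring the paper's observation that 14% of clinics improved
    # even though the system-level trend worsened.
    return sum(1 for c in changes.values() if c > 0) / len(changes)
```

Evaluating at the clinic level rather than only in aggregate is what surfaces the locally positive changes the authors report.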


2010 ◽  
Vol 67 (8) ◽  
pp. 603-604 ◽  
Author(s):  
William A. Gouveia

2016 ◽  
Vol 21 (6) ◽  
pp. 203-207 ◽  
Author(s):  
Anne Press ◽  
Sundas Khan ◽  
Lauren McCullagh ◽  
Andy Schachter ◽  
Salvatore Pardo ◽  
...  

2020 ◽  
Author(s):  
Tahmina Nasrin Poly ◽  
Md. Mohaimenul Islam ◽  
Muhammad Solihuddin Muhtar ◽  
Hsuan-Chia Yang ◽  
Phung Anh (Alex) Nguyen ◽  
...  

BACKGROUND Computerized physician order entry (CPOE) systems are incorporated into clinical decision support systems (CDSSs) to reduce medication errors and improve patient safety. Automatic alerts generated from CDSSs can directly assist physicians in making useful clinical decisions and can help shape prescribing behavior. Multiple studies reported that approximately 90%-96% of alerts are overridden by physicians, which raises questions about the effectiveness of CDSSs. There is intense interest in developing sophisticated methods to combat alert fatigue, but there is no consensus on the optimal approaches so far. OBJECTIVE Our objective was to develop machine learning prediction models to predict physicians’ responses in order to reduce alert fatigue from disease medication–related CDSSs. METHODS We collected data from a disease medication–related CDSS from a university teaching hospital in Taiwan. We considered prescriptions that triggered alerts in the CDSS between August 2018 and May 2019. Machine learning models, such as artificial neural network (ANN), random forest (RF), naïve Bayes (NB), gradient boosting (GB), and support vector machine (SVM), were used to develop prediction models. The data were randomly split into training (80%) and testing (20%) datasets. RESULTS A total of 6453 prescriptions were used in our model. The ANN machine learning prediction model demonstrated excellent discrimination (area under the receiver operating characteristic curve [AUROC] 0.94; accuracy 0.85), whereas the RF, NB, GB, and SVM models had AUROCs of 0.93, 0.91, 0.91, and 0.80, respectively. The sensitivity and specificity of the ANN model were 0.87 and 0.83, respectively. CONCLUSIONS In this study, ANN showed substantially better performance in predicting individual physician responses to an alert from a disease medication–related CDSS, as compared to the other models. 
To our knowledge, this is the first study to use machine learning models to predict physician responses to alerts; such models could help in developing more sophisticated CDSSs for real-world clinical settings.
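The AUROC values reported for the ANN, RF, NB, GB, and SVM models have a simple probabilistic reading: the chance that a randomly chosen positive case (say, an alert the physician accepts) is scored higher than a randomly chosen negative case (an overridden alert). A small stdlib-only sketch of that rank-based computation (the variable names and the accept/override framing are illustrative, not the study's code):

```python
def auroc(labels, scores):
    """AUROC via the Mann-Whitney rank statistic.

    labels: 1 for positives (e.g. accepted alerts), 0 for negatives
    (overridden alerts). scores: the model's predicted probabilities.
    Ties between a positive and a negative score count half.
    """
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos
        for n in neg
    )
    return wins / (len(pos) * len(neg))
```

An AUROC of 0.94, as reported for the ANN, means the model ranks an accepted alert above an overridden one about 94% of the time.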


2019 ◽  
Author(s):  
Devin Mann ◽  
Adam Szerencsy ◽  
Leora Horwitz ◽  
Simon Jones ◽  
Masha Kuznetsova ◽  
...  

BACKGROUND Clinical decision support (CDS) is a valuable feature of electronic health records (EHRs) designed to improve quality and safety. However, due to the complexities of system design and inconsistent results, CDS tools may inadvertently increase alert fatigue and contribute to physician burnout. A/B testing, or rapid-cycle randomized tests, is a useful method that can be applied to the EHR in order to understand and iteratively improve design choices embedded within CDS tools. OBJECTIVE This paper describes how rapid randomized controlled trials (RCTs) embedded within EHRs can be used to quickly ascertain the superiority of potential CDS tools to improve their usability, reduce alert fatigue and promote quality of care. METHODS A multi-step process combining tools from user-centered design, A/B testing and implementation science is used to understand, ideate, prototype, test, analyze and improve each candidate CDS. CDS engagement metrics (alert views, ignores, orders) are used to evaluate which CDS version is superior. RESULTS Two experiments are highlighted to demonstrate the impact of the process. First, after multiple rounds of usability testing, a revised CDS influenza alert was tested against usual care in a rapid RCT. The new alert text resulted in minimal impact but the failure triggered another round of testing that identified key issues and led to a 70% reduction in alert volume in the next round. In the second experiment, the process was used to test three versions (financial, quality, regulatory) of text supporting tobacco cessation alerts as well as three supporting images. Three rounds of RCTs showed that the financial framing was 5-10% more effective than the other two but that adding images did not have a positive impact. CONCLUSIONS These data support the potential for this new process to rapidly develop, deploy and improve CDS within an EHR. This approach may be an important tool for improving the impact and experience of CDS. 
CLINICALTRIAL Our flu alert trial was registered in January 2018 with ClinicalTrials.gov, registration number NCT03415425. Our tobacco alert trial was registered in October 2018 with ClinicalTrials.gov, registration number NCT03714191.
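Deciding whether one CDS variant's engagement rate (e.g. alert acceptances per view) is genuinely better than another's, as in the rapid RCTs above, is a standard two-proportion comparison. A stdlib-only sketch, with hypothetical counts rather than the trials' actual data:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Z statistic and two-sided p-value for the difference between two
    proportions (e.g. alert-acceptance rates in arms A and B of an A/B test),
    using the pooled-variance normal approximation."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```

With large EHR alert volumes, even the 5-10% relative differences reported for the tobacco-alert framings can reach significance quickly, which is what makes rapid-cycle testing practical.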


Author(s):  
Jessica S. Ancker ◽  
Alison Edwards ◽  
Sarah Nosal ◽  
Diane Hauser ◽  
...  

Following publication of the original article [1], the authors reported that the article erroneously stated that Dr. Ancker was affiliated with the Tehran University of Medical Sciences. Dr. Ancker is not affiliated with that institution.

