A Novel Machine Learning Sepsis Prediction Algorithm for Intended ICU Use (NAVOY Sepsis): A Proof-of-Concept Study (Preprint)

2021 ◽  
Author(s):  
Inger Persson ◽  
Andreas Östling ◽  
Martin Arlbrandt ◽  
Joakim Söderberg ◽  
David Becedas

BACKGROUND: Despite decades of research, sepsis remains a leading cause of mortality and morbidity in ICUs worldwide. The key to effective management and good patient outcomes is early detection, yet no prospectively validated machine learning prediction algorithm is available for clinical use in Europe today.
OBJECTIVE: To develop a high-performance machine learning sepsis prediction algorithm based on routinely collected ICU data, designed for implementation in Europe.
METHODS: The machine learning algorithm was developed using a convolutional neural network, trained on the Massachusetts Institute of Technology Lab for Computational Physiology MIMIC-III Clinical Database and focusing on ICU patients aged 18 years or older. Twenty variables are used for prediction, on an hourly basis. Onset of sepsis is defined in accordance with the international Sepsis-3 criteria.
RESULTS: The developed algorithm, NAVOY Sepsis, uses 4 hours of input data and predicts with high accuracy which patients are at high risk of developing sepsis in the coming hours. Its prediction performance is superior to that of existing sepsis early warning scoring systems and competes well with previously published prediction algorithms designed to predict sepsis onset according to the Sepsis-3 criteria, as measured by the area under the receiver operating characteristic curve (AUROC) and the area under the precision-recall curve (AUPRC). NAVOY Sepsis yields AUROC = 0.90 and AUPRC = 0.62 for predictions up to 3 hours before sepsis onset. The predictive performance was externally validated on hold-out test data, where NAVOY Sepsis was confirmed to predict sepsis with high accuracy.
CONCLUSIONS: An algorithm with excellent predictive properties has been developed, based on variables routinely collected in ICUs. The algorithm will be further validated in an ongoing prospective randomized clinical trial and will be CE marked as Software as a Medical Device, designed for commercial use in European ICUs.
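As a hedged illustration of the headline metric, the AUROC reported above can be computed from per-patient risk scores as the probability that a randomly chosen positive case is scored higher than a randomly chosen negative one (the Mann-Whitney formulation). The labels and scores below are invented toy values, not NAVOY Sepsis outputs:

```python
import numpy as np

def auroc(y_true, scores):
    """AUROC via the Mann-Whitney U statistic: the probability that a
    random positive is scored above a random negative (ties count half)."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# toy hourly risk scores: 1 = event within the prediction horizon
y = [0, 0, 1, 1, 0, 1, 0, 1]
p = [0.2, 0.6, 0.5, 0.8, 0.4, 0.3, 0.1, 0.9]
print(round(auroc(y, p), 3))
```

An AUROC of 1.0 would mean every positive outranks every negative; 0.5 is chance level.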

2017 ◽  
Author(s):  
Thomas Desautels ◽  
Jana Hoffman ◽  
Christopher Barton ◽  
Qingqing Mao ◽  
Melissa Jay ◽  
...  

Early detection of pediatric severe sepsis is necessary in order to administer effective treatment. In this study, we assessed the efficacy of a machine-learning-based prediction algorithm applied to electronic health record (EHR) data for the prediction of severe sepsis onset. The resulting prediction performance was compared with the Pediatric Logistic Organ Dysfunction score (PELOD-2) and the pediatric Systemic Inflammatory Response Syndrome (SIRS) score using cross-validation and pairwise t-tests. EHR data were collected from a retrospective set of de-identified pediatric inpatient and emergency encounters drawn from the University of California San Francisco (UCSF) Medical Center, with encounter dates between June 2011 and March 2016. Patients (n = 11,127) were 2-17 years of age, and 103 (0.93%) were labeled severely septic. In four-fold cross-validation evaluations, the machine learning algorithm achieved an AUROC of 0.912 for discrimination between severely septic and control pediatric patients at onset, and an AUROC of 0.727 four hours before onset. Under the same measure, the prediction algorithm also significantly outperformed PELOD-2 (p < 0.05) and SIRS (p < 0.05) in the prediction of severe sepsis four hours before onset. This machine learning algorithm has the potential to deliver high-performance severe sepsis detection and prediction for pediatric inpatients.


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 656
Author(s):  
Xavier Larriva-Novo ◽  
Víctor A. Villagrá ◽  
Mario Vega-Barbas ◽  
Diego Rivera ◽  
Mario Sanz Rodrigo

Security in IoT networks is now mandatory because of the large amounts of data these systems handle. They are vulnerable to numerous cybersecurity attacks, which are growing in number and sophistication, so new intrusion detection techniques, as accurate as possible for these scenarios, have to be developed. Intrusion detection systems based on machine learning algorithms have already shown high performance in terms of accuracy. This research proposes the study and evaluation of several preprocessing techniques based on traffic categorization for a machine learning neural network algorithm. For its evaluation, this research uses two benchmark datasets, UGR16 and UNSW-NB15, as well as one of the most widely used datasets, KDD99. The preprocessing techniques were evaluated with respect to scaling and normalization functions. All of these preprocessing models were applied to different sets of characteristics based on a categorization composed of four groups of features: basic connection features, content characteristics, statistical characteristics, and a final group combining traffic-based features and connection-direction-based traffic characteristics. The objective of this research is to evaluate this categorization by using various data preprocessing techniques to obtain the most accurate model. Our proposal shows that, by applying the categorization of network traffic together with several preprocessing techniques, accuracy can be enhanced by up to 45%. Preprocessing a specific group of characteristics yields greater accuracy, allowing the machine learning algorithm to correctly classify the parameters related to possible attacks.
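As a generic sketch of the kind of scaling and normalization functions evaluated above (not the paper's exact pipeline), z-score standardization and min-max rescaling can be written directly in NumPy; the flow-record values are made up for illustration:

```python
import numpy as np

def standardize(X):
    # z-score: zero mean, unit variance per feature column
    return (X - X.mean(axis=0)) / X.std(axis=0)

def min_max(X):
    # rescale each feature column to the [0, 1] range
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / (hi - lo)

# toy flow records: [duration_s, bytes, packets] (illustrative only)
X = np.array([[0.5,  1200.0,  10.0],
              [2.0, 50000.0, 400.0],
              [0.1,   300.0,   3.0]])
Xs, Xm = standardize(X), min_max(X)
print(np.allclose(Xs.mean(axis=0), 0.0), np.allclose(Xm.min(axis=0), 0.0))
```

The choice matters because features such as byte counts span several orders of magnitude more than durations, and unscaled inputs can dominate a neural network's loss.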


Author(s):  
Olfa Hamdi-Larbi ◽  
Ichrak Mehrez ◽  
Thomas Dufaud

Many applications in scientific computing process very large sparse matrices on parallel architectures. The work presented in this paper is part of a project whose general aim is to develop an auto-tuning system for selecting the best matrix compression format in the context of high-performance computing. The target smart system can automatically select the best compression format for a given sparse matrix, the numerical method processing this matrix, a parallel programming model, and a target architecture. This paper describes the design and implementation of the proposed concept. We consider a case study consisting of a numerical method reduced to the sparse matrix-vector product (SpMV), several compression formats, data parallelism as the programming model, and a distributed multi-core platform as the target architecture. This study allows us to extract a set of important novel metrics and parameters relative to the considered programming model. Our metrics are used as input to a machine learning algorithm to predict the best matrix compression format. An experimental study targeting a distributed multi-core platform and processing random and real-world matrices shows that our system improves the accuracy of the machine learning prediction by up to 7% on average.
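The SpMV kernel at the center of this case study can be sketched with one common compression format, CSR (compressed sparse row); the format the auto-tuner would actually pick varies per matrix, and this tiny matrix is illustrative only:

```python
import numpy as np

def dense_to_csr(A):
    """Compress a dense matrix into CSR arrays: nonzero values,
    their column indices, and row pointers delimiting each row."""
    values, col_idx, row_ptr = [], [], [0]
    for row in A:
        for j, v in enumerate(row):
            if v != 0:
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))
    return np.array(values), np.array(col_idx), np.array(row_ptr)

def csr_spmv(values, col_idx, row_ptr, x):
    """Compute y = A @ x from the CSR representation, row by row."""
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(y)):
        lo, hi = row_ptr[i], row_ptr[i + 1]
        y[i] = values[lo:hi] @ x[col_idx[lo:hi]]
    return y

A = np.array([[4.0, 0.0, 0.0],
              [0.0, 0.0, 5.0],
              [1.0, 2.0, 0.0]])
x = np.array([1.0, 2.0, 3.0])
vals, cols, ptrs = dense_to_csr(A)
print(csr_spmv(vals, cols, ptrs, x))  # matches A @ x
```

CSR stores only the nonzeros plus index arrays, which is why the best format depends on the sparsity pattern: a matrix with dense sub-blocks or a diagonal structure may favor other formats, motivating the learned selector described above.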


2017 ◽  
Author(s):  
Hamid Mohamadlou ◽  
Anna Lynn-Palevsky ◽  
Christopher Barton ◽  
Uli Chettipally ◽  
Lisa Shieh ◽  
...  

Abstract
Background: A major problem in treating acute kidney injury (AKI) is that clinical criteria for recognition are markers of established kidney damage or impaired function; treatment before such damage manifests is desirable. Clinicians could intervene during what may be a crucial stage for preventing permanent kidney injury if patients with incipient AKI and those at high risk of developing AKI could be identified.
Methods: We used a machine learning technique, boosted ensembles of decision trees, to train an AKI prediction tool on retrospective data from inpatients at Stanford Medical Center and intensive care unit patients at Beth Israel Deaconess Medical Center. We tested the algorithm's ability to detect AKI at onset, and to predict AKI 12, 24, 48, and 72 hours before onset, and compared its 3-fold cross-validation performance to the SOFA score for AKI identification in terms of area under the receiver operating characteristic curve (AUROC).
Results: The prediction algorithm achieves an AUROC of 0.872 (95% CI 0.867, 0.878) for AKI onset detection, superior to the SOFA score AUROC of 0.815 (P < 0.01). At 72 hours before onset, the algorithm achieves an AUROC of 0.728 (95% CI 0.719, 0.737), compared to the SOFA score AUROC of 0.720 (P < 0.01).
Conclusions: The results of these experiments suggest that a machine-learning-based AKI prediction tool may offer important prognostic capabilities for determining which patients are likely to suffer AKI, potentially allowing clinicians to intervene before kidney damage manifests.
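The boosted-ensemble idea can be illustrated with a deliberately tiny AdaBoost over decision stumps; this is a simplification (the study's tool uses boosted decision trees on clinical features), and the data below is synthetic and separable by a single threshold, so the first stump already fits it:

```python
import numpy as np

def train_stumps(X, y, n_rounds=5):
    """Minimal AdaBoost with axis-aligned decision stumps; y in {-1, +1}.
    Each round reweights examples and picks the lowest-weighted-error stump."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] > thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        pred = sign * np.where(X[:, j] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.where(X[:, j] > t, 1, -1) for a, j, t, s in ensemble)
    return np.sign(score)

# synthetic toy features; label +1 iff feature 0 is positive
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 3))
y = np.where(X[:, 0] > 0, 1, -1)
model = train_stumps(X, y)
acc = (predict(model, X) == y).mean()
print(acc)
```

Production gradient-boosted or AdaBoost implementations add regularization, deeper trees, and out-of-sample validation; the loop above only shows the reweighting mechanic.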


2021 ◽  
Vol 5 (3) ◽  
pp. 21
Author(s):  
Arsany Hakim ◽  
Benjamin Messerli ◽  
Raphael Meier ◽  
Tomas Dobrocky ◽  
Sebastian Bellwald ◽  
...  

(1) Background: To test the accuracy of a fully automated stroke tissue estimation algorithm (FASTER) to predict final lesion volumes in an independent dataset in patients with acute stroke; (2) Methods: Tissue-at-risk prediction was performed in 31 stroke patients presenting with a proximal middle cerebral artery occlusion. FDA-cleared perfusion software using the AHA recommendation for the Tmax threshold delay was tested against a prediction algorithm trained on an independent perfusion software using artificial intelligence (FASTER). In line with our endovascular strategy of pursuing a TICI 3 outcome, we compared patients with complete reperfusion (TICI 3) vs. no reperfusion (TICI 0) after mechanical thrombectomy. Final infarct volume was determined on a routine follow-up MRI or CT at 90 days after the stroke; (3) Results: Compared to the reference standard (infarct volume after 90 days), the decision forest algorithm overestimated the final infarct volume in patients without reperfusion. Underestimation was observed if patients were completely reperfused. In cases where the FDA-cleared segmentation was not interpretable due to improper definitions of the arterial input function, the decision forest provided reliable results; (4) Conclusions: The prediction accuracy of automated tissue estimation depends on (i) success of reperfusion, (ii) infarct size, and (iii) software-related factors introduced by the training sample. A principal advantage of machine learning algorithms is their improved robustness to artifacts in comparison to solely threshold-based, model-dependent software. Validation on independent datasets remains a crucial condition for clinical implementations of decision support systems in stroke imaging.


2021 ◽  
Author(s):  
Sangil Lee ◽  
Brianna Mueller ◽  
W. Nick Street ◽  
Ryan M. Carnahan

Abstract
Introduction: Delirium is a cerebral dysfunction seen commonly in the acute care setting. It is associated with increased mortality and morbidity and is frequently missed in the emergency department (ED) by clinical gestalt alone. Identifying those at risk of delirium may help prioritize screening and interventions.
Objective: Our objective was to identify clinically valuable predictive models for prevalent delirium within the first 24 hours of hospitalization, based on the available data, by assessing the performance of logistic regression and a variety of machine learning models.
Methods: This was a retrospective cohort study to develop and validate a predictive risk model to detect delirium using patient data obtained around an ED encounter. Data from electronic health records for patients hospitalized from the ED between January 1, 2014, and December 31, 2019, were extracted. Eligible patients were aged 65 or older, were admitted to an inpatient unit from the emergency department, and had at least one DOSS assessment or CAM-ICU recorded while hospitalized. The outcome measure of this study was delirium within one day of hospitalization, determined by a positive DOSS or CAM assessment. We developed the model with and without the Barthel index for activities of daily living, since this was measured after hospital admission.
Results: The area under the ROC curve for delirium ranged from 0.69 to 0.77 without the Barthel index. Random forest and gradient-boosted machine showed the highest AUC of 0.77. At the 90% sensitivity threshold, gradient-boosted machine, random forest, and logistic regression achieved a specificity of 35%. After the Barthel index was included, the random forest, gradient-boosted machine, and logistic regression models demonstrated the best predictive ability, with AUCs of 0.85 to 0.86.
Conclusion: This study demonstrated the use of machine learning algorithms to identify the combination of variables that are predictive of delirium within 24 hours of hospitalization from the ED.
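The "specificity at a 90% sensitivity threshold" style of reporting used above can be reproduced with a small helper that sweeps thresholds from high to low and stops at the first one meeting the sensitivity target; the labels and scores below are fabricated for illustration:

```python
import numpy as np

def spec_at_sensitivity(y, scores, target_sens=0.9):
    """Return specificity at the highest threshold whose sensitivity
    (recall on positives) reaches the target."""
    y = np.asarray(y)
    scores = np.asarray(scores, dtype=float)
    for thr in np.sort(np.unique(scores))[::-1]:   # highest threshold first
        pred = scores >= thr
        sens = pred[y == 1].mean()
        if sens >= target_sens:
            return (~pred[y == 0]).mean()          # true-negative rate
    return 0.0

# fabricated delirium risk scores (1 = delirium within 24 h)
y = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
p = [0.9, 0.8, 0.7, 0.6, 0.2, 0.65, 0.5, 0.4, 0.3, 0.1]
print(spec_at_sensitivity(y, p, target_sens=0.8))
```

Fixing sensitivity first reflects the screening use case: missing a delirious patient is costlier than a false alarm, so specificity is reported at the operating point rather than optimized.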


2020 ◽  
Author(s):  
Chen-Zhi Su ◽  
Kuan-Ting Chou ◽  
Hsuan-Pei Huang ◽  
Chung-Chuan Lo ◽  
Daw-Wei Wang

Abstract
Identifying the directions of signal flows in neural networks is one of the most important stages for understanding the intricate information dynamics of a living brain. Using a dataset of 213 projection neurons distributed in different regions of a Drosophila brain, we develop a powerful machine learning algorithm: the node-based polarity identifier of neurons (NPIN). The proposed model is trained on nodal information only and includes both Soma Features (which contain spatial information from a given node to the soma) and Local Features (which contain morphological information of a given node). After including the spatial correlations between nodal polarities, our NPIN provided extremely high accuracy (>96.0%) for the classification of neuronal polarity, even for complex neurons with more than two dendrite/axon clusters. Finally, we further apply NPIN to classify the neuronal polarity of the blowfly, for which much less neuronal data are available. Our results demonstrate that NPIN is a powerful tool to identify the neuronal polarity of insects and to map out the signal flows in the brain's neural networks.
Availability of data and material: The FlyCircuit database (http://www.flycircuit.tw/) is provided by the National Center for High-Performance Computing.
Code availability: We provide an online version of NPIN to be used or tested by other research groups at the following address: https://npin-for-drosophila.herokuapp.com/


2021 ◽  
Author(s):  
Ekaterina Gurina ◽  
Ksenia Antipova ◽  
Nikita Klyuchnikov ◽  
Dmitry Koroteev

Abstract: Predicting drilling accidents is an important task in well construction. Drilling support software allows engineers to observe the drilling parameters of multiple wells at the same time, and artificial intelligence helps detect the precursors of a drilling accident ahead of the emergency. We present a machine learning (ML) algorithm for predicting accidents such as stuck pipe, mud loss, fluid show, washout, drill-string breakage, and shale collar. The model for forecasting drilling accidents is based on the "bag-of-features" approach, which uses distributions of the directly recorded data as the main features. Bag-of-features labels small fragments of data with particular symbols, named codewords; by building a histogram of codewords for a data segment, one can use that histogram as input for a machine learning algorithm. Fragments of real-time mud log data were used to create the model. We define more than 1000 drilling accident precursors for more than 60 real accidents, plus about 2500 normal drilling cases, as a training set for the ML model. The developed model analyzes real-time mud log data and calculates the probability of an accident. The result is presented as a probability curve for each type of accident; if the critical probability value is exceeded, the user is notified of the risk of an accident. The bag-of-features model shows high performance in validation both on historical data and in real time. The prediction quality does not vary from field to field, so the model can be used in different fields without additional training. The software utilizing the ML model has a microservice architecture and is integrated with the WITSML data server. It is capable of real-time accident forecasting without human intervention. As a result, the system notifies the user whenever the situation in the well becomes similar to a pre-accident one, and the engineer has enough time to take the necessary actions to prevent an accident.
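The bag-of-features pipeline described above can be sketched in miniature: slide a window over one log channel, assign each window to its nearest codeword in a codebook (here a hand-made stand-in for centroids that would normally be learned, e.g. by k-means on training fragments), and histogram the codewords. All values are illustrative, not real mud-log data:

```python
import numpy as np

def codewords(fragments, centroids):
    """Assign each fragment to its nearest centroid (its codeword index)."""
    d = ((fragments[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

def bag_of_features(signal, centroids, win=4):
    """Slide a length-`win` window over one log channel, quantize each
    window into a codeword, and return the normalized codeword histogram."""
    frags = np.array([signal[i:i + win] for i in range(len(signal) - win + 1)])
    codes = codewords(frags, centroids)
    hist = np.bincount(codes, minlength=len(centroids)).astype(float)
    return hist / hist.sum()

# hypothetical 3-word codebook standing in for learned centroids
centroids = np.array([[0, 0, 0, 0],
                      [1, 1, 1, 1],
                      [0, 1, 0, 1]], dtype=float)
sig = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0], dtype=float)
h = bag_of_features(sig, centroids)
print(h)
```

The resulting fixed-length histogram is what a downstream classifier consumes, which is what makes variable-length log segments comparable to each other.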

