Diagnosing Various Severity Levels of Congestive Heart Failure Based on Long-Term HRV Signal

2019 ◽  
Vol 9 (12) ◽  
pp. 2544 ◽  
Author(s):  
Hua ◽  
Chen ◽  
Zhang ◽  
Liu ◽  
Wen

Previous studies have attempted to find autonomic differences in the cardiac system between congestive heart failure (CHF) patients and healthy groups using a variety of pattern recognition algorithms. A comparison of the earlier literature reveals two shortcomings: (1) previous studies focused on improving model accuracy, but the number of features used mostly exceeded 10, leading to poor generalization performance; (2) previous works rarely distinguished the severity levels of CHF. To address these shortcomings, we propose two models: model A distinguishes CHF patients from healthy subjects, and model B diagnoses the four severity levels of CHF. Based on long-term heart rate variability (HRV) signals (40,000 intervals, about 8 h), we extracted linear and non-linear features from the inter-beat-interval (IBI) series. The sequential forward selection (SFS) algorithm then reduced the feature dimension, and the models with the best performance were selected through leave-one-subject-out validation. On a dataset of 113 samples, a support vector machine classifier with five HRV features discriminated CHF with an accuracy of 97.35%. On a dataset of 41 samples, a k-nearest-neighbor (K = 1) classifier with four HRV features diagnosed the four severity levels of CHF with an accuracy of 87.80%. The contribution of this work is the use of fewer features to optimize the models under leave-one-subject-out validation; the relatively good generalization performance of our models indicates their value in clinical application.
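The selection-plus-validation loop described above can be sketched in a few lines. This is not the authors' code: the feature matrix, labels, one-sample-per-subject grouping, and the five-feature budget are placeholder assumptions, but the greedy SFS scored by leave-one-subject-out accuracy follows the same scheme.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, n_feats = 40, 8
X = rng.normal(size=(n_subjects, n_feats))       # stand-in HRV feature matrix
y = (X[:, 0] + 0.4 * X[:, 3] > 0).astype(int)    # labels driven by features 0 and 3
groups = np.arange(n_subjects)                   # one sample per subject

logo = LeaveOneGroupOut()

def loso_accuracy(feat_idx):
    """Mean leave-one-subject-out accuracy of a linear SVM on the chosen features."""
    scores = cross_val_score(SVC(kernel="linear"), X[:, feat_idx], y,
                             groups=groups, cv=logo)
    return scores.mean()

# Greedy SFS: at each round, add whichever remaining feature raises LOSO accuracy most
selected, remaining = [], list(range(n_feats))
for _ in range(5):
    best = max(remaining, key=lambda f: loso_accuracy(selected + [f]))
    selected.append(best)
    remaining.remove(best)

print("selected features:", selected)
print("LOSO accuracy:", round(loso_accuracy(selected), 3))
```

Because each held-out fold is an entire subject, the score estimates how the model generalizes to unseen patients rather than to unseen beats of already-seen patients.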

2020 ◽  
Vol 13 (12) ◽  
pp. 3873-3894
Author(s):  
Sina Shokoohyar ◽  
Ahmad Sobhani ◽  
Anae Sobhani

Purpose: The short-term rental option enabled by accommodation-sharing platforms is an attractive alternative to conventional long-term rental. The purpose of this study is to compare rental strategies (short-term vs long-term) and explore the main determinants of strategy selection.
Design/methodology/approach: Using logistic regression, this study predicts the rental strategy with the highest rate of return for a given property in the City of Philadelphia. The modeling result is then compared with applied machine learning methods, including random forest, k-nearest neighbor, support vector machine, naïve Bayes and neural networks. The best model is selected based on performance metrics that determine the predictive strength of the underlying models.
Findings: Analyzing 2,163 properties, the results show that properties with more bedrooms, closer to historic attractions, and in neighborhoods with lower minority rates and a higher nightlife vibe are more likely to yield a higher return when rented out under a short-term contract. Additionally, property location is found to have a significant impact on the selection of the rental strategy, which underlines the well-known adage "location, location, location" in the real estate market.
Originality/value: The findings contribute to the literature by identifying the neighborhood and property characteristics that make a property more suitable for short-term rather than long-term rental. This contribution is important because it helps differentiate short-term from long-term rentals and supports a better understanding of the supply side of the sharing-economy-based accommodation market.
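A minimal sketch of the logistic-regression step on synthetic stand-in data follows. The feature names (bedrooms, distance to historic district, nightlife score) mirror the determinants the study reports, but the generating rule and coefficients below are our illustrative assumptions, not the Philadelphia dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
bedrooms = rng.integers(1, 5, n)                 # hypothetical property features
dist_historic_km = rng.uniform(0, 10, n)
nightlife_score = rng.uniform(0, 1, n)

# Label 1 = short-term rental yields the higher return (synthetic rule + noise)
logits = 0.6 * bedrooms - 0.4 * dist_historic_km + 2.0 * nightlife_score - 1.0
y = (logits + rng.normal(0, 1, n) > 0).astype(int)
X = np.column_stack([bedrooms, dist_historic_km, nightlife_score])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"test AUC: {auc:.3f}")
```

The fitted coefficients' signs would then indicate, as in the paper's findings, which characteristics push a property toward the short-term strategy.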


2016 ◽  
Vol 24 (2) ◽  
pp. 361-370 ◽  
Author(s):  
Edward Choi ◽  
Andy Schuetz ◽  
Walter F Stewart ◽  
Jimeng Sun

Objective: We explored whether use of deep learning to model temporal relations among events in electronic health records (EHRs) would improve model performance in predicting initial diagnosis of heart failure (HF) compared to conventional methods that ignore temporality.
Materials and Methods: Data were from a health system's EHR on 3884 incident HF cases and 28 903 controls, identified as primary care patients, between May 16, 2000, and May 23, 2013. Recurrent neural network (RNN) models using gated recurrent units (GRUs) were adapted to detect relations among time-stamped events (e.g., disease diagnoses, medication orders, and procedure orders) within a 12- to 18-month observation window of cases and controls. Model performance metrics were compared to regularized logistic regression, neural network, support vector machine, and K-nearest neighbor classifier approaches.
Results: Using a 12-month observation window, the area under the curve (AUC) for the RNN model was 0.777, compared to AUCs for logistic regression (0.747), multilayer perceptron (MLP) with 1 hidden layer (0.765), support vector machine (SVM) (0.743), and K-nearest neighbor (KNN) (0.730). When using an 18-month observation window, the AUC for the RNN model increased to 0.883 and was significantly higher than the 0.834 AUC for the best of the baseline methods (MLP).
Conclusion: Deep learning models adapted to leverage temporal relations appear to improve performance of models for detection of incident heart failure with a short observation window of 12–18 months.
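The gating mechanism that lets a GRU carry information across time-stamped events can be illustrated with a tiny NumPy cell. The weights are untrained and the event codes are hypothetical; this shows only the update mechanics, not the authors' network.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

class GRUCell:
    """Single GRU step: gates decide how much past history to keep."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        w = lambda r, c: rng.uniform(-0.1, 0.1, (r, c))
        self.Wz, self.Uz = w(n_hidden, n_in), w(n_hidden, n_hidden)  # update gate
        self.Wr, self.Ur = w(n_hidden, n_in), w(n_hidden, n_hidden)  # reset gate
        self.Wh, self.Uh = w(n_hidden, n_in), w(n_hidden, n_hidden)  # candidate state

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)            # how much to update
        r = sigmoid(self.Wr @ x + self.Ur @ h)            # how much past to reuse
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1 - z) * h + z * h_cand                   # blend old and new state

# A patient's visit history as a sequence of one-hot event codes
n_codes, n_hidden = 10, 4
events = [2, 5, 2, 7]            # hypothetical diagnosis/medication code indices
cell = GRUCell(n_codes, n_hidden)
h = np.zeros(n_hidden)
for e in events:
    x = np.zeros(n_codes)
    x[e] = 1.0
    h = cell.step(x, h)
print(h)  # final state summarizes the ordered event history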


Author(s):  
Marzieh Masoumi ◽  
Ahmad Keshavarz

Nowadays, the rapid development and use of digital devices such as smartphones have exposed people to internet crime. Evidence of a crime contained in a computer file can easily be made unreachable by changing the file extension or by other means. In more complex cases, either the file is split into fragments or the parts of the file that carry file-type information are deleted, which gives rise to the file fragment recognition problem. Known files are divided into fragments, and different classification algorithms are applied to recognize the fragment type; a confusion matrix measures the recognition accuracy. In the present study, the file is first divided into fragments. Then the fragment features, obtained from the Binary Frequency Distribution (BFD), are reduced by two feature-reduction algorithms, the Sequential Forward Selection algorithm (SFS) and the Sequential Floating Forward Selection algorithm (SFFS), to delete sparse features and thereby increase accuracy and speed. Finally, the reduced features are given to three classifiers, Multilayer Perceptron (MLP), Support Vector Machine (SVM), and K-Nearest Neighbor (KNN), for classification and comparison of the results. The proposed file-type recognition algorithm can recognize six common file types (pdf, txt, jpg, doc, html, exe) and distinguishes file fragments with higher accuracy than similar previous work.
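A toy version of the BFD-plus-classifier idea fits in a few lines: compute a 256-bin byte histogram per fragment and classify with 1-nearest-neighbor. The two training "types" below are stand-ins of our own making, not the paper's six file types, and no feature reduction is applied.

```python
import numpy as np

def bfd(fragment: bytes) -> np.ndarray:
    """Normalized 256-bin byte frequency distribution of a file fragment."""
    counts = np.bincount(np.frombuffer(fragment, dtype=np.uint8), minlength=256)
    return counts / max(len(fragment), 1)

# Toy fragments standing in for known file types (illustrative only)
train = {
    "txt": b"plain ascii text content " * 20,   # printable bytes only
    "bin": bytes(range(256)) * 2,               # uniform byte spread
}
X_train = np.array([bfd(f) for f in train.values()])
labels = list(train)

def predict(fragment: bytes) -> str:
    # 1-nearest-neighbor on Euclidean distance between BFD vectors
    d = np.linalg.norm(X_train - bfd(fragment), axis=1)
    return labels[int(np.argmin(d))]

print(predict(b"some other ascii text"))  # -> 'txt'
```

Text-like fragments concentrate their mass in the printable-ASCII bins, while compressed or executable fragments spread it almost uniformly, which is why even this crude distance separates them.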


In this study, a computer-aided detection (CADe) system is optimized to reduce radiologists' workload and to improve the accuracy of cancer detection by adding quantitative (objective) decisions to the qualitative (subjective) assessment of radiologists. The images were collected from the MIAS database. Three datasets were prepared with three different ROI sizes (32×32, 42×42, and 52×52 pixels). Preprocessing was then performed to enhance the periphery of the ROIs. The CADe system computed parametric features from the ROIs using statistical, histogram, GLCM, and wavelet techniques. The Sequential Forward Selection (SFS) technique was used to study the significance of the features and ultimately to omit redundancies. Several types of K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) classifiers were trained to differentiate between normal and abnormal ROIs and then tested on a separate, non-training set. The best overall performance was obtained with an ROI size of 32×32 and a histogram of 32 levels (accuracy = 97.37%, sensitivity = 95%, specificity = 100%, PPV = 100%, and NPV = 94.74%). The results also indicate that some features represent abnormalities well across different classifiers, such as the mean, STD, square of STD, mode, median, quantiles (10%, 70%, 90%), and the 30% percentile, throughout multiple histogram levels in both the spatial and DWT domains.
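The GLCM texture step can be sketched with a hand-rolled co-occurrence matrix. This is a simplification of the paper's pipeline: only the horizontal one-pixel offset is counted, the 8-level quantization is our choice, and the three Haralick-style features are standard definitions rather than the exact feature set used.

```python
import numpy as np

def glcm(img, levels=8):
    """Normalized grey-level co-occurrence matrix for horizontally adjacent pixels."""
    q = np.minimum(img.astype(int) * levels // 256, levels - 1)  # quantize 0-255
    m = np.zeros((levels, levels))
    np.add.at(m, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)       # count pixel pairs
    return m / m.sum()

def texture_features(img):
    p = glcm(img)
    i, j = np.indices(p.shape)
    return {
        "contrast": float(((i - j) ** 2 * p).sum()),        # local intensity variation
        "energy": float((p ** 2).sum()),                    # uniformity of the matrix
        "homogeneity": float((p / (1 + np.abs(i - j))).sum()),
    }

flat_roi = np.full((32, 32), 100, dtype=np.uint8)  # stand-in for a uniform ROI
print(texture_features(flat_roi))                  # contrast 0, energy 1 for a flat patch
```

A perfectly flat ROI puts all co-occurrence mass in one diagonal cell (contrast 0, energy 1), while mass-like texture spreads the matrix off the diagonal, which is what the classifier exploits.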


2020 ◽  
Vol 2 (1) ◽  
pp. 85
Author(s):  
Dilana Hazer-Rau ◽  
Lin Zhang ◽  
Harald C. Traue

Affective computing and stress recognition from biosignals have high potential in various medical applications such as early intervention, stress management and risk prevention, as well as monitoring individuals' mental health. This paper presents an automated processing workflow for the psychophysiological recognition of emotion and stress states. Our proposed workflow allows the processing of biosignals in their raw state as obtained from wearable sensors. It consists of five stages: (1) Biosignal Preprocessing—raw data conversion and physiological data triggering, relevant information selection, artifact and noise filtering; (2) Feature Extraction—using different mathematical groups including amplitude, frequency, linearity, stationarity, entropy and variability, as well as cardiovascular-specific characteristics; (3) Feature Selection—dimension reduction and computation optimization using Forward Selection, Backward Elimination and Brute Force methods; (4) Affect Classification—machine learning using Support Vector Machine, Random Forest and k-Nearest Neighbor algorithms; (5) Model Validation—performance metric computation using k-fold Cross-Validation, Leave-One-Subject-Out and Split Validation. All workflow stages are integrated into embedded functions and operators, allowing automated execution of the recognition process. The next steps include further development of the algorithms and integration of the developed tools into an easy-to-use system, thereby satisfying the needs of medical and psychological staff. Our automated workflow was evaluated using our uulmMAC database, previously developed for affective computing and machine learning applications in human–computer interaction.
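A toy version of stage (2) can compute one representative feature from four of the listed groups on a synthetic signal. The sine stand-in, sampling rate, and function name are our assumptions, not the workflow's actual operators.

```python
import numpy as np

def extract_features(signal, fs):
    """One toy feature each for the amplitude, frequency, entropy and variability groups."""
    feats = {}
    feats["amp_mean"] = float(np.mean(np.abs(signal)))              # amplitude
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    feats["dom_freq"] = float(freqs[np.argmax(spectrum)])           # dominant frequency
    hist, _ = np.histogram(signal, bins=16)
    p = hist / hist.sum()
    p = p[p > 0]
    feats["entropy"] = float(-(p * np.log2(p)).sum())               # Shannon entropy
    feats["rmssd"] = float(np.sqrt(np.mean(np.diff(signal) ** 2)))  # RMSSD-style variability
    return feats

fs = 100.0
t = np.arange(0, 10, 1 / fs)
ecg_like = np.sin(2 * np.pi * 1.2 * t)  # 1.2 Hz, a dummy stand-in for a 72 bpm rhythm
print(extract_features(ecg_like, fs))
```

In the real workflow, vectors of such features would then flow into stage (3) for selection and stage (4) for classification.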

