Detection of Atrial Fibrillation Using a Machine Learning Approach

Information ◽  
2020 ◽  
Vol 11 (12) ◽  
pp. 549
Author(s):  
Sidrah Liaqat ◽  
Kia Dashtipour ◽  
Adnan Zahid ◽  
Khaled Assaleh ◽  
Kamran Arshad ◽  
...  

Atrial fibrillation (AF) is one of the most common cardiac arrhythmias in clinical practice, with a prevalence of 1–2% in the community, and it increases the risk of stroke and myocardial infarction. Detecting AF from the electrocardiogram (ECG) can improve early diagnosis. In this paper, we develop a framework for processing the ECG signal to identify AF episodes, implementing both machine learning and deep learning algorithms to detect AF. Initial experimental results show that deep learning algorithms such as long short-term memory (LSTM) and convolutional neural networks (CNN) outperform machine learning classifiers such as support vector machines and logistic regression by about 10%, with LSTM achieving the best performance overall. This preliminary work can help clinicians detect AF with high accuracy and a lower probability of error, which can ultimately reduce the fatality rate.
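The LSTM's gating mechanism is what lets it track long-range structure in a signal such as an ECG. As a minimal sketch of the idea only (not the paper's actual model), a single LSTM step can be written in plain NumPy; the random toy input standing in for an ECG segment is invented for illustration:

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step: x is the input vector, h_prev/c_prev the previous
    hidden and cell states; W, U, b hold the four gates' stacked parameters."""
    z = W @ x + U @ h_prev + b          # stacked pre-activations (4 * hidden)
    n = h_prev.shape[0]
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    i = sig(z[:n])                      # input gate
    f = sig(z[n:2 * n])                 # forget gate
    o = sig(z[2 * n:3 * n])             # output gate
    g = np.tanh(z[3 * n:])              # candidate cell update
    c = f * c_prev + i * g              # new cell state
    h = o * np.tanh(c)                  # new hidden state
    return h, c

# Run a toy 3-step "ECG segment" through one randomly initialized LSTM cell.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for x in rng.normal(size=(3, n_in)):
    h, c = lstm_step(x, h, c, W, U, b)
```

In a real detector the final hidden state would feed a small classification head that outputs an AF/non-AF probability.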

10.6036/10007 ◽  
2021 ◽  
Vol 96 (5) ◽  
pp. 528-533
Author(s):  
XAVIER LARRIVA NOVO ◽  
MARIO VEGA BARBAS ◽  
VICTOR VILLAGRA ◽  
JULIO BERROCAL

Cybersecurity has stood out in recent years with the aim of protecting information systems. Attackers use a variety of methods, techniques, and tools to exploit the vulnerabilities in these systems, so it is essential to develop and improve new technologies, including intrusion detection systems that can detect possible threats. However, using these technologies requires highly qualified cybersecurity personnel to analyze the results and weed out the large number of false positives they produce. This creates the need to research and develop new high-performance cybersecurity systems that allow efficient analysis and resolution of these results. This research applies machine learning techniques to classify real traffic in order to identify possible attacks. The study was carried out using machine learning tools and deep learning algorithms, namely the multi-layer perceptron and long short-term memory (LSTM). Additionally, this document compares the results obtained with these algorithms against non-deep-learning algorithms, namely random forest and decision tree. Finally, the results show that the long short-term memory algorithm provides the best results in terms of precision and logarithmic loss.
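Logarithmic loss, one of the two metrics the classifiers are ranked on here, penalizes confident wrong predictions heavily. A minimal NumPy sketch with illustrative labels and probabilities (not the study's data):

```python
import numpy as np

def log_loss(y_true, p_pred, eps=1e-15):
    """Binary cross-entropy (logarithmic loss): the mean negative
    log-probability assigned to the true class."""
    p = np.clip(p_pred, eps, 1 - eps)   # avoid log(0)
    return float(-np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))

# A confident, mostly-correct classifier scores a low loss ...
good = log_loss(np.array([1, 0, 1, 1]), np.array([0.9, 0.1, 0.8, 0.7]))
# ... while uninformative coin-flip probabilities score ln(2).
chance = log_loss(np.array([1, 0, 1, 1]), np.full(4, 0.5))
```

Lower is better, which is why a model can lead on log loss even when precision differences between classifiers are small.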


Author(s):  
Dyapa Sravan Reddy ◽  
Lakshmi Prasanna Reddy ◽  
Kandibanda Sai Santhosh ◽  
Virrat Devaser

SEO analysts spend a lot of time finding relevant tags for their articles, and in some cases they are unaware of the content's topics. The proposed ML model recommends content-related tags, giving content writers and SEO analysts an overview of the content and minimizing the time they spend on unfamiliar articles. Machine learning algorithms have a plethora of real-life applications. Using algorithms such as One-vs-Rest (OvR) and Long Short-Term Memory (LSTM), this study analyzes how machine learning can be used to suggest tags for a topic. The model trained with One-vs-Rest delivered more accurate results than the others. This study shows how One-vs-Rest can be used to suggest the tags needed to promote a website; further studies are required to suggest keywords.
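One-vs-Rest reduces multi-tag classification to one binary problem per tag and picks the tag whose binary model is most confident. A minimal NumPy sketch of the idea on an invented three-tag toy dataset (this is an illustration of the scheme, not the study's implementation):

```python
import numpy as np

class OneVsRest:
    """One-vs-rest over binary logistic models trained by gradient descent."""

    @staticmethod
    def _aug(X):
        return np.hstack([X, np.ones((len(X), 1))])  # append a bias column

    def fit(self, X, y, epochs=500, lr=0.5):
        Xb = self._aug(X)
        self.classes_ = np.unique(y)
        self.W = np.zeros((len(self.classes_), Xb.shape[1]))
        for k, cls in enumerate(self.classes_):
            t = (y == cls).astype(float)         # this tag vs. all the rest
            w = np.zeros(Xb.shape[1])
            for _ in range(epochs):              # full-batch gradient descent
                p = 1 / (1 + np.exp(-Xb @ w))
                w -= lr * Xb.T @ (p - t) / len(t)
            self.W[k] = w
        return self

    def predict(self, X):
        # The tag whose binary model scores highest wins.
        return self.classes_[np.argmax(self._aug(X) @ self.W.T, axis=1)]

# Three well-separated "tag" clusters in 2-D (toy stand-in for text features).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.1, size=(20, 2)) for c in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 20)
model = OneVsRest().fit(X, y)
acc = float(np.mean(model.predict(X) == y))
```

In the tag-suggestion setting each "class" is a candidate tag, and the per-tag scores can also be thresholded to suggest several tags per article.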


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-18
Author(s):  
Hasan Alkahtani ◽  
Theyazn H. H. Aldhyani

Smart grids, which rely on advanced information technology, have become favored intrusion targets because the Internet of Things (IoT) uses sensor devices to collect data from the smart grid environment. These data are sent to the cloud, a huge network of servers that provides services to smart infrastructures such as smart homes and smart buildings, and this creates a large attack surface for destructive cyberattacks. The novelty of this research is the development of a robust framework for detecting intrusions in an IoT environment. The IoTID20 attack dataset, newly generated from an IoT infrastructure, was employed to develop the proposed system. In this framework, three deep learning algorithms were applied to classify intrusions: a convolutional neural network (CNN), a long short-term memory (LSTM) network, and a hybrid CNN-LSTM model. To reduce the dimensionality of the network dataset and improve the proposed system, particle swarm optimization (PSO) was used to select relevant features, which were then processed by the deep learning algorithms. The experimental results showed that the proposed systems achieved the following accuracies: CNN = 96.60%, LSTM = 99.82%, and CNN-LSTM = 98.80%. The proposed framework attained the desired performance on a new variable dataset, and the system will be implemented in our university's IoT environment. Comparative predictions between the proposed framework and existing systems showed that the proposed system enhances the security of the IoT environment against attacks more efficiently and effectively. The experimental results confirmed that the proposed deep-learning-based intrusion detection framework can detect real-world attacks and enhance the security of the IoT environment.
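Binary PSO for feature selection treats each particle as a candidate feature mask and rewards masks whose features carry label signal. The following NumPy sketch shows the idea on invented synthetic "traffic" data; the fitness function, constants, and data are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "network traffic": features 0 and 1 carry the label signal,
# the remaining four are pure noise.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def fitness(mask):
    """Sum of |correlation with the label| over selected features,
    minus a small per-feature penalty so smaller subsets win ties."""
    if mask.sum() == 0:
        return -1.0
    corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in np.flatnonzero(mask)]
    return float(sum(corr)) - 0.05 * int(mask.sum())

# Binary PSO: positions in [0, 1] per feature, thresholded at 0.5 into masks.
n_particles, n_feat, iters = 12, 6, 40
pos = rng.random((n_particles, n_feat))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([fitness(p > 0.5) for p in pos])
gbest = pbest[np.argmax(pbest_fit)].copy()
for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    fit = np.array([fitness(p > 0.5) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmax(pbest_fit)].copy()

selected = np.flatnonzero(gbest > 0.5)
```

In the paper's pipeline the selected feature columns, rather than the full dataset, would then be fed to the CNN, LSTM, and CNN-LSTM classifiers.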


2019 ◽  
Vol 19 (01) ◽  
pp. 1940005 ◽  
Author(s):  
ULAS BARAN BALOGLU ◽  
ÖZAL YILDIRIM

Background and objective: Deep learning structures have recently achieved remarkable success in the field of machine learning. Convolutional neural networks (CNN) in image processing and long short-term memory (LSTM) in time-series analysis are commonly used deep learning algorithms. Healthcare applications of deep learning make important contributions to computer-aided diagnosis research. In this study, a convolutional long short-term memory (CLSTM) network was used for automatic classification of EEG signals and automatic seizure detection. Methods: A new nine-layer deep network model consisting of convolutional and LSTM layers was designed. The signals processed in the convolutional layers were fed into the LSTM network, whose outputs were processed in densely connected neural network layers. EEG data are well suited to a model with 1-D convolution layers. A bidirectional model was employed in the LSTM layer. Results: The Bonn University EEG database, with five different datasets, was used for the experimental studies. In this database, each dataset contains 100 single-channel EEG segments of 23.6 s duration, each consisting of 4097 samples (sampled at 173.61 Hz). Eight two-class and three three-class clinical scenarios were examined. The experimental results show that the proposed model achieves high accuracy on both binary and ternary classification tasks. Conclusions: The proposed end-to-end learning structure performed well without any hand-crafted feature extraction or shallow classifiers to detect seizures. The model does not require filtering and automatically learns to filter its input. As a result, the proposed model can process long-duration EEG signals without segmentation and can detect epileptic seizures automatically by using the correlation of ictal and interictal signals in the raw data.
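The first convolutional layers act as learned filters sliding over the raw single-channel EEG. A minimal NumPy sketch of a valid 1-D convolution, using the 4097-sample segment length of the Bonn data but random signal and kernels (a sketch of the operation, not the nine-layer model):

```python
import numpy as np

def conv1d(signal, kernels, stride=1):
    """Valid 1-D convolution of a single-channel signal with a bank of
    kernels: one output row per filter, one column per window position."""
    k = kernels.shape[1]
    steps = (len(signal) - k) // stride + 1
    out = np.empty((kernels.shape[0], steps))
    for t in range(steps):
        window = signal[t * stride : t * stride + k]
        out[:, t] = kernels @ window          # one dot product per filter
    return out

# A 4097-sample synthetic "EEG segment" filtered by 8 kernels of width 5.
rng = np.random.default_rng(3)
eeg = rng.normal(size=4097)
kern = rng.normal(size=(8, 5))
feats = conv1d(eeg, kern)
```

The resulting feature map (filters x time) is the kind of sequence the bidirectional LSTM layer then consumes.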


2021 ◽  
Vol 10 (11) ◽  
pp. e33101119347
Author(s):  
Ewethon Dyego de Araujo Batista ◽  
Wellington Candeia de Araújo ◽  
Romeryto Vieira Lira ◽  
Laryssa Izabel de Araujo Batista

Introduction: dengue is an arbovirus disease caused by the DENV virus and transmitted to humans by the Aedes aegypti mosquito. There is currently no vaccine effective against all serotypes of the virus, so the fight against the disease focuses on preventive measures against mosquito proliferation. Researchers are using Machine Learning (ML) and Deep Learning (DL) as tools to predict dengue cases and support governments in this fight. Objective: to identify which ML and DL techniques and approaches are being used to predict dengue. Methods: a systematic review of databases in Medicine and Computing, aimed at answering the research questions: can dengue cases be predicted with ML and DL techniques, which techniques are used, where are the studies being conducted, and how and which data are being used? Results: after running the searches and applying the inclusion, exclusion, and full-reading criteria, 14 articles were selected. Random Forest (RF), Support Vector Regression (SVR), and Long Short-Term Memory (LSTM) appear in 85% of the studies. Regarding the data, most studies used 10 years of historical disease data together with climate information. Finally, Root Mean Squared Error (RMSE) was the preferred error measure. Conclusion: the review showed the feasibility of using ML and DL techniques to predict dengue cases, with low error rates validated by statistical techniques.
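RMSE, the error measure the reviewed forecasting studies prefer, is straightforward to compute. A minimal NumPy sketch with invented weekly case counts:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error between observed and forecast values."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Toy weekly dengue case counts vs. a model's forecasts (illustrative numbers).
err = rmse([120, 95, 130, 80], [110, 100, 125, 90])
```

Because the errors are squared before averaging, RMSE penalizes large forecasting misses (e.g. a missed outbreak peak) more than the same total error spread over many weeks.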


2021 ◽  
Vol 11 (1) ◽  
pp. 61-67
Author(s):  
Watthana Pongsena ◽  
◽  
Prakaidoy Sitsayabut ◽  
Nittaya Kerdprasop ◽  
Kittisak Kerdprasop ◽  
...  

Forex is the largest financial market in the world. Traditionally, Forex traders have relied on fundamental and technical analysis. Nowadays, advanced computational technology and Artificial Intelligence (AI) play a significant role in the financial domain, and applications based on AI technologies, particularly machine learning and deep learning, are constantly being developed. Historical Forex data are time series in which past values affect future ones, and work in other application domains has shown that Long Short-Term Memory (LSTM), a kind of deep learning well suited to modeling time series, outperforms traditional machine learning algorithms. In this paper, we develop a predictive model that uses LSTM to predict the daily price changes of currency pairs in the Forex market. We also conduct an extensive experiment to demonstrate the effect of various factors on the model's performance. The experimental results show that the optimized LSTM model predicts the direction of the future price correctly up to 61.25 percent of the time.
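The 61.25 percent figure is a directional accuracy: the share of days on which the model calls the sign of the price move correctly. A minimal NumPy sketch of that metric with illustrative prices (not the study's data):

```python
import numpy as np

def directional_accuracy(prices, predicted):
    """Fraction of days on which the predicted series moved in the same
    direction (up vs. not up) as the actual price series."""
    actual_up = np.diff(prices) > 0
    pred_up = np.diff(predicted) > 0
    return float(np.mean(actual_up == pred_up))

# Illustrative daily closes for a currency pair vs. model forecasts.
acc = directional_accuracy([1.10, 1.12, 1.11, 1.13, 1.12],
                           [1.10, 1.13, 1.12, 1.12, 1.11])
```

For trading, direction matters more than the exact level, which is why directional accuracy rather than a magnitude-based error is the headline result.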


2019 ◽  
Vol 26 (11) ◽  
pp. 1247-1254 ◽  
Author(s):  
Michel Oleynik ◽  
Amila Kugic ◽  
Zdenko Kasáč ◽  
Markus Kreuzthaler

Abstract Objective Automated clinical phenotyping is challenging because word-based features quickly turn it into a high-dimensional problem, in which small, privacy-restricted training datasets can lead to overfitting. Pretrained embeddings may solve this issue by reusing input representation schemes trained on a larger dataset. We sought to evaluate shallow and deep learning text classifiers, and the impact of pretrained embeddings, on a small clinical dataset. Materials and Methods We participated in the 2018 National NLP Clinical Challenges (n2c2) Shared Task on cohort selection and received an annotated dataset with medical narratives of 202 patients for multilabel binary text classification. We set our baseline to a majority classifier, against which we compared a rule-based classifier and orthogonal machine learning strategies: support vector machines, logistic regression, and long short-term memory neural networks. We evaluated logistic regression and long short-term memory using both self-trained and pretrained BioWordVec word embeddings as input representation schemes. Results The rule-based classifier showed the highest overall micro F1 score (0.9100), with which we finished first in the challenge. Shallow machine learning strategies showed lower overall micro F1 scores, but still higher than the deep learning strategies and the baseline. We could not show a difference in classification efficiency between self-trained and pretrained embeddings. Discussion Clinical context, negation, and value-based criteria hindered the shallow machine learning approaches, while the deep learning strategies could not capture term diversity due to the small training dataset. Conclusion Shallow methods for clinical phenotyping can still outperform deep learning methods on small imbalanced data, even when supported by pretrained embeddings.
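Micro F1, the score the challenge was ranked on, pools true positives and errors across all labels before computing F1 (so frequent labels weigh more than rare ones). A minimal NumPy sketch on a toy multilabel matrix:

```python
import numpy as np

def micro_f1(y_true, y_pred):
    """Micro-averaged F1 over a multilabel binary matrix: sum TP/FP/FN
    across every (sample, label) cell, then compute a single F1."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return float(2 * tp / (2 * tp + fp + fn))

# Three patients, three cohort-selection criteria (toy labels).
truth = [[1, 0, 1], [0, 1, 1], [1, 1, 0]]
preds = [[1, 0, 1], [0, 0, 1], [1, 1, 1]]
score = micro_f1(truth, preds)
```

Here one missed criterion (a false negative) and one spurious one (a false positive) against five correct positives give a micro F1 of 10/12.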


2021 ◽  
Vol 7 (2) ◽  
pp. 113-121
Author(s):  
Firman Pradana Rachman

Everyone has an opinion about a product, a public figure, or a government policy, and these opinions spread across social media. Processing such opinion data is called sentiment analysis. For processing large volumes of opinion data, machine learning alone is not sufficient; deep learning combined with NLP (Natural Language Processing) techniques can also be used. This study compares several deep learning models, such as CNN (Convolutional Neural Network), RNN (Recurrent Neural Network), LSTM (Long Short-Term Memory), and several of their variants, for sentiment analysis of Amazon and Yelp product reviews.
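All of the compared model families consume the same kind of input: each review mapped to a fixed-length sequence of word ids. A minimal pure-Python sketch of that preprocessing step, with invented reviews (the vocabulary scheme is illustrative, not the study's):

```python
def texts_to_padded_ids(texts, max_len=6):
    """Build a vocabulary from the corpus and turn each review into a
    fixed-length list of word ids, with 0 reserved for padding."""
    vocab = {}
    ids = []
    for text in texts:
        row = []
        for word in text.lower().split():
            vocab.setdefault(word, len(vocab) + 1)   # ids start at 1
            row.append(vocab[word])
        ids.append((row + [0] * max_len)[:max_len])  # pad or truncate
    return ids, vocab

reviews = ["Great product", "terrible battery life", "great battery"]
ids, vocab = texts_to_padded_ids(reviews)
```

A CNN, RNN, or LSTM model would then look these ids up in an embedding table before processing the sequence.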

