Machine Learning for Modeling the Singular Multi-Pantograph Equations

Entropy ◽  
2020 ◽  
Vol 22 (9) ◽  
pp. 1041
Author(s):  
Amirhosein Mosavi ◽  
Manouchehr Shokri ◽  
Zulkefli Mansor ◽  
Sultan Noman Qasem ◽  
Shahab S. Band ◽  
...  

In this study, a new approach based on intelligent systems and machine learning algorithms is introduced for solving singular multi-pantograph differential equations (SMDEs). For the first time, a type-2 fuzzy logic based approach is formulated to find an approximate solution. The rules of the suggested type-2 fuzzy logic system (T2-FLS) are optimized by the square root cubature Kalman filter (SCKF) such that the proposed fitness function is minimized. Furthermore, the stability and boundedness of the estimation error are proved by a novel approach based on the Lyapunov theorem. The accuracy and robustness of the suggested algorithm are verified by several statistical examinations. It is shown that the suggested method results in an accurate solution with rapid convergence and a lower computational cost.
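For context, a multi-pantograph delay differential equation relates the unknown function to copies of itself evaluated at proportionally scaled arguments. The form below is an illustrative general (non-singular) statement, not taken from the paper; the coefficients a, b_j, q_j and the forcing term f are placeholders.

```latex
% Illustrative general multi-pantograph form (the paper's exact singular
% formulation may differ):
\[
  u'(t) = a(t)\,u(t) + \sum_{j=1}^{m} b_j(t)\,u(q_j t) + f(t),
  \qquad 0 < q_j < 1, \quad u(0) = u_0 .
\]
% The "singular" case typically refers to coefficients that become unbounded
% at t = 0 (e.g. terms divided by t).
```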

2021 ◽  
Vol 13 (3) ◽  
pp. 63
Author(s):  
Maghsoud Morshedi ◽  
Josef Noll

Video conferencing services based on the web real-time communication (WebRTC) protocol are growing in popularity among Internet users as multi-platform solutions enabling interactive communication from anywhere, especially during this pandemic era. Meanwhile, Internet service providers (ISPs) have deployed fiber links and customer premises equipment that operate according to recent 802.11ac/ax standards and promise users the ability to establish uninterrupted video conferencing calls with ultra-high-definition video and audio quality. However, the best-effort nature of 802.11 networks and the high variability of wireless medium conditions hinder users from experiencing uninterrupted high-quality video conferencing. This paper presents a novel approach to estimating the perceived quality of service (PQoS) of video conferencing using only 802.11-specific network performance parameters collected from Wi-Fi access points (APs) on customer premises. This study produced datasets comprising 802.11-specific network performance parameters collected from off-the-shelf Wi-Fi APs operating under the 802.11g/n/ac/ax standards on both the 2.4 and 5 GHz frequency bands to train machine learning algorithms. In this way, we achieved classification accuracies of 92–98% in estimating the level of PQoS of video conferencing services on various Wi-Fi networks. To efficiently troubleshoot wireless issues, we further analyzed the machine learning model to correlate features in the model with the root cause of quality degradation. Thus, ISPs can utilize the approach presented in this study to provide predictable and measurable wireless quality by implementing a non-intrusive quality monitoring approach in the form of edge computing that preserves customers' privacy while reducing the operational costs of monitoring and data analytics.
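A minimal sketch of the general idea (not the authors' code): map per-interval AP statistics to a PQoS class and then inspect feature importances to hint at the root cause of degradation. The feature names and synthetic data below are assumptions for demonstration only.

```python
# Illustrative sketch: classify perceived QoS from 802.11 AP counters.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Hypothetical per-interval AP features: retry rate, RSSI, channel busy %, PHY rate
X = rng.normal(size=(1000, 4))
y = rng.integers(0, 3, size=1000)          # PQoS classes: 0=poor, 1=fair, 2=good

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))

# Feature importances can point at the likely cause of degradation
# (e.g. a dominant retry-rate importance suggests interference).
for name, w in zip(["retry_rate", "rssi", "channel_busy", "phy_rate"],
                   model.feature_importances_):
    print(name, round(w, 3))
```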


2019 ◽  
Author(s):  
Lei Zhang ◽  
Xianwen Shang ◽  
Subhashaan Sreedharan ◽  
Xixi Yan ◽  
Jianbin Liu ◽  
...  

BACKGROUND Previous conventional models for the prediction of diabetes could be updated by incorporating the increasing amount of health data available and new risk prediction methodology. OBJECTIVE We aimed to develop a substantially improved diabetes risk prediction model using sophisticated machine-learning algorithms based on a large retrospective population cohort of over 230,000 people who were enrolled in the study during 2006-2017. METHODS We collected demographic, medical, behavioral, and incidence data for type 2 diabetes mellitus (T2DM) in 236,684 diabetes-free participants recruited from the 45 and Up Study. We predicted and compared the risk of diabetes onset in these participants at 3, 5, 7, and 10 years based on three machine-learning approaches and the conventional regression model. RESULTS Overall, 6.05% (14,313/236,684) of the participants developed T2DM during an average 8.8-year follow-up period. The 10-year diabetes incidence in men was 8.30% (8.08%-8.49%), which was significantly higher (odds ratio 1.37, 95% CI 1.32-1.41) than that in women at 6.20% (6.00%-6.40%). The incidence of T2DM was doubled in individuals with obesity (men: 17.78% [17.05%-18.43%]; women: 14.59% [13.99%-15.17%]) compared with that in nonobese individuals. The gradient boosting machine model showed the best performance among the four models (area under the curve of 79% in 3-year prediction and 75% in 10-year prediction). All machine-learning models identified BMI as the most significant factor contributing to diabetes onset, which explained 12%-50% of the variance in the prediction of diabetes. The models predicted that if BMI in obese and overweight participants could be hypothetically reduced to a healthy range, the 10-year probability of diabetes onset would be significantly reduced from 8.3% to 2.8% (P < .001). CONCLUSIONS A one-time self-reported survey can accurately predict the risk of diabetes using a machine-learning approach. Achieving a healthy BMI can significantly reduce the risk of developing T2DM.
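A minimal sketch of the kind of gradient boosting model described (not the study's code). The features, synthetic outcome, and sample size below are hypothetical placeholders chosen only to make the example run.

```python
# Illustrative sketch: gradient boosting for T2DM onset from survey features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5000
bmi = rng.normal(27, 5, n)
age = rng.normal(60, 9, n)
sex = rng.integers(0, 2, n)                      # 0 = female, 1 = male
X = np.column_stack([bmi, age, sex])
# Synthetic outcome that loosely increases with BMI, for demonstration only
y = (rng.random(n) < 1 / (1 + np.exp(-(bmi - 30) / 3))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)
gbm = GradientBoostingClassifier(random_state=42).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, gbm.predict_proba(X_te)[:, 1]))
print("feature importances (bmi, age, sex):", gbm.feature_importances_)
```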


Author(s):  
Namrata Dhanda ◽  
Stuti Shukla Datta ◽  
Mudrika Dhanda

Human intelligence is deeply involved in creating efficient and faster systems that can work independently. The creation of such smart systems requires efficient training algorithms. Thus, the aim of this chapter is to introduce readers to the concept of machine learning and the learning algorithms commonly employed for developing efficient and intelligent systems. The chapter draws a clear distinction between supervised and unsupervised learning methods. Each algorithm is explained with the help of a suitable example to give insight into the learning process.
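A minimal sketch of the supervised/unsupervised distinction the chapter discusses: a classifier learns from labelled examples, while a clustering algorithm must discover structure without labels. The dataset and algorithms below are generic illustrations, not the chapter's own examples.

```python
# Illustrative sketch: supervised vs. unsupervised learning on the same data.
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = make_blobs(n_samples=300, centers=3, random_state=0)

# Supervised: the labels y are available during training.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("supervised accuracy:", clf.score(X, y))

# Unsupervised: only X is given; KMeans infers three groups on its own.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", [int((clusters == k).sum()) for k in range(3)])
```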


2018 ◽  
Author(s):  
Adam Hakim ◽  
Shira Klorfeld ◽  
Tal Sela ◽  
Doron Friedman ◽  
Maytal Shabat-Simon ◽  
...  

A basic aim of marketing research is to predict consumers' preferences and the success of marketing campaigns in the general population. However, traditional behavioral measurements have various limitations, calling for novel measurements to improve predictive power. In this study, we use neural signals measured with electroencephalography (EEG) in order to overcome these limitations. We recorded the EEG signals of subjects as they watched commercials for six food products. We introduce a novel approach in which, instead of using one type of EEG measure, we combine several measures and use state-of-the-art machine learning algorithms to predict subjects' individual future preferences over the products and the commercials' population success, as measured by their YouTube metrics. As a benchmark, we acquired measurements of the commercials' effectiveness using a standard questionnaire commonly used in marketing research. We reached 68.5% accuracy in predicting between the most and least preferred items and a lower-than-chance RMSE score for predicting the rank-order preferences of all six products. We also predicted the commercials' population success better than chance. Most importantly, we demonstrate for the first time that, for all of our predictions, the EEG measurements increased the predictive power of the questionnaires. Our analysis methods and results show great promise for utilizing EEG measures by managers, marketing practitioners, and researchers as a valuable tool for predicting subjects' preferences and marketing campaigns' success.
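A minimal sketch of the general pipeline (not the authors' code): combine several precomputed EEG-derived measures into one feature vector per subject-commercial pair and train a classifier to separate most- from least-preferred items. All feature names, sample sizes, and data below are hypothetical.

```python
# Illustrative sketch: combining multiple EEG measures to predict preference.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 240                                    # e.g. subjects x commercials
# Hypothetical measures: frontal alpha asymmetry, theta power, inter-subject correlation
X = rng.normal(size=(n, 3))
y = rng.integers(0, 2, size=n)             # 1 = most preferred, 0 = least preferred

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```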


PLoS ONE ◽  
2021 ◽  
Vol 16 (5) ◽  
pp. e0252104
Author(s):  
Saeed Mian Qaisar

Significant losses can occur for various smart grid stakeholders due to Power Quality Disturbances (PQDs). Therefore, it is necessary to correctly recognize and promptly mitigate PQDs. In this context, an emerging trend is the development of machine learning assisted PQD management. Based on conventional processing theory, existing PQD identification is time-invariant. This can result in a huge amount of unnecessary information being collected, processed, and transmitted, and consequently in needless processing activity, power consumption, and latency. In this paper, a novel combination of signal-piloted acquisition, adaptive-rate segmentation, and time-domain feature extraction with machine learning tools is suggested. The signal-piloted acquisition and processing brings real-time compression. Therefore, a remarkable reduction can be secured in the data storage, processing, and transmission requirements toward the post-classifier. Additionally, a reduced computational cost and latency of the classifier are promised. The classification is accomplished using robust machine learning algorithms. A comparison is made among the k-Nearest Neighbor, Naïve Bayes, Artificial Neural Network, and Support Vector Machine classifiers. Multiple metrics are used to assess the success of classification, which avoids any bias in the findings. The applicability of the suggested approach is studied for the automated recognition of the power signal's major voltage and transient disturbances. Results show that the system attains a 6.75-fold reduction in the collected information and the processing load while securing 98.05% classification accuracy.
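A minimal sketch of the classifier comparison stage (not the paper's code): evaluate the four algorithms named in the abstract on time-domain features extracted from power-signal segments. The feature names, class labels, and data below are synthetic placeholders.

```python
# Illustrative sketch: comparing kNN, Naive Bayes, ANN, and SVM on PQD features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
# Hypothetical per-segment time-domain features: RMS, peak value, crest factor, energy
X = rng.normal(size=(600, 4))
y = rng.integers(0, 5, size=600)           # five PQD classes (e.g. sag, swell, transient, ...)

models = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=7),
    "SVM": SVC(kernel="rbf"),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```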


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Jian Jiang ◽  
Fen Zhang

As the planet watches in shock the evolution of the COVID-19 pandemic, new forms of sophisticated, versatile, and extremely difficult-to-detect malware expose society, and especially the global economy, to new risks. Machine learning techniques are playing an increasingly important role in the field of malware identification and analysis. However, due to the complexity of the problem, the training of intelligent systems proves to be insufficient for recognizing advanced cyberthreats. The biggest challenge in information systems security using machine learning methods is to understand the polymorphism and metamorphism mechanisms used by malware developers and how to effectively address them. This work presents an innovative Artificial Evolutionary Fuzzy LSTM Immune System which, by using a heuristic machine learning method that combines evolutionary intelligence, Long Short-Term Memory (LSTM), and fuzzy knowledge, proves able to adequately protect modern information systems from Portable Executable (PE) malware. The main innovation in the technical implementation of the proposed approach is that the machine learning system can be trained using only the raw bytes of an executable file to determine whether the file is malicious. The performance of the proposed system was tested on a sophisticated dataset of high complexity, which emerged after extensive research on PE malware and offered a realistic representation of their operating states. The high accuracy of the developed model significantly supports the validity of the proposed method. The final evaluation was carried out through in-depth comparisons with corresponding machine learning algorithms, which revealed the superiority of the proposed immune system.
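A minimal sketch of only the raw-bytes LSTM component (the evolutionary and fuzzy parts of the proposed immune system are not reproduced here): an LSTM that consumes byte values of an executable and predicts malicious vs. benign. The data, sequence length, and architecture choices are assumptions.

```python
# Illustrative sketch: LSTM classification of executables from raw bytes.
import numpy as np
from tensorflow.keras import layers, models

seq_len = 2048                                     # truncated/padded byte window
rng = np.random.default_rng(3)
X = rng.integers(0, 256, size=(200, seq_len))      # raw byte values, 0..255
y = rng.integers(0, 2, size=200)                   # 1 = malware, 0 = benign (synthetic)

model = models.Sequential([
    layers.Embedding(input_dim=256, output_dim=16),   # learn a vector per byte value
    layers.LSTM(64),                                  # summarize the byte sequence
    layers.Dense(1, activation="sigmoid"),            # malicious probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
model.summary()
```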

