Competency of Neural Networks for the Numerical Treatment of Nonlinear Host-Vector-Predator Model

2021
Vol 2021
pp. 1-13
Author(s):
Zulqurnain Sabir
Muhammad Umar
Ghulam Mujtaba Shah
Hafiz Abdul Wahab
Yolanda Guerrero Sánchez

The aim of this work is to introduce a stochastic solver based on Levenberg-Marquardt backpropagation neural networks (LMBNNs) for the nonlinear host-vector-predator model. The model comprises five classes: susceptible and infected host-plant populations, susceptible and infected vector populations, and the predator population. The numerical performance of the LMBNN solver is assessed for three variants of the nonlinear host-vector-predator model using training, validation, and testing data, with the data split as 80% for training and 10% each for validation and testing. The model is solved numerically with the LMBNNs, and comparative investigations are performed against reference solutions. The results are obtained by minimizing the mean square error (MSE). The competence, exactness, consistency, and efficacy of the LMBNNs are verified through numerical measures including the MSE, error histograms (EHs), and regression/correlation analysis.
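The abstract does not give implementation details, so the following is only a minimal sketch of the workflow it describes: a Levenberg-Marquardt least-squares fit of a tiny one-hidden-layer tanh network to a stand-in reference solution, with an 80/10/10 train/validation/test split and MSE reporting. SciPy's `least_squares(method="lm")` supplies the Levenberg-Marquardt step; the network, target function, and sizes are illustrative assumptions, not the authors' model.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2.0 * np.pi * x)  # stand-in "reference solution" for the sketch

n_hidden = 5  # tiny one-hidden-layer tanh network (assumed size)

def residuals(p, xs, ys):
    # Unpack the flat parameter vector into weights and biases
    w1 = p[:n_hidden]
    b1 = p[n_hidden:2 * n_hidden]
    w2 = p[2 * n_hidden:3 * n_hidden]
    b2 = p[-1]
    h = np.tanh(np.outer(xs, w1) + b1)  # hidden activations, shape (len(xs), n_hidden)
    return h @ w2 + b2 - ys             # pointwise error vector

# 80/10/10 split: training / validation / testing, as in the abstract
idx = rng.permutation(len(x))
tr, va, te = idx[:80], idx[80:90], idx[90:]

p0 = rng.normal(scale=0.5, size=3 * n_hidden + 1)
fit = least_squares(residuals, p0, args=(x[tr], y[tr]), method="lm")  # Levenberg-Marquardt

mse = lambda i: float(np.mean(residuals(fit.x, x[i], y[i]) ** 2))
print(f"train MSE={mse(tr):.2e}  val MSE={mse(va):.2e}  test MSE={mse(te):.2e}")
```

The validation and test MSEs being close to the training MSE is the usual sanity check that such a fit has not overfitted the 80% training portion.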

2021
Vol 11 (1)
Author(s):
Jingwei Liu
Peixuan Li
Xuehan Tang
Jiaxin Li
Jiaming Chen

Artificial neural networks (ANNs), which include deep neural networks (DNNs), suffer from problems such as the local-minimum problem of the backpropagation neural network (BPNN), the instability of the radial basis function neural network (RBFNN), and the limited maximum precision of the convolutional neural network (CNN). The performance (training speed, precision, etc.) of BPNN, RBFNN, and CNN is expected to be improved. The main contributions are as follows. Firstly, building on the existing BPNN and RBFNN, a wavelet neural network (WNN) is implemented to obtain better performance for further improving the CNN: the WNN adopts the network structure of the BPNN for faster training, and uses a wavelet function as the activation function, whose form is similar to the radial basis function of the RBFNN, in order to avoid the local-minimum problem. Secondly, a WNN-based convolutional wavelet neural network (CWNN) is proposed, in which the fully connected layers (FCL) of the CNN are replaced by a WNN. Thirdly, comparative simulations of BPNN, RBFNN, CNN, and CWNN on the MNIST and CIFAR-10 datasets are carried out and analyzed. Fourthly, the wavelet-based convolutional neural network (WCNN) is proposed, where the wavelet transformation is adopted as the activation function in the convolutional pool neural network (CPNN) of the CNN. Fifthly, simulations of the WCNN are carried out and analyzed on the MNIST dataset. The effects are as follows. Firstly, the WNN overcomes the problems of the BPNN and RBFNN and achieves better performance. Secondly, the proposed CWNN reduces the mean square error and the error rate of the CNN, i.e., the CWNN attains better maximum precision than the CNN. Thirdly, the proposed WCNN reduces the mean square error and the error rate of the CWNN, i.e., the WCNN attains better maximum precision than the CWNN.
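As a concrete illustration of a wavelet activation function, a forward pass through a small WNN-style layer might look like the sketch below. The specific wavelet (the Ricker/"Mexican hat" wavelet) and the layer sizes are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def mexican_hat(x):
    # Ricker ("Mexican hat") wavelet, one common choice of wavelet activation
    return (1.0 - x**2) * np.exp(-(x**2) / 2.0)

# Forward pass of a tiny one-hidden-layer WNN-style block (illustrative shapes)
rng = np.random.default_rng(1)
X = rng.normal(size=(4, 8))           # batch of 4 inputs, 8 features each
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)

H = mexican_hat(X @ W1 + b1)          # wavelet activation instead of sigmoid/ReLU
out = H @ W2 + b2                     # linear output layer
print(out.shape)                      # (4, 3)
```

Like the radial basis function, the wavelet is localized (it decays to zero away from the origin), which is the structural similarity the abstract points to.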


Mathematics
2020
Vol 8 (5)
pp. 815
Author(s):
Usa Humphries
Grienggrai Rajchakit
Pramet Kaewmesri
Pharunyou Chanthorn
Ramalingam Sriraman
...  

In this paper, we study the mean-square exponential input-to-state stability (exp-ISS) problem for a new class of neural network (NN) models, namely continuous-time stochastic memristive quaternion-valued neural networks (SMQVNNs) with time delays. Firstly, to overcome the difficulties posed by non-commutative quaternion multiplication, we decompose the original SMQVNNs into four real-valued models. Secondly, by constructing a suitable Lyapunov functional and applying Itô's formula, Dynkin's formula, and inequality techniques, we prove that the considered system model is mean-square exp-ISS. In comparison with conventional stability research, we derive a new mean-square exp-ISS criterion for SMQVNNs. The results obtained in this paper generalize previously known results in the complex and real fields. Finally, a numerical example is provided to show the effectiveness of the theoretical results.
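The non-commutativity that forces the decomposition into four real-valued models can be seen directly from the Hamilton product of quaternions. A small numerical check (illustrative values, not from the paper):

```python
import numpy as np

def hamilton(p, q):
    # Hamilton product of quaternions stored as (a, b, c, d) = a + bi + cj + dk
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return np.array([
        a1*a2 - b1*b2 - c1*c2 - d1*d2,   # real part
        a1*b2 + b1*a2 + c1*d2 - d1*c2,   # i component
        a1*c2 - b1*d2 + c1*a2 + d1*b2,   # j component
        a1*d2 + b1*c2 - c1*b2 + d1*a2,   # k component
    ])

p = np.array([1.0, 2.0, 3.0, 4.0])
q = np.array([0.5, -1.0, 2.0, 0.0])
print(hamilton(p, q))
print(hamilton(q, p))   # differs: quaternion multiplication is non-commutative
```

Because `pq != qp` in general, the scalar manipulations used in real- and complex-valued Lyapunov arguments do not carry over directly, which is why each quaternion state is split into its four real components.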


2016
Vol 2016
pp. 1-19
Author(s):
Chuangxia Huang
Jie Cao
Peng Wang

We address the problems of the stochastic attractor and boundedness for a class of switched Cohen-Grossberg neural networks (CGNNs) with discrete and infinitely distributed delays. With the help of stochastic analysis techniques, the Lyapunov-Krasovskii functional method, the linear matrix inequality (LMI) technique, and the average dwell time (ADT) approach, some novel sufficient conditions are established for mean-square uniform ultimate boundedness, the existence of a stochastic attractor, and the mean-square exponential stability of the switched Cohen-Grossberg neural networks. Finally, illustrative examples and their simulations demonstrate the effectiveness of the proposed results.
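For readers unfamiliar with the average dwell time (ADT) approach, its core condition can be stated in one line: switching must be slow enough on average. A minimal numerical sketch with illustrative constants (not the paper's values):

```python
import math

# ADT condition sketch: switching is "slow enough" when the average dwell time
# tau_a exceeds ln(mu) / lam, where mu >= 1 bounds the possible jump in the
# Lyapunov function value at switching instants and lam > 0 is the per-mode
# exponential decay rate. Values below are illustrative assumptions.
mu, lam = 2.0, 0.5
tau_a_min = math.log(mu) / lam
print(f"minimum average dwell time: {tau_a_min:.3f}")
```

Intuitively, each switch can inflate the Lyapunov function by at most a factor mu, so stability survives as long as the decay accumulated between switches (rate lam over time tau_a) outweighs that inflation.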


2013
Vol 760-762
pp. 1742-1747
Author(s):  
Jin Fang Han

This paper is concerned with the mean-square exponential stability analysis problem for a class of stochastic interval cellular neural networks with time-varying delay. Using the stochastic analysis approach and employing Lyapunov functions and norm inequalities, several mean-square exponential stability criteria are established in terms of Itô's formula and the Razumikhin theorem to guarantee that the stochastic interval delayed cellular neural networks are mean-square exponentially stable. Some recent results reported in the literature are generalized. An equivalent description of these stochastic interval cellular neural networks with time-varying delay is also given.


2012
Vol 2012
pp. 1-20
Author(s):
Ting Wang
Tao Li
Mingxiang Xue
Shumin Fei

Combining the Lyapunov-Krasovskii functional approach with an improved delay-partitioning idea, a novel sufficient condition is derived to guarantee that a class of delayed neural networks is asymptotically stable in the mean-square sense, in which the probabilistic variable delay and both limits of the delay variation can be measured. By combining the reciprocal convex technique with the convex technique, the criterion is presented via LMIs, and its solvability depends heavily on the sizes of both the time-delay range and its variation; it can become much less conservative than existing criteria by thinning the delay intervals. Finally, four numerical examples demonstrate that our idea reduces conservatism more effectively than some earlier reported ones.
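LMI-based criteria like the one above are ultimately feasibility checks: the condition holds if a symmetric matrix assembled from the system data is negative definite. A minimal numerical sketch of such a check (the matrix here is illustrative, not the paper's LMI):

```python
import numpy as np

# A 2x2 symmetric matrix standing in for the LMI condition "M < 0"
# (illustrative entries; real criteria assemble M from system matrices,
# delay bounds, and Lyapunov-Krasovskii decision variables).
M = np.array([[-2.0, 0.5],
              [ 0.5, -1.0]])

eigs = np.linalg.eigvalsh(M)                   # eigenvalues of a symmetric matrix
feasible = bool(np.all(eigs < 0.0))            # negative definite <=> all eigs < 0
print("feasible (M < 0):", feasible)
```

In practice such conditions are solved with dedicated semidefinite-programming tools rather than a direct eigenvalue check, but the eigenvalue test conveys what "solvability" of the criterion means.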


Author(s):  
Marina Ermolickaya

Using RStudio, a neural network model has been developed that predicts positive dynamics in the treatment of tuberculosis patients in a tuberculosis dispensary hospital. The accuracy of the presented model on the test sample is 99.4%, and the mean square error (MSE) is 0.013.
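As a sketch of how such test-sample metrics are computed, consider accuracy and MSE over predicted probabilities for binary outcomes. The outcomes and probabilities below are made up for illustration (the paper's model was built in R; this sketch uses Python):

```python
import numpy as np

# Hypothetical binary outcomes (1 = positive dynamics) and model probabilities
y_true = np.array([1, 0, 1, 1, 0])
y_prob = np.array([0.95, 0.08, 0.90, 0.85, 0.10])

mse = float(np.mean((y_prob - y_true) ** 2))            # mean square error on probabilities
accuracy = float(np.mean((y_prob >= 0.5) == y_true))    # threshold predictions at 0.5
print(f"accuracy={accuracy:.1%}, MSE={mse:.3f}")
```

Note that the two metrics measure different things: accuracy only counts thresholded decisions, while the MSE also rewards well-calibrated probabilities.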


2013
Vol 2013
pp. 1-9
Author(s):
Hanfeng Kuang
Jinbo Liu
Xi Chen
Jie Mao
Linjie He

The asymptotic behavior of a class of switched stochastic cellular neural networks (CNNs) with mixed delays (discrete and distributed time-varying delays) is investigated in this paper. Employing the average dwell time (ADT) approach, stochastic analysis techniques, and the linear matrix inequality (LMI) technique, some novel sufficient conditions on asymptotic behavior (mean-square ultimate boundedness, the existence of an attractor, and mean-square exponential stability) are established. A numerical example is provided to illustrate the effectiveness of the proposed results.


Complexity
2019
Vol 2019
pp. 1-20
Author(s):
Xiaohui Xu
Jibin Yang
Yanhai Xu

This paper investigates the mean-square exponential stability problem for a class of complex-valued neural networks with stochastic disturbance and mixed delays, including both time-varying delays and continuously distributed delays. Under assumptions on the stochastic disturbance term that differ from existing ones, some sufficient conditions are derived to ensure the mean-square exponential stability of the equilibrium point of the system, based on the vector Lyapunov function method and Itô's differential-integral theorem. The obtained results not only generalize existing ones but also reduce the conservatism of previous stability results for complex-valued neural networks with stochastic disturbances. Two numerical examples with simulation results are given to verify the feasibility of the proposed results.
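Mean-square exponential stability can be illustrated on a toy scalar stochastic system via Euler-Maruyama simulation. For dx = -a x dt + sigma x dW, the second moment decays like exp(-(2a - sigma^2) t), so the system is mean-square exponentially stable when 2a > sigma^2. The system and constants below are an illustrative scalar analogue, not the paper's complex-valued network:

```python
import numpy as np

# Euler-Maruyama check of mean-square decay for dx = -a*x dt + sigma*x dW.
# Mean-square exponentially stable when 2*a > sigma**2 (here 2.0 > 0.25).
rng = np.random.default_rng(2)
a, sigma = 1.0, 0.5
dt, T, n_paths = 1e-3, 5.0, 2000
steps = int(T / dt)

x = np.ones(n_paths)                     # all paths start at x(0) = 1
for _ in range(steps):
    dW = rng.normal(scale=np.sqrt(dt), size=n_paths)   # Brownian increments
    x = x + (-a * x) * dt + sigma * x * dW

print(f"E|x(T)|^2 = {np.mean(x**2):.2e}")  # decays roughly like exp(-(2a - sigma^2) T)
```

Averaging |x(T)|^2 over many sample paths approximates the second moment whose exponential decay the stability criteria guarantee.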

