A Comparative Study of Dynamic Learning Rate BPN and Wavelet Neural Networks

Author(s):  
Y. Z. Zhao ◽  
J. B. Zhang ◽  
A.J.R. Aendenroomer

Author(s):  
MOHAMED ZINE EL ABIDINE SKHIRI ◽  
MOHAMED CHTOUROU

This paper investigates the applicability of the constructive approach proposed in Ref. 1 to wavelet neural networks (WNN). Two incremental training algorithms are presented. The first, the one-pattern-at-a-time (OPAT) approach, is the WNN version of the method applied in Ref. 1. The second, the one-epoch-at-a-time (OEAT) approach, is a modified version of that method. In the OPAT approach, the input patterns are trained incrementally, one by one, until all patterns have been presented. If the algorithm gets stuck in a local minimum and cannot escape after a fixed number of successive attempts, a new wavelet, also called a wavelon, is recruited. In the OEAT approach, by contrast, the input patterns are presented one epoch at a time: during an epoch, each pattern is trained exactly once. If the resulting overall error decreases, all patterns are retrained for one more epoch; otherwise, a new wavelon is recruited. To guarantee the convergence of the trained networks, an adaptive learning rate is introduced using the discrete Lyapunov stability theorem.
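The OEAT growth rule described above can be illustrated with a minimal sketch in Python. This is not the authors' implementation: it assumes a single-output network of Mexican-hat wavelons, adapts only the output weights with a fixed learning rate (the paper also adapts translations/dilations and derives the rate from a Lyapunov condition), and all names (`OEATWaveletNet`, `recruit`, `fit_oeat`) are illustrative.

```python
import numpy as np

def mexican_hat(z):
    # Mother wavelet psi(z) = (1 - z^2) exp(-z^2 / 2)
    return (1.0 - z**2) * np.exp(-0.5 * z**2)

class OEATWaveletNet:
    """Toy single-output wavelet network y(x) = sum_j w_j * psi((x - t_j)/d_j)."""

    def __init__(self, lr=0.05, rng=None):
        self.lr = lr
        self.rng = rng or np.random.default_rng(0)
        self.t = np.empty(0)   # translations
        self.d = np.empty(0)   # dilations
        self.w = np.empty(0)   # output weights
        self.recruit()         # start with a single wavelon

    def recruit(self):
        # Recruit a new wavelon with a random center and a small weight.
        self.t = np.append(self.t, self.rng.uniform(-1.0, 1.0))
        self.d = np.append(self.d, 1.0)
        self.w = np.append(self.w, self.rng.uniform(-0.1, 0.1))

    def predict(self, X):
        z = (X[:, None] - self.t) / self.d
        return mexican_hat(z) @ self.w

    def train_epoch(self, X, Y):
        # One OEAT epoch: every pattern is presented exactly once.
        for x, y in zip(X, Y):
            psi = mexican_hat((x - self.t) / self.d)
            err = y - psi @ self.w
            self.w += self.lr * err * psi   # gradient step on weights only
        return float(np.mean((Y - self.predict(X)) ** 2))

def fit_oeat(net, X, Y, max_epochs=200, max_wavelons=10, tol=1e-4):
    """Grow the network: retrain while the epoch error decreases, else recruit."""
    prev, mse = np.inf, np.inf
    for _ in range(max_epochs):
        mse = net.train_epoch(X, Y)
        if mse < tol:
            break
        if mse >= prev and len(net.w) < max_wavelons:
            net.recruit()      # error stopped decreasing: add a wavelon
        prev = mse
    return mse
```

For instance, fitting `Y = np.sin(np.pi * X)` on `X = np.linspace(-1, 1, 40)` with `fit_oeat(OEATWaveletNet(), X, Y)` drives the mean-squared error well below its initial value while recruiting wavelons only when progress stalls.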


