Use of periodic and monotonic activation functions in multilayer feedforward neural networks trained by extended Kalman filter algorithm

2002 ◽ Vol 149 (4) ◽ pp. 217 ◽ Author(s): K.-W. Wong, C.-S. Leung, S.-J. Chang
2007 ◽ Vol 19 (4) ◽ pp. 1039-1055 ◽ Author(s): Su Lee Goh, Danilo P. Mandic

An augmented complex-valued extended Kalman filter (ACEKF) algorithm is introduced for the class of nonlinear adaptive filters realized as fully connected recurrent neural networks. The algorithm builds on recent developments in augmented complex statistics and on the use of general fully complex nonlinear activation functions within the neurons. This makes the ACEKF suitable for processing general complex-valued nonlinear and nonstationary signals, as well as bivariate signals with strong component correlations. Simulations on benchmark and real-world complex-valued signals support the approach.
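The "augmented complex statistics" the abstract refers to describe a complex signal jointly by its covariance E[z z*] and its pseudocovariance E[z z]; the latter is nonzero for noncircular (improper) signals, such as bivariate data with strongly correlated components. A minimal sketch (illustrative only, not the paper's code; the signal construction is an assumption for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Circular signal: independent real/imaginary parts of equal power.
circular = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Noncircular signal: real and imaginary components strongly correlated,
# mimicking the "bivariate signals with strong component correlations"
# the ACEKF targets.
re = rng.standard_normal(n)
noncircular = re + 1j * (0.9 * re + 0.1 * rng.standard_normal(n))

def augmented_stats(z):
    """Return (covariance, pseudocovariance) of a zero-mean complex signal."""
    c = np.mean(z * np.conj(z))  # covariance E[z z*]
    p = np.mean(z * z)           # pseudocovariance E[z z]
    return c, p

c1, p1 = augmented_stats(circular)
c2, p2 = augmented_stats(noncircular)
# |p1| is near zero (circular), while |p2| is large (noncircular), which is
# the second-order information a conventional (non-augmented) EKF discards.
print(abs(p1), abs(p2))
```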


1994 ◽ Vol 03 (03) ◽ pp. 339-348 ◽ Author(s): Carl G. Looney

We review methods and techniques for training feedforward neural networks that avoid problematic behavior, accelerate convergence, and verify the training. Adaptive step gain, bipolar activation functions, and conjugate gradients are powerful stabilizers. Random search techniques circumvent the local-minimum trap and avoid specialization due to overtraining. Testing assures quality learning.
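One of the stabilizers named above, the bipolar activation function, can be sketched briefly (an illustrative comparison, not code from the review): a bipolar activation such as tanh maps to (-1, 1) and is zero-centered, whereas the unipolar logistic sigmoid maps to (0, 1), so bipolar units keep hidden-layer outputs balanced around zero.

```python
import numpy as np

def unipolar(x):
    """Logistic sigmoid, range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def bipolar(x):
    """Hyperbolic tangent, range (-1, 1), zero-centered and odd."""
    return np.tanh(x)

x = np.linspace(-4.0, 4.0, 9)
print(np.round(bipolar(x), 3))
print(np.round(unipolar(x), 3))
```

The symmetry bipolar(-x) = -bipolar(x) means that, for inputs distributed around zero, a layer's mean output stays near zero rather than drifting toward 0.5 as with the unipolar sigmoid.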


2018 ◽ Vol 273 ◽ pp. 230-236 ◽ Author(s): Yurong Li, Jun Chen, Li Jiang, Nianyin Zeng, Haiyan Jiang, ...
