A normalized gradient descent algorithm for nonlinear adaptive filters using a gradient adaptive step size

2001 ◽  
Vol 8 (11) ◽  
pp. 295-297 ◽  
Author(s):  
D.P. Mandic ◽  
A.I. Hanna ◽  
M. Razaz

2012 ◽  
Vol 2012 ◽  
pp. 1-13
Author(s):  
Hong Chen ◽  
Fangchao He ◽  
Zhibin Pan

We introduce a gradient descent algorithm for bipartite ranking with general convex losses. The algorithm is simple to implement, and we investigate its generalization performance. Explicit learning rates are presented in terms of suitable choices of the regularization parameter and the step size. The result fills a theoretical gap in the learning rates for the ranking problem with general convex losses.
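The abstract does not spell out the algorithm itself. As a purely illustrative sketch, the following minimizes a regularized pairwise ranking objective by plain gradient descent; the linear scorer, the hinge surrogate, and all function names here are assumptions for the example, not details from the paper:

```python
import numpy as np

def hinge_grad(m):
    # subgradient of the convex surrogate phi(m) = max(0, 1 - m)
    return -1.0 if m < 1.0 else 0.0

def bipartite_rank_gd(X_pos, X_neg, loss_grad=hinge_grad,
                      step=0.05, reg=0.01, iters=200):
    """Gradient descent on the pairwise objective
    (1/|P||N|) * sum_{p,n} phi(w . (x_p - x_n)) + (reg/2) * ||w||^2
    for a linear scorer f(x) = w . x (illustrative sketch only)."""
    w = np.zeros(X_pos.shape[1])
    n_pairs = len(X_pos) * len(X_neg)
    for _ in range(iters):
        g = np.zeros_like(w)
        for xp in X_pos:
            for xn in X_neg:
                diff = xp - xn
                # phi penalizes ranking a negative example above a positive one
                g += loss_grad(w @ diff) * diff
        w -= step * (g / n_pairs + reg * w)
    return w

def auc(w, X_pos, X_neg):
    # fraction of (positive, negative) pairs ranked correctly by the scorer
    return np.mean((X_pos @ w)[:, None] > (X_neg @ w)[None, :])
```

The empirical AUC is exactly the pairwise 0-1 ranking risk (reversed), which is why the surrogate is taken over score differences rather than individual scores.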


2016 ◽  
Vol 26 (04) ◽  
pp. 1650056
Author(s):  
Auni Aslah Mat Daud

In this paper, we present the application of the gradient descent of indeterminism (GDI) shadowing filter to a chaotic system, namely the ski-slope model. The paper focuses on the quality of the estimated states and their usability for forecasting. One main problem is that the existing GDI shadowing filter fails to stabilize the convergence of the root mean square error and the last-point error for the ski-slope model. Furthermore, there are unexpected cases in which better state estimates give worse forecasts than poorer ones. We investigate these cases in particular and show how the presence of the humps contributes to them. The results show, however, that the GDI shadowing filter can be applied successfully to the ski-slope model with only a slight modification: introducing an adaptive step size to ensure the convergence of indeterminism. We investigate its advantages over a fixed step size and how it improves the performance of our shadowing filter.
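The core idea of a GDI filter is to adjust a noisy trajectory by gradient descent on its indeterminism, the summed mismatch between consecutive states and the model map. The sketch below is illustrative only: it substitutes a 1-D logistic map for the ski-slope model, and the accept/shrink/grow step-size rule is one simple adaptive scheme, not necessarily the one used in the paper:

```python
import numpy as np

R = 3.7  # logistic-map parameter (illustrative stand-in for the ski-slope model)

def F(x):
    return R * x * (1.0 - x)

def dF(x):
    return R * (1.0 - 2.0 * x)

def indeterminism(traj):
    # I(x) = sum_t (x_{t+1} - F(x_t))^2; zero iff traj is a true model trajectory
    return np.sum((traj[1:] - F(traj[:-1])) ** 2)

def indeterminism_grad(traj):
    r = traj[1:] - F(traj[:-1])          # one-step residuals
    g = np.zeros_like(traj)
    g[1:] += 2.0 * r                     # d/dx_{t+1} of each residual term
    g[:-1] += -2.0 * dF(traj[:-1]) * r   # d/dx_t of each residual term
    return g

def gdi_adaptive(traj, step=1e-3, iters=2000, grow=1.2, shrink=0.5):
    """Gradient descent on indeterminism with an adaptive step size:
    accept a trial point only if the cost drops, then enlarge the step;
    otherwise keep the current point and shrink the step."""
    x = traj.copy()
    cost = indeterminism(x)
    for _ in range(iters):
        trial = x - step * indeterminism_grad(x)
        trial_cost = indeterminism(trial)
        if trial_cost < cost:
            x, cost, step = trial, trial_cost, step * grow
        else:
            step *= shrink
    return x, cost
```

Because moves are accepted only when the cost decreases, the indeterminism is non-increasing by construction, which is the convergence property the adaptive step size is meant to secure over a fixed one.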

