Exploiting the connection between PLS, Lanczos methods and conjugate gradients: alternative proofs of some properties of PLS

2002 ◽ Vol 16 (7) ◽ pp. 361–367
Author(s): Aloke Phatak, Frank de Hoog

Mathematics ◽ 2021 ◽ Vol 9 (13) ◽ pp. 1522
Author(s): Anna Concas, Lothar Reichel, Giuseppe Rodriguez, Yunzi Zhang

The power method is commonly applied to compute the Perron vector of large adjacency matrices. Blondel et al. [SIAM Rev. 46, 2004] investigated its performance when the adjacency matrix has multiple eigenvalues of the same magnitude. It is well known that the Lanczos method typically requires fewer iterations than the power method to determine eigenvectors with the desired accuracy. However, the Lanczos method demands more computer storage, which may make it impractical to apply to very large problems. The present paper adapts the analysis by Blondel et al. to the Lanczos and restarted Lanczos methods. The restarted methods are found to yield fast convergence and to require less computer storage than the Lanczos method. Computed examples illustrate the theory presented. Applications of the Arnoldi method are also discussed.
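As a point of reference for the comparison above, the power method applied to an adjacency matrix can be sketched as follows. This is a minimal NumPy sketch on an assumed toy graph, not the authors' code; the stopping rule and the example matrix are our choices.

```python
import numpy as np

def power_method(A, tol=1e-10, max_iter=1000):
    """Approximate the Perron vector of a nonnegative matrix A by power iteration."""
    n = A.shape[0]
    x = np.ones(n) / np.sqrt(n)  # nonnegative starting vector
    for _ in range(max_iter):
        y = A @ x
        y /= np.linalg.norm(y)   # renormalize each iterate
        if np.linalg.norm(y - x) < tol:
            break
        x = y
    return y

# Adjacency matrix of a small strongly connected graph (assumed example):
# the complete graph on three vertices, with Perron value 2.
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
v = power_method(A)
```

The Lanczos method builds a Krylov subspace from the same matrix-vector products `A @ x` but keeps (or, in the restarted variants discussed here, periodically compresses) the whole basis, which is the source of the extra storage the abstract refers to.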


2006 ◽ Vol 23 (4) ◽ pp. 273–284
Author(s): A. M. Vidal, A. Vidal, V. E. Boria, V. M. García

1994 ◽ Vol 03 (03) ◽ pp. 339–348
Author(s): Carl G. Looney

We review methods and techniques for training feedforward neural networks that avoid problematic behavior, accelerate convergence, and verify the training. Adaptive step gains, bipolar activation functions, and conjugate gradients are powerful stabilizers. Random search techniques circumvent the local-minimum trap and avoid the specialization caused by overtraining. Testing ensures the quality of the learned model.
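Two of the stabilizers named above are easy to illustrate. The sketch below shows a bipolar activation (tanh, zero-centered output) next to the unipolar sigmoid, and an error-driven adaptive step gain; the growth/shrink factors are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Bipolar activation: outputs in (-1, 1), centered at zero, which tends to
# stabilize gradient-based training compared with a unipolar sigmoid.
def bipolar(x):
    return np.tanh(x)

# Unipolar sigmoid: outputs in (0, 1), for comparison.
def unipolar(x):
    return 1.0 / (1.0 + np.exp(-x))

# Adaptive step gain (illustrative rule): grow the learning rate when the
# training error decreased, shrink it sharply when the error increased.
def adapt_gain(eta, err, prev_err, up=1.1, down=0.5):
    return eta * up if err < prev_err else eta * down
```

In a training loop, `adapt_gain` would be called once per epoch with the current and previous epoch errors, so a run of improving epochs steadily enlarges the step while a single worsening epoch halves it.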


1998 ◽ Vol 81 (1) ◽ pp. 125–141
Author(s): V. Simoncini

2017 ◽ Vol 2017 ◽ pp. 1–9
Author(s): Xiaowei Fang, Qin Ni

In this paper, we propose a new hybrid direct search method in which a frame-based PRP conjugate gradient direct search algorithm is combined with a radial basis function interpolation model. In addition, a rotational minimal positive basis is used to reduce the computational work at each iteration. Numerical results on the CUTEr test problems show that the proposed method is promising.
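The two ingredients combined in this method can be sketched independently: the Polak–Ribière–Polyak (PRP) update that generates conjugate gradient search directions, and a radial basis function interpolant used as a surrogate model. This is an illustrative NumPy sketch under our own assumptions (Gaussian kernel, PRP+ nonnegativity safeguard), not the paper's implementation.

```python
import numpy as np

# PRP update: beta_k = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2,
# then d_k = -g_k + beta_k * d_{k-1}.
def prp_direction(g_new, g_old, d_old):
    beta = g_new @ (g_new - g_old) / (g_old @ g_old)
    beta = max(beta, 0.0)  # PRP+ safeguard: reset to steepest descent if negative
    return -g_new + beta * d_old

# Gaussian RBF interpolation: fit weights so the model matches the sampled
# values at the centers, then evaluate the model at the query points.
def rbf_interpolate(centers, values, query, eps=1.0):
    def phi(r):
        return np.exp(-(eps * r) ** 2)
    D = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    w = np.linalg.solve(phi(D), values)
    Dq = np.linalg.norm(query[:, None, :] - centers[None, :, :], axis=-1)
    return phi(Dq) @ w
```

In a frame-based method of this kind, the RBF surrogate is fitted to previously evaluated frame points and then minimized cheaply to propose the next trial point, while the PRP direction orients the frame.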

