Limited memory quasi-Newton method for large-scale linearly equality-constrained minimization

2000 ◽  
Vol 16 (3) ◽  
pp. 320-328 ◽  
Author(s):  
Ni Qin

2020 ◽  
Vol 31 (11) ◽  
pp. 4776-4790 ◽  
Author(s):  
Huiming Chen ◽  
Ho-Chun Wu ◽  
Shing-Chow Chan ◽  
Wong-Hing Lam

2013 ◽  
Vol 2013 ◽  
pp. 1-6 ◽  
Author(s):  
Yang Weiwei ◽  
Yang Yueting ◽  
Zhang Chenhui ◽  
Cao Mingyuan

We present a new Newton-like method for large-scale unconstrained nonconvex minimization. A new, straightforward limited-memory quasi-Newton update based on a modified quasi-Newton equation is derived to construct the trust-region subproblem; the update uses both function-value and gradient information to build the approximate Hessian. The global convergence of the algorithm is proved. Numerical results indicate that the proposed method is competitive and efficient on classical large-scale nonconvex test problems.
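For reference, here is a minimal sketch of the limited-memory machinery such methods build on: the standard L-BFGS two-loop recursion in Python (NumPy assumed). This is the classical update from stored curvature pairs, not the paper's modified quasi-Newton equation, which additionally folds function values into the difference vectors.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Search direction -H_k @ grad via the standard L-BFGS
    two-loop recursion. H_k is the inverse-Hessian approximation
    built from stored pairs s_i = x_{i+1} - x_i and
    y_i = g_{i+1} - g_i (oldest first in the lists)."""
    if not s_list:                       # no curvature pairs yet
        return -grad
    q = grad.astype(float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair back to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Scale by gamma_k = s^T y / y^T y using the newest pair.
    s, y = s_list[-1], y_list[-1]
    q *= np.dot(s, y) / np.dot(y, y)
    # Second loop: oldest pair forward to newest.
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q
```

Called inside a line-search or trust-region loop, `lbfgs_direction(g, S, Y)` reproduces classical limited-memory behavior; a modified quasi-Newton equation of the kind the abstract describes would replace each y_i with a corrected difference that also uses function values.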


2016 ◽  
Vol 26 (2) ◽  
pp. 1008-1031 ◽  
Author(s):  
R. H. Byrd ◽  
S. L. Hansen ◽  
Jorge Nocedal ◽  
Y. Singer

Symmetry ◽  
2021 ◽  
Vol 13 (11) ◽  
pp. 2093 ◽  
Author(s):  
Huiping Cao ◽  
Xiaomin An

We introduce a sparse, symmetric matrix-completion quasi-Newton model that uses automatic differentiation to solve unconstrained optimization problems in which the sparsity structure of the Hessian is available. The proposed method preserves the sparsity of the Hessian exactly while satisfying the quasi-Newton equation approximately. Under the usual assumptions, local and superlinear convergence are established. Numerical tests show that the new method is effective and outperforms matrix-completion quasi-Newton updating with the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method and the limited-memory BFGS (L-BFGS) method.
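The key trade-off claimed, exact Hessian sparsity against an only approximately satisfied secant equation, can be illustrated with a simple sketch: a symmetric PSB-style rank-two update followed by projection onto a known sparsity pattern. This is a generic illustration under our own assumptions (NumPy, a boolean `mask` encoding the pattern), not the authors' matrix-completion model.

```python
import numpy as np

def sparse_secant_update(B, s, y, mask):
    """Symmetric PSB-style rank-two update of B, followed by
    projection onto the known sparsity pattern `mask` (boolean,
    symmetric). The projected matrix keeps the Hessian sparsity
    exactly but satisfies B_new @ s = y only approximately."""
    r = y - B @ s                        # secant residual
    ss = np.dot(s, s)
    B_new = (B
             + (np.outer(r, s) + np.outer(s, r)) / ss
             - (np.dot(r, s) / ss**2) * np.outer(s, s))
    return np.where(mask, B_new, 0.0)    # enforce sparsity
```

In the paper's actual model the off-pattern entries are handled by matrix completion rather than simple truncation; the projection above only mimics the "exact sparsity, approximate secant" behavior the abstract describes.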

