Several Guaranteed Descent Conjugate Gradient Methods for Unconstrained Optimization
This paper investigates a general form of guaranteed descent conjugate gradient methods that satisfy the descent condition $g_k^T d_k \le -\left(1 - \frac{1}{4\theta_k}\right)\|g_k\|^2$ with $\theta_k > 1/4$, and that are strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and report their numerical results on large-scale unconstrained optimization problems.
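The descent condition above can be checked numerically. The sketch below is not the paper's exact algorithm; it uses the well-known Hager–Zhang-type update $d_{k+1} = -g_{k+1} + \beta_k d_k$ with a parameter $\theta_k$, a family for which the bound $g_{k+1}^T d_{k+1} \le -(1 - \frac{1}{4\theta_k})\|g_{k+1}\|^2$ is known to hold whenever $d_k^T y_k \ne 0$. All numeric values are hypothetical, chosen only to illustrate the check.

```python
def dot(u, v):
    """Inner product of two vectors given as plain Python lists."""
    return sum(a * b for a, b in zip(u, v))

def cg_direction(g_new, d, y, theta=2.0):
    """One Hager–Zhang-type CG direction update (assumes theta > 1/4, d.y != 0).

    beta_k = ( y_k^T g_{k+1} - theta * ||y_k||^2 * d_k^T g_{k+1} / d_k^T y_k ) / d_k^T y_k
    d_{k+1} = -g_{k+1} + beta_k * d_k
    """
    dy = dot(d, y)
    beta = (dot(y, g_new) - theta * dot(y, y) * dot(d, g_new) / dy) / dy
    return [-gi + beta * di for gi, di in zip(g_new, d)]

# Hypothetical iterate data, for illustration only.
g_new = [1.0, 2.0, -0.5]   # new gradient g_{k+1}
d = [0.3, -1.0, 0.7]       # previous direction d_k
y = [0.5, 0.2, -0.4]       # gradient difference y_k = g_{k+1} - g_k
theta = 2.0

d_new = cg_direction(g_new, d, y, theta)
bound = -(1.0 - 1.0 / (4.0 * theta)) * dot(g_new, g_new)
# The guaranteed descent condition g^T d <= -(1 - 1/(4*theta)) * ||g||^2:
assert dot(g_new, d_new) <= bound + 1e-12
```

With $\theta_k = 2$ the bound specializes to $g_{k+1}^T d_{k+1} \le -\frac{7}{8}\|g_{k+1}\|^2$, the classical Hager–Zhang sufficient descent constant.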