Authors: John A. Ford, Yasushi Narushima, Hiroshi Yabe
Publish Date: 2007/11/09
Volume: 40, Issue: 2, Pages: 191-216
Abstract
Conjugate gradient methods are appealing for large-scale nonlinear optimization problems because they avoid the storage of matrices. Recently, seeking fast convergence of these methods, Dai and Liao (Appl. Math. Optim. 43:87–101, 2001) proposed a conjugate gradient method based on the secant condition of quasi-Newton methods, and later Yabe and Takano (Comput. Optim. Appl. 28:203–225, 2004) proposed another conjugate gradient method based on the modified secant condition. In this paper, we make use of a multi-step secant condition given by Ford and Moghrabi (Optim. Methods Softw. 2:357–370, 1993; J. Comput. Appl. Math. 50:305–323, 1994) and propose two new conjugate gradient methods based on this condition. The methods are shown to be globally convergent under certain assumptions. Numerical results are reported.
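The methods proposed in the paper belong to the Dai–Liao family of conjugate gradient directions, in which the search direction is updated as d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and t >= 0. The following Python code is a minimal sketch of this baseline Dai–Liao update (not the multi-step variants proposed in the paper); the test problem, the backtracking Armijo line search, and the safeguards are illustrative assumptions, and the paper's convergence analysis relies on Wolfe-type line search conditions instead.

import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=2000):
    """Illustrative conjugate gradient loop with the Dai-Liao beta:
    beta_k = g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            # Safeguard: if d is not a descent direction, restart
            # with steepest descent so backtracking can succeed.
            d = -g
        # Simple backtracking Armijo line search (a stand-in for the
        # Wolfe conditions assumed in the convergence theory).
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x
        y = g_new - g
        denom = d.dot(y)
        # Fall back to beta = 0 (steepest descent) if the
        # denominator is too small to divide by safely.
        beta = 0.0 if abs(denom) < 1e-12 else g_new.dot(y - t * s) / denom
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from a standard start.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(dai_liao_cg(f, grad, np.array([-1.2, 1.0])))  # approaches (1, 1)

Setting t = 0 recovers the Hestenes–Stiefel beta; the multi-step methods of the paper instead replace the pair (s_k, y_k) by vectors built from several previous iterates via the Ford–Moghrabi multi-step secant condition.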