Paper Search Console


Journal: Comput Optim Appl (Computational Optimization and Applications)
Publisher: Springer US
DOI: 10.1016/0022-5193(81)90106-5
ISSN: 1573-2894

Multi-step nonlinear conjugate gradient methods for unconstrained minimization

Authors: John A. Ford, Yasushi Narushima, Hiroshi Yabe
Publish Date: 2007/11/09
Volume: 40, Issue: 2, Pages: 191–216

Abstract

Conjugate gradient methods are appealing for large-scale nonlinear optimization problems because they avoid the storage of matrices. Recently, seeking fast convergence of these methods, Dai and Liao (Appl Math Optim 43:87–101, 2001) proposed a conjugate gradient method based on the secant condition of quasi-Newton methods, and later Yabe and Takano (Comput Optim Appl 28:203–225, 2004) proposed another conjugate gradient method based on the modified secant condition. In this paper, we make use of a multi-step secant condition given by Ford and Moghrabi (Optim Methods Softw 2:357–370, 1993; J Comput Appl Math 50:305–323, 1994) and propose two new conjugate gradient methods based on this condition. The methods are shown to be globally convergent under certain assumptions. Numerical results are reported.
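To give a sense of the family of methods the abstract refers to, below is a minimal illustrative sketch of a nonlinear conjugate gradient iteration with a Dai–Liao-type parameter β computed from the ordinary secant vectors. This is not the paper's multi-step method; the function name, the backtracking line search, and the parameter `t` are assumptions made for the sketch.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Illustrative nonlinear CG with a Dai-Liao-type parameter.

    beta_k = g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # simple Armijo backtracking line search (an assumption of this sketch)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x            # step vector s_k
        y = g_new - g            # gradient difference y_k
        denom = d.dot(y)
        beta = g_new.dot(y - t * s) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d    # new search direction
        x, g = x_new, g_new
    return x
```

The multi-step secant condition of Ford and Moghrabi replaces the one-step vectors `s` and `y` above with combinations built from several previous iterates; the sketch keeps the one-step form for brevity.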

