A Three-Term Extension of a Descent Conjugate Gradient Method
Subject Areas: Fuzzy Optimization and Modeling Journal
1 - Department of Mathematics, Semnan University
Keywords: Unconstrained optimization, Conjugate gradient method, Global convergence, Sufficient descent condition
Abstract:
In an effort to modify the classical Hestenes--Stiefel method, Shengwei et al. proposed an efficient conjugate gradient method that satisfies the sufficient descent condition when the line search fulfills the strong Wolfe conditions (with a restriction on the line search parameters). Here, we develop a three--term extension of the method that guarantees the sufficient descent condition independently of the line search. We also establish global convergence of the method under a convexity assumption. Finally, the practical merits of the proposed method are investigated through numerical experiments on a set of CUTEr test functions. The results demonstrate the numerical efficiency of the method.
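As background (a generic sketch only, not the authors' specific update), a three--term conjugate gradient direction built on the Hestenes--Stiefel parameter typically takes the form
\[
d_{k+1} = -g_{k+1} + \beta_k d_k - \theta_k y_k, \qquad
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad
y_k = g_{k+1} - g_k,
\]
where the third term $\theta_k y_k$ is designed so that the sufficient descent condition $g_{k+1}^{T} d_{k+1} \le -c\,\|g_{k+1}\|^{2}$, for some constant $c > 0$, holds at every iteration regardless of the step length produced by the line search.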