A New Hybrid Conjugate Gradient Method Based on Secant Equation for Solving Large Scale Unconstrained Optimization Problems
Subject Area: Operations Research
Nasiru Salihu 1, Mathew Odekunle 2, Mohammed Waziri 3, Abubakar Halilu 4
1 - Department of Mathematics, School of Physical Sciences, Modibbo Adama University of Technology, Yola, Nigeria.
2 - Department of Mathematics, School of Physical Sciences, Modibbo Adama University of Technology, Yola, Nigeria.
3 - Department of Mathematical Sciences, Faculty of Sciences, Bayero University, Kano, Nigeria.
4 - Department of Mathematics and Computer Science, Sule Lamido University, Kafin Hausa, Nigeria.
Keywords: Unconstrained optimization, global convergence, conjugate gradient algorithm, large-scale optimization problem, secant equation
Abstract:
There exists a large variety of conjugate gradient algorithms. To take advantage of the attractive features of the Liu and Storey (LS) and Conjugate Descent (CD) conjugate gradient methods, we suggest a hybridization of these methods in which the update parameter is computed as a convex combination of the LS and CD parameters, with the combination weight obtained from the secant equation. The algorithm generates descent directions, and when the iterates jam, the directions satisfy the sufficient descent condition. We report numerical results demonstrating the efficiency of our method: the hybrid computational scheme outperforms, or is comparable with, known conjugate gradient algorithms. We also show that our method converges globally under the strong Wolfe conditions.
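To illustrate the structure of such a hybrid scheme, the sketch below implements a conjugate gradient iteration whose parameter is a convex combination of the standard LS and CD formulas. This is a minimal illustration, not the paper's method: the weight `theta` is held fixed here (the paper derives it from the secant equation, which is not reproduced), and a simple Armijo backtracking line search stands in for the strong Wolfe conditions.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
    """Hybrid LS/CD conjugate gradient sketch.

    beta = theta * beta_LS + (1 - theta) * beta_CD, with a fixed
    illustrative weight theta (an assumption; the paper computes it
    from the secant equation).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search, standing in for strong Wolfe.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference y_k
        denom = -(d @ g)                   # shared denominator -d_k^T g_k
        beta_ls = (g_new @ y) / denom      # Liu-Storey parameter
        beta_cd = (g_new @ g_new) / denom  # Conjugate Descent parameter
        beta = theta * beta_ls + (1 - theta) * beta_cd
        d = -g_new + beta * d              # new search direction
        x, g = x_new, g_new
    return x

# Usage: minimize a simple strictly convex quadratic.
f = lambda x: 0.5 * (x @ x)
grad = lambda x: x
x_star = hybrid_cg(f, grad, np.array([3.0, -4.0]))
```

On this quadratic the minimizer is the origin, and the iteration drives the gradient norm below the tolerance.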