There exists a large variety of conjugate gradient algorithms. To take advantage of the attractive features of the Liu and Storey (LS) and Conjugate Descent (CD) conjugate gradient methods, we suggest a hybridization of these methods in which the update parameter is computed as a convex combination of the LS and CD parameters, with the mixing parameter obtained from the secant equation. The algorithm generates descent directions, and when the iterates jam, the direction satisfies the sufficient descent condition. We report numerical results demonstrating the efficiency of our method: the hybrid computational scheme outperforms, or is comparable with, known conjugate gradient algorithms. We also prove that our method converges globally under the strong Wolfe conditions.
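The hybrid update described above can be sketched as follows. This is an illustrative implementation, not the authors' exact scheme: it applies the standard LS and CD formulas on a convex quadratic with an exact line search, and uses a hand-picked fixed mixing weight `theta`, whereas the paper derives the mixing parameter from the secant equation (not reproduced here).

```python
import numpy as np

def hybrid_ls_cd_cg(A, b, x0, theta=0.5, tol=1e-8, max_iter=200):
    """Hybrid LS/CD conjugate gradient on f(x) = 0.5 x^T A x - b^T x.

    theta in [0, 1] is an assumed fixed convex-combination weight;
    the paper instead computes it from the secant equation.
    """
    x = x0.astype(float)
    g = A @ x - b              # gradient of the quadratic
    d = -g                     # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Exact line search for a quadratic: minimizes f(x + alpha * d).
        alpha = -(g @ d) / (d @ A @ d)
        x_new = x + alpha * d
        g_new = A @ x_new - b
        y = g_new - g          # gradient difference y_k = g_{k+1} - g_k
        denom = -(d @ g)       # shared denominator of the LS and CD formulas
        beta_ls = (g_new @ y) / denom        # Liu-Storey parameter
        beta_cd = (g_new @ g_new) / denom    # Conjugate Descent parameter
        beta = (1.0 - theta) * beta_ls + theta * beta_cd  # convex combination
        d = -g_new + beta * d  # new search direction
        x, g = x_new, g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = hybrid_ls_cd_cg(A, b, np.zeros(2))
print(np.allclose(A @ x, b, atol=1e-6))
```

With an exact line search on a quadratic, the LS and CD parameters coincide, so any convex combination of them recovers the classical behavior; the hybridization matters for general nonlinear objectives with an inexact (e.g. strong Wolfe) line search.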