Two modified hybrid conjugate gradient methods based on a hybrid secant equation

    Saman Babaie-Kafaki; Nezam Mahdavi-Amiri

Abstract

Taking advantage of the attractive features of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods, we suggest two globally convergent hybridizations of these methods, following Andrei's approach of hybridizing the conjugate gradient parameters convexly and Powell's approach of restricting them to be nonnegative. In our methods, the hybridization parameter is obtained based on a recently proposed hybrid secant equation. Numerical results demonstrating the efficiency of the proposed methods are reported.

Keywords: unconstrained optimization, large-scale optimization, conjugate gradient method, secant equation, global convergence
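The abstract describes a convex hybridization of the Hestenes–Stiefel (HS) and Dai–Yuan (DY) conjugate gradient parameters together with Powell's nonnegative restriction. The sketch below illustrates the general form of such a hybrid parameter; note that in the paper the hybridization parameter is derived from a hybrid secant equation, whereas here it is passed in directly as an illustrative input, and the function name `hybrid_beta` is our own label, not the authors' notation.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d, theta):
    """Convex combination of the HS and DY conjugate gradient
    parameters with Powell's nonnegative restriction.

    g_new, g_old : gradients at the new and previous iterates
    d            : previous search direction
    theta        : hybridization parameter in [0, 1]; in the paper
                   it comes from a hybrid secant equation, here it
                   is simply supplied by the caller (an assumption)
    """
    y = g_new - g_old                  # gradient difference y_k
    denom = d @ y                      # shared denominator d_k^T y_k
    beta_hs = (g_new @ y) / denom      # Hestenes-Stiefel parameter
    beta_dy = (g_new @ g_new) / denom  # Dai-Yuan parameter
    beta = (1.0 - theta) * beta_hs + theta * beta_dy
    return max(beta, 0.0)              # Powell's nonnegative restriction
```

The new search direction would then be formed as `d_new = -g_new + beta * d`, as in standard nonlinear conjugate gradient schemes.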

How to Cite
Babaie-Kafaki, S., & Mahdavi-Amiri, N. (2013). Two modified hybrid conjugate gradient methods based on a hybrid secant equation. Mathematical Modelling and Analysis, 18(1), 32-52. https://doi.org/10.3846/13926292.2013.756832
Published in Issue
Feb 1, 2013
This work is licensed under a Creative Commons Attribution 4.0 International License.