Keywords: Conjugate gradient, Nonlinear conjugate gradient methods, optimization functions
The nonlinear conjugate gradient method is an effective iterative scheme that is widely employed to solve large-scale unconstrained optimization problems. Central to any conjugate gradient algorithm is the computation of a suitable step length, for which numerous strategies have been proposed. In this work, we assessed and compared the performance of the weak Wolfe line search technique on nine variants of the nonlinear conjugate gradient method through numerical experiments. The experiments revealed that the Dai-Yuan and Conjugate Descent methods converged fastest.
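To make the setup concrete, the following is a minimal sketch (not the paper's implementation) of a nonlinear conjugate gradient iteration using the Dai-Yuan update formula together with a bisection-style line search enforcing the weak Wolfe conditions. The function names, test problem, and parameter values (c1, c2, tolerances) are illustrative assumptions, not taken from the study.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection-style line search satisfying the weak Wolfe conditions:
    sufficient decrease (Armijo) and the standard curvature condition."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, slope = f(x), grad(x) @ d  # slope < 0 for a descent direction d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:
            hi = alpha                      # Armijo fails: step too long
        elif grad(x + alpha * d) @ d < c2 * slope:
            lo = alpha                      # curvature fails: step too short
        else:
            return alpha                    # both weak Wolfe conditions hold
        alpha = (lo + hi) / 2 if np.isfinite(hi) else 2 * alpha
    return alpha

def cg_dai_yuan(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Dai-Yuan beta and a weak Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Dai-Yuan formula: beta = ||g_{k+1}||^2 / d_k^T (g_{k+1} - g_k)
        beta = (g_new @ g_new) / (d @ (g_new - g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

Other variants in the comparison differ only in the beta formula; for example, Conjugate Descent uses beta = ||g_{k+1}||^2 / (-d_k^T g_k), so swapping the single `beta` line above yields a different method under the same line search.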