A Nonlinear Conjugate Gradient Algorithm under Strong Wolfe-Powell Line Search for Large Scale Unconstrained Optimization Problems
Keywords:
unconstrained optimization, conjugate gradient, inexact line search, sufficient descent condition
Abstract
The classical conjugate gradient method is designed for linear systems and quadratic optimization problems, but most real-life problems involve nonquadratic functions of several variables. In this work, a nonlinear conjugate gradient algorithm for solving large-scale unconstrained optimization problems is presented. The new algorithm is a modification of the Fletcher-Reeves conjugate gradient method, and global convergence is proved under the strong Wolfe-Powell inexact line search. Computational experiments show that the new algorithm outperforms the Fletcher-Reeves method with exact line search on high-dimensional nonlinear optimization problems.