ISSN:
1573-2878
Keywords:
Conjugate-gradient method ; unconstrained minimization ; superlinearly convergent algorithms ; mathematical programming ; quadratically convergent algorithms
Source:
Springer Online Journal Archives 1860-2000
Topics:
Mathematics
Notes:
Abstract: For the problem of minimizing an unconstrained function, the conjugate-gradient method is shown to be convergent. If the function is uniformly strictly convex, the ultimate rate of convergence is shown to be n-step superlinear. If the Hessian matrix is Lipschitz continuous, the rate of convergence is shown to be n-step quadratic. All results are obtained for the reset version of the method and with a relaxed requirement on the solution of the stepsize problem. In addition to obtaining sharper results, the paper differs from previously published ones in its mode of proof, which yields as a corollary the finiteness of the conjugate-gradient method when applied to a quadratic problem, rather than assuming that result.
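The reset conjugate-gradient method discussed in the abstract can be sketched as follows. This is a minimal illustrative implementation (Fletcher–Reeves directions, reset to steepest descent every n iterations, exact line search on a quadratic), not the paper's own algorithm or proof; all function and variable names here are chosen for this sketch. On a strictly convex quadratic it terminates in at most n steps, the finiteness property the paper derives as a corollary.

```python
import numpy as np

def reset_cg(Q, b, x0, tol=1e-10):
    """Minimize f(x) = 0.5 x^T Q x - b^T x with conjugate gradients,
    resetting the search direction to steepest descent every n steps.
    Illustrative sketch only; Q must be symmetric positive definite."""
    n = len(b)
    x = x0.astype(float)
    g = Q @ x - b              # gradient of the quadratic at x
    d = -g                     # initial direction: steepest descent
    k = 0
    for k in range(10 * n):
        if np.linalg.norm(g) < tol:
            break
        if k % n == 0:
            d = -g             # periodic reset to the negative gradient
        Qd = Q @ d
        alpha = -(g @ d) / (d @ Qd)        # exact stepsize on a quadratic
        x = x + alpha * d
        g_new = Q @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x, k

# Finite termination on a quadratic: a 3x3 SPD system is solved
# to machine precision within n = 3 iterations.
Q = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x, iters = reset_cg(Q, b, np.zeros(3))
```

For a general (non-quadratic) objective, the gradient would be supplied by the user and the exact stepsize replaced by a line search satisfying a relaxed acceptance condition, as the abstract indicates.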
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1007/BF00933041