Electronic Resource
Springer
Journal of optimization theory and applications
95 (1997), pp. 177-188
ISSN:
1573-2878
Keywords:
Unconstrained differentiable minimization ; descent methods ; global convergence ; rate of convergence
Source:
Springer Online Journal Archives 1860-2000
Topics:
Mathematics
Notes:
Abstract: In this paper, we discuss the convergence properties of a class of descent algorithms for minimizing a continuously differentiable function $f$ on $\mathbb{R}^n$ without assuming that the sequence $\{x_k\}$ of iterates is bounded. Under mild conditions, we prove that the limit inferior of $\|\nabla f(x_k)\|$ is zero and that false convergence does not occur when $f$ is convex. Furthermore, we discuss the convergence rate of $\{\|x_k\|\}$ and $\{f(x_k)\}$ when $\{x_k\}$ is unbounded and $\{f(x_k)\}$ is bounded.
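The specific algorithm class analyzed in the paper is not reproduced in this record; as a hedged illustration only, steepest descent with an Armijo backtracking line search is one standard member of such descent-method families. The function names, tolerances, and test problem below are my own choices, not the authors':

```python
import numpy as np

def descent_armijo(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                   tol=1e-8, max_iter=1000):
    """Illustrative descent method with Armijo backtracking line search.

    Not the paper's algorithm: a generic sketch of one descent scheme
    whose iterates satisfy liminf ||grad f(x_k)|| = 0 under mild
    conditions.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient norm driven toward zero
            break
        d = -g                        # steepest-descent direction
        t = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Toy problem: f(x) = ||x||^2, whose unique minimizer is the origin.
f = lambda x: x.dot(x)
grad = lambda x: 2.0 * x
x_star = descent_armijo(f, grad, np.array([3.0, -4.0]))
```

On a strongly convex quadratic like this, the iterates stay bounded and converge to the minimizer; the paper's interest is precisely in what can still be said when `{x_k}` is unbounded while `{f(x_k)}` remains bounded.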
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1023/A:1022691513687