Electronic Resource
Springer
Journal of optimization theory and applications
77 (1993), pp. 197-208
ISSN:
1573-2878
Keywords:
Nonlinear optimization; convergence of ascent methods
Source:
Springer Online Journal Archives 1860-2000
Topics:
Mathematics
Notes:
Abstract: We study the projected gradient algorithm for linearly constrained optimization. Wolfe (Ref. 1) produced a counterexample showing that this algorithm can jam. However, his counterexample is only C¹(ℝⁿ), and it has been conjectured that the algorithm is convergent for C²-functions. We show that this conjecture is partly right. We also show that additional assumptions are needed to prove convergence, since we present a family of counterexamples. Finally, we show that no jamming can occur for quadratic objective functions.
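A minimal sketch of the projected gradient iteration the abstract refers to, for illustration only: each step takes a gradient descent step and projects the result back onto the feasible set. The fixed step size, function names, and the box-constrained example are assumptions, not the paper's setting (the paper's analysis concerns step-size rules under which jamming can or cannot occur).

```python
import numpy as np

def projected_gradient(grad, project, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Projected gradient iteration x_{k+1} = P(x_k - step * grad(x_k)).

    A fixed-step sketch; `grad` evaluates the objective's gradient and
    `project` is the Euclidean projection onto the feasible set.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = project(x - step * grad(x))
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Hypothetical example: minimize the quadratic f(x) = ||x - c||^2
# over the nonnegative orthant {x : x >= 0}.  The projection onto
# this set is componentwise clipping at zero.
c = np.array([1.0, -2.0, 3.0])
grad = lambda x: 2.0 * (x - c)
project = lambda x: np.maximum(x, 0.0)
x_star = projected_gradient(grad, project, x0=np.zeros(3))
# The minimizer is max(c, 0) componentwise, i.e. [1, 0, 3].
```

For a quadratic objective like this one, the iteration converges cleanly, which is consistent with the abstract's final result that no jamming occurs for quadratic objective functions.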
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1007/BF00940786