ISSN:
1573-2878
Source:
Springer Online Journal Archives 1860-2000
Topics:
Mathematics
Notes:
Abstract A new accelerated gradient method for finding the minimum of a function f(x) whose variables are unconstrained is investigated. The new algorithm can be stated as follows: $$\tilde{x} = x + \delta x, \qquad \delta x = -\alpha g(x) + \beta \delta\hat{x}$$ where δx is the change in the position vector x, g(x) is the gradient of the function f(x), and α and β are scalars chosen at each step so as to yield the greatest decrease in the function. The symbol $$\delta\hat{x}$$ denotes the change in the position vector for the iteration preceding that under consideration. For a nonquadratic function, initial convergence of the present method is faster than that of the Fletcher-Reeves method because of the extra degree of freedom available. For a test problem, the number of iterations was about 40–50% that of the Fletcher-Reeves method, and the computing time was about 60–75% that of the Fletcher-Reeves method, using comparable search techniques.
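The two-parameter step described in the abstract can be sketched in code. This is a minimal illustration, not the paper's implementation: the paper chooses α and β by exact bidimensional search, whereas the sketch below substitutes a crude grid search over (α, β); the function names, grid ranges, and test problem are all assumptions made for the example.

```python
import numpy as np

def memory_gradient_minimize(f, grad, x0, iters=50,
                             alphas=np.linspace(0.0, 1.0, 21),
                             betas=np.linspace(-1.0, 1.0, 21)):
    """Accelerated gradient step: delta_x = -alpha*g(x) + beta*delta_x_hat,
    where delta_x_hat is the step taken on the previous iteration.
    (alpha, beta) are picked here by a coarse grid search, a stand-in
    for the exact two-parameter minimization used in the paper."""
    x = np.asarray(x0, dtype=float)
    prev_step = np.zeros_like(x)          # delta_x_hat, zero on the first iteration
    for _ in range(iters):
        g = grad(x)
        best_val, best_step = f(x), np.zeros_like(x)
        for a in alphas:
            for b in betas:
                step = -a * g + b * prev_step
                val = f(x + step)
                if val < best_val:
                    best_val, best_step = val, step
        x = x + best_step                 # x_tilde = x + delta_x
        prev_step = best_step
    return x

# Illustrative test problem (an assumption): ill-conditioned quadratic
f = lambda x: 0.5 * (x[0]**2 + 10.0 * x[1]**2)
grad = lambda x: np.array([x[0], 10.0 * x[1]])
x_min = memory_gradient_minimize(f, grad, np.array([10.0, 1.0]))
```

Because the grid includes β = 0, each step is at least as good as a grid-searched steepest-descent step; the β term lets the iterate reuse the previous direction, which is the extra degree of freedom the abstract credits for the faster initial convergence.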
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1007/BF00929359
Permalink