  • 1
    Electronic Resource
    Springer
    Journal of Optimization Theory and Applications 22 (1977), pp. 487-508
    ISSN: 1573-2878
    Keywords: Derivative-free algorithms; Q-superlinear convergence; function minimization; mathematical programming; numerical methods
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract: A convergence analysis is presented for a general class of derivative-free algorithms for minimizing a function f(x) for which the analytic form of the gradient and the Hessian is impractical to obtain. The class of algorithms accepts finite-difference approximations to the gradient, with stepsizes chosen so that each stepsize satisfies two conditions involving the previous stepsize and the distance from the last estimate of the solution to the current estimate. The algorithms also maintain an approximation to the second-derivative matrix and require that the change in x made at each iteration be subject to a bound that is also revised automatically. The convergence theorems have the features that the starting point x_1 need not be close to the true solution and f(x) need not be convex. Furthermore, despite the fact that the second-derivative approximation may not converge to the true Hessian at the solution, the rate of convergence is still Q-superlinear. The theory is also shown to be applicable to a modification of Powell's dog-leg algorithm. (An illustrative sketch of this algorithm class follows the record.)
    Type of Medium: Electronic Resource
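The abstract describes the algorithm class only in prose. Below is a minimal Python sketch of such a method, offered purely as illustration and not as the paper's algorithm: the forward-difference gradient, the symmetric rank-one update for the second-derivative approximation, the rule tying the finite-difference stepsize h to the previous stepsize and to the distance between iterates, and the expand/shrink rule for the step bound delta are all assumptions, since the abstract specifies none of these details.

    import numpy as np

    def fd_gradient(f, x, h):
        # Forward-difference approximation to the gradient of f at x,
        # using the scalar stepsize h.
        fx = f(x)
        g = np.empty(x.size)
        for i in range(x.size):
            e = np.zeros(x.size)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    def minimize_fd_quasi_newton(f, x, h=1e-4, delta=1.0, tol=1e-5, max_iter=200):
        # B is the maintained second-derivative (Hessian) approximation.
        B = np.eye(x.size)
        g = fd_gradient(f, x, h)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            try:
                s = np.linalg.solve(B, -g)   # quasi-Newton step
            except np.linalg.LinAlgError:
                s = -g                       # fall back to steepest descent
            # Bound the change in x by delta; delta is revised automatically below.
            if np.linalg.norm(s) > delta:
                s *= delta / np.linalg.norm(s)
            x_new = x + s
            # Tie the finite-difference stepsize to its previous value and to the
            # distance between successive iterates (an assumed stand-in for the
            # two conditions the abstract mentions).
            h = min(h, max(np.linalg.norm(s), 1e-10))
            g_new = fd_gradient(f, x_new, h)
            # Symmetric rank-one update of B from the observed gradient change
            # (an assumed stand-in; the abstract does not give the update formula).
            y = g_new - g
            r = y - B @ s
            d = r @ s
            if abs(d) > 1e-12 * np.linalg.norm(r) * np.linalg.norm(s):
                B = B + np.outer(r, r) / d
            if f(x_new) < f(x):              # accept the step, expand the bound
                x, g = x_new, g_new
                delta *= 2.0
            else:                            # reject the step, shrink the bound
                delta *= 0.5
        return x

For example, applied to the Rosenbrock function from a starting point that is far from the minimizer, as the convergence theorems allow:

    rosenbrock = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
    x_min = minimize_fd_quasi_newton(rosenbrock, np.array([-1.2, 1.0]))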