Library

  • 1
    Electronic Resource
    Springer
    Mathematical programming 89 (2000), pp. 149-185
    ISSN: 1436-4646
    Keywords: constrained optimization ; interior point method ; large-scale optimization ; nonlinear programming ; primal method ; primal-dual method ; SQP iteration ; barrier method ; trust region method. Mathematics Subject Classification (1991): 20E28, 20G40, 20C20
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract: An algorithm for minimizing a nonlinear function subject to nonlinear inequality constraints is described. It applies sequential quadratic programming techniques to a sequence of barrier problems, and uses trust regions to ensure the robustness of the iteration and to allow the direct use of second-order derivatives. This framework permits primal and primal-dual steps, but the paper focuses on the primal version of the new algorithm. An analysis of the convergence properties of this method is presented.
    Type of Medium: Electronic Resource
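
As context for the abstract above, here is a minimal sketch of the barrier framework it describes: a sequence of log-barrier problems with a decreasing barrier parameter mu. The toy objective, the single constraint, and the backtracking gradient step are all assumptions for illustration; the paper's actual algorithm takes trust-region SQP steps on each barrier problem.

```python
import numpy as np

# Minimize f(x) subject to c(x) >= 0 by approximately minimizing the
# log-barrier function phi_mu(x) = f(x) - mu * sum(log(c_i(x))) for a
# decreasing sequence of barrier parameters mu. A simplified stand-in
# for the paper's trust-region SQP iteration on each barrier problem.
f = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2    # toy objective (assumed)
c = lambda x: np.array([1.0 - x[0]**2 - x[1]**2])  # toy constraint c(x) >= 0

def grad(fun, x, h=1e-6):
    # Forward-difference gradient, used here only for brevity.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (fun(x + e) - fun(x)) / h
    return g

def barrier_solve(x, mu, iters=200):
    phi = lambda x: f(x) - mu * np.sum(np.log(c(x)))
    for _ in range(iters):
        g = grad(phi, x)
        alpha = 1.0
        # Backtrack until the step stays strictly feasible and decreases phi.
        while np.any(c(x - alpha * g) <= 0) or phi(x - alpha * g) > phi(x):
            alpha *= 0.5
            if alpha < 1e-12:
                return x
        x = x - alpha * g
    return x

x = np.array([0.0, 0.0])
for mu in [1.0, 0.1, 0.01, 0.001]:
    x = barrier_solve(x, mu)
print(x)  # approaches the minimizer of f on the unit disk
```
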
  • 2
    Electronic Resource
    Springer
    Mathematical programming 49 (1990), pp. 285-323
    ISSN: 1436-4646
    Keywords: Constrained optimization ; reduced Hessian methods ; quasi-Newton methods ; successive quadratic programming ; nonlinear programming
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract: We study the convergence properties of reduced Hessian successive quadratic programming for equality constrained optimization. The method uses a backtracking line search, and updates an approximation to the reduced Hessian of the Lagrangian by means of the BFGS formula. Two merit functions are considered for the line search: the ℓ1 function and the Fletcher exact penalty function. We give conditions under which local and superlinear convergence is obtained, and also prove a global convergence result. The analysis allows the initial reduced Hessian approximation to be any positive definite matrix, and does not assume that the iterates converge, or that the matrices are bounded. The effects of a second-order correction step, of a watchdog procedure, and of the choice of null space basis are considered. This work can be seen as an extension to reduced Hessian methods of the well-known results of Powell (1976) for unconstrained optimization.
    Type of Medium: Electronic Resource
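
For reference alongside the abstract above, this is the standard BFGS formula it builds on, in a minimal sketch with assumed names; the paper applies the update to the reduced Hessian of the Lagrangian rather than to a full Hessian approximation.

```python
import numpy as np

# Standard BFGS update of a Hessian approximation B from a step s and a
# gradient difference y. Assumes the curvature condition s @ y > 0, which
# line search strategies of the kind analyzed in the paper are meant to
# ensure.
def bfgs_update(B, s, y):
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```

In the reduced Hessian setting, s and y are (roughly) the tangential step and the difference of reduced gradients, so B approximates Z^T W Z rather than the full Hessian.
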
  • 3
    Electronic Resource
    Springer
    Mathematical programming 61 (1993), pp. 19-37
    ISSN: 1436-4646
    Keywords: Self-scaling ; BFGS method ; quasi-Newton method ; optimization
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract: We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in addition, the search directions were well-scaled, but we show that this is not always the case. We find that the method has a major drawback: to achieve superlinear convergence it may be necessary to evaluate the function twice per iteration, even very near the solution. An example is constructed to show that the step-sizes required to achieve a superlinear rate converge to 2 and 0.5 alternately.
    Type of Medium: Electronic Resource
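
A minimal sketch of the self-scaling update analyzed above, with assumed names: the current approximation is rescaled by the Oren and Luenberger factor tau before the ordinary BFGS formula is applied. The paper's line search conditions and convergence analysis are not reproduced here.

```python
import numpy as np

# Self-scaling BFGS: rescale B by tau = (y @ s) / (s @ B @ s) before the
# standard BFGS update (a sketch of the scaling idea, not the full method).
def self_scaling_bfgs_update(B, s, y):
    tau = (y @ s) / (s @ (B @ s))
    B = tau * B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```
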
  • 4
    Electronic Resource
    Springer
    Mathematical programming 45 (1989), pp. 503-528
    ISSN: 1436-4646
    Keywords: Large scale nonlinear optimization ; limited memory methods ; partitioned quasi-Newton method ; conjugate gradient method
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract: We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better able to use additional storage to accelerate convergence. We show that the L-BFGS method can be greatly accelerated by means of a simple scaling. We then compare the L-BFGS method with the partitioned quasi-Newton method of Griewank and Toint (1982a). The results show that, for some problems, the partitioned quasi-Newton method is clearly superior to the L-BFGS method. However, we find that for other problems the L-BFGS method is very competitive due to its low iteration cost. We also study the convergence properties of the L-BFGS method, and prove global convergence on uniformly convex problems.
    Type of Medium: Electronic Resource
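
The L-BFGS method compared above is usually implemented with the two-loop recursion, sketched below with assumed names. The gamma scaling of the initial matrix is the kind of simple scaling the abstract reports as greatly accelerating the method; this is a standard sketch, not the paper's code.

```python
import numpy as np

# Two-loop recursion for the L-BFGS direction -H_k @ g, using only the m
# most recent correction pairs. s_list and y_list hold the pairs
# oldest-first.
def lbfgs_direction(g, s_list, y_list):
    q = g.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    # Initial matrix H_0 = gamma * I; this simple diagonal scaling is the
    # kind of modification reported to accelerate the method in practice.
    gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    r = gamma * q
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r = r + (a - b) * s
    return -r  # search direction
```
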
  • 5
    Electronic Resource
    Springer
    Computing 22 (1979), pp. 93-100
    ISSN: 1436-5057
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Description / Table of Contents: Summary: Davidon recently proposed a new approach to optimization problems that uses the idea of nonlinear scaling. In the present paper the algorithm is analyzed for the one-dimensional case. It is shown that the algorithm converges locally with quadratic Q-convergence, and its convergence properties are compared with those of the method of cubic interpolation.
    Notes: Abstract: Davidon has recently introduced a new approach to optimization using the idea of nonlinear scaling. In this paper we study the algorithm that results when applying his ideas to the one-dimensional case. We show that the algorithm is locally convergent with Q-order equal to 2 and compare it with the method of cubic interpolation.
    Type of Medium: Electronic Resource
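
Because both summaries above turn on a local Q-order of 2, here is a generic numerical illustration of estimating the Q-order from iterates. Davidon's algorithm is not specified in this record, so Newton's method in one dimension stands in as the quadratically convergent example; everything in the snippet is assumed for illustration.

```python
import numpy as np

# Estimate the Q-order p of iterates x_k -> x* from successive errors via
# p ~ log|e_{k+1}| / log|e_k|. Newton's method on a classic cubic serves
# as the quadratically convergent stand-in.
f  = lambda x: x**3 - 2.0*x - 5.0
df = lambda x: 3.0*x**2 - 2.0
xs = [2.0]
for _ in range(6):
    xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))
xstar = xs[-1]
errs = [abs(x - xstar) for x in xs[:-1]]
for e0, e1 in zip(errs, errs[1:]):
    if e0 > 0 and e1 > 0:
        print(np.log(e1) / np.log(e0))  # tends toward 2 for Q-order 2
```
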
  • 6
    Electronic Resource
    Springer
    Mathematical programming 23 (1982), pp. 326-340
    ISSN: 1436-4646
    Keywords: Optimization ; Quasi-Newton ; Conjugate Gradient
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract: In this paper we study conjugate gradient algorithms for large optimization problems. These methods accelerate (or precondition) the conjugate gradient method by means of quasi-Newton matrices, and are designed to utilize a variable amount of storage, depending on how much information is retained in the quasi-Newton matrices. We are concerned with the behaviour of such methods on the underlying quadratic model, and in particular, with finite termination properties.
    Type of Medium: Electronic Resource
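
A minimal sketch of the preconditioned conjugate gradient iteration this abstract concerns, on the quadratic model 0.5 x^T A x - b^T x. In the paper the preconditioner is a quasi-Newton matrix built from a variable amount of stored information; here it enters only as an assumed inverse action Minv, so the sketch is generic rather than the authors' method.

```python
import numpy as np

# Preconditioned conjugate gradients for A x = b (equivalently, minimizing
# the quadratic model 0.5 x^T A x - b^T x), with the preconditioner
# supplied as its inverse action Minv(r).
def pcg(A, b, Minv, x=None, tol=1e-10, maxit=200):
    x = np.zeros_like(b, dtype=float) if x is None else x.copy()
    r = b - A @ x
    z = Minv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

Passing Minv = lambda r: r recovers plain conjugate gradients; a quasi-Newton Minv plays the accelerating role the abstract describes.
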
  • 7
    Electronic Resource
    Springer
    Mathematical programming 63 (1994), pp. 129-156
    ISSN: 1436-4646
    Keywords: Quasi-Newton method ; constrained optimization ; limited memory method ; large-scale optimization
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract: We derive compact representations of BFGS and symmetric rank-one matrices for optimization. These representations allow us to efficiently implement limited memory methods for large constrained optimization problems. In particular, we discuss how to compute projections of limited memory matrices onto subspaces. We also present a compact representation of the matrices generated by Broyden's update for solving systems of nonlinear equations.
    Type of Medium: Electronic Resource
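
The compact representation this paper derives for BFGS matrices can be stated directly. With S = [s_0, ..., s_{k-1}] and Y = [y_0, ..., y_{k-1}], D = diag(s_i^T y_i), and L the strictly lower triangle of S^T Y, the paper shows B_k = B_0 - [B_0 S, Y] W^{-1} [B_0 S, Y]^T with W = [[S^T B_0 S, L], [L^T, -D]]. A sketch with a consistency check against the recursive update; the test data are assumed.

```python
import numpy as np

# Compact BFGS representation:
#   B_k = B_0 - [B_0 S, Y] W^{-1} [B_0 S, Y]^T,
#   W   = [[S^T B_0 S, L], [L^T, -D]],
# with D = diag(s_i^T y_i) and L the strictly lower triangle of S^T Y.
def compact_bfgs(B0, S, Y):
    SY = S.T @ Y
    D = np.diag(np.diag(SY))
    L = np.tril(SY, -1)
    W = np.block([[S.T @ B0 @ S, L], [L.T, -D]])
    U = np.hstack([B0 @ S, Y])
    return B0 - U @ np.linalg.solve(W, U.T)

# Consistency check against two ordinary recursive BFGS updates.
rng = np.random.default_rng(0)
n = 5
B0 = np.eye(n)
S = rng.standard_normal((n, 2))
Y = S + 0.1 * rng.standard_normal((n, 2))  # keeps s_i^T y_i > 0 here
B = B0.copy()
for i in range(2):
    s, y = S[:, i], Y[:, i]
    Bs = B @ s
    B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
print(np.allclose(B, compact_bfgs(B0, S, Y)))  # expected: True
```
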
  • 8
    Electronic Resource
    Springer
    Computational optimization and applications 15 (2000), pp. 45-67
    ISSN: 1573-2894
    Keywords: Successive Quadratic Programming ; reduced Hessian methods ; constrained optimization ; quasi-Newton method ; large-scale optimization
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract: The reduced Hessian SQP algorithm presented in Biegler et al. [SIAM J. Optimization, Vol. 5, No. 2, pp. 314-347, 1995] is developed in this paper into a practical method for large-scale optimization. The novelty of the algorithm lies in the incorporation of a correction vector that approximates the cross term Z^T W Y p_Y. This improves the stability and robustness of the algorithm without increasing its computational cost. The paper studies how to implement the algorithm efficiently, and presents a set of tests illustrating its numerical performance. An analytic example, showing the benefits of the correction term, is also presented.
    Type of Medium: Electronic Resource
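
For context on the cross term Z^T W Y p_Y, here is the standard reduced Hessian SQP decomposition in which it arises (textbook background with assumed notation, not an excerpt from the paper). The step is split as p = Y p_Y + Z p_Z, with Z a basis for the null space of the constraint Jacobian, and the components solve:

```latex
% Standard null-space decomposition of the SQP step (assumed notation):
% constraints linearized as A^T p = -c, with A^T Z = 0.
\begin{aligned}
  (A^T Y)\, p_Y &= -c,\\
  (Z^T W Z)\, p_Z &= -\bigl(Z^T g + Z^T W Y\, p_Y\bigr).
\end{aligned}
```

Approximating Z^T W Y p_Y with a correction vector lets the method maintain only a quasi-Newton approximation of the reduced Hessian Z^T W Z, which is what keeps the cost low at large scale.
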
  • 9
    Book
    Title: Numerical optimization
    Author: Nocedal, Jorge
    Contributor: Wright, Stephen J.
    Publisher: Berlin [et al.]: Springer
    Year of publication: 1999
    Pages: XX, 636 pp.
    Series Statement: Springer series in operations research
    Type of Medium: Book
  • 10
    Book
    Title: Numerical optimization
    Author: Nocedal, Jorge
    Contributor: Wright, Stephen J.
    Edition: 2nd ed.
    Publisher: New York [et al.]: Springer
    Year of publication: 2006
    Pages: XX, 636 pp., ill. (graphs)
    Series Statement: Springer series in operations research and financial engineering
    ISBN: 978-0-387-30303-1, 0-387-30303-0
    Type of Medium: Book
    Language: English