Library

  • 1
    Electronic Resource
    Springer
    Numerische Mathematik 51 (1987), S. 395-414 
    ISSN: 0945-3245
    Keywords: AMS(MOS): Primary: 65K05 ; Secondary: 90C25
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Summary. This paper presents a readily implementable algorithm for solving constrained minimization problems involving (possibly nonsmooth) convex functions. The constraints are handled as in the successive quadratic approximations methods for smooth problems. An exact penalty function is employed for stepsize selection. A scheme for automatic limitation of penalty growth is given. Global convergence of the algorithm is established, as well as finite termination for piecewise linear problems. Numerical experience is reported.
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Springer
    Numerische Mathematik 45 (1984), S. 411-430 
    ISSN: 0945-3245
    Keywords: AMS MOS ; 90C30 ; 65K05 ; CR: 5.15
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Summary. An unconstrained nonlinear programming problem with nondifferentiabilities is considered. The nondifferentiabilities arise from terms of the form $\max[f_1(x), \ldots, f_n(x)]$, which may enter nonlinearly in the objective function. Local convex polyhedral upper approximations to the objective function are introduced. These approximations are used in an iterative method for solving the problem. The algorithm proceeds by solving quadratic programming subproblems to generate search directions. Approximate line searches ensure global convergence of the method to stationary points. The algorithm is conceptually simple and easy to implement. It generalizes efficient variable metric methods for minimax calculations.
    Type of Medium: Electronic Resource
  • 3
    Electronic Resource
    Springer
    Numerische Mathematik 68 (1994), S. 325-340 
    ISSN: 0945-3245
    Keywords: Mathematics Subject Classification (1991): 65K05
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Summary. A quadratic programming method is given for minimizing a sum of piecewise linear functions and a proximal quadratic term, subject to simple bounds on variables. It may be used for search direction finding in nondifferentiable optimization algorithms. An efficient implementation is described that updates a Cholesky factorization of active constraints and provides good accuracy via iterative refinement. Numerical experience is reported for some large problems.
    Type of Medium: Electronic Resource
  • 4
    Electronic Resource
    Springer
    Mathematical programming 69 (1995), S. 89-109 
    ISSN: 1436-4646
    Keywords: Nondifferentiable (nonsmooth) optimization ; Convex programming ; Mathematical programming ; Nonlinear programming ; Saddle-points ; Variational inequalities ; Bundle methods
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract. We study proximal level methods for convex optimization that use projections onto successive approximations of level sets of the objective corresponding to estimates of the optimal value. We show that they enjoy almost optimal efficiency estimates. We give extensions for solving convex constrained problems, convex-concave saddle-point problems and variational inequalities with monotone operators. We present several variants, establish their efficiency estimates, and discuss possible implementations. In particular, our methods require bounded storage in contrast to the original level methods of Lemaréchal, Nemirovskii and Nesterov. (A minimal level-method sketch appears after this list.)
    Type of Medium: Electronic Resource
  • 5
    Electronic Resource
    Springer
    Mathematical programming 85 (1999), S. 241-258 
    ISSN: 1436-4646
    Keywords: convex programming ; nondifferentiable optimization ; proximal methods ; bundle methods ; Bregman functions ; B-functions ; Mathematics Subject Classification (1991): 65K05, 90C25
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract. The method, a bundle version of the proximal method based on generalized Bregman functions, generates a sequence $\{x_k\}$ by taking $x_k$ to be an approximate minimizer of $\hat f_{k-1}(x) + D_h(x, x_{k-1})/t_k$, where $\hat f_{k-1}$ is a piecewise linear model of $f$ constructed from accumulated subgradient linearizations of $f$, $D_h$ is the D-function of a generalized Bregman function $h$ and $t_k > 0$. Convergence under implementable criteria is established by extending our recent framework of Bregman proximal minimization, which is of independent interest, e.g., for nonquadratic multiplier methods for constrained minimization. In particular, we provide new insights into the convergence properties of bundle methods based on $h = \tfrac{1}{2}|\cdot|^2$.
    Type of Medium: Electronic Resource
  • 6
    Electronic Resource
    Springer
    Mathematical programming 46 (1990), S. 105-122 
    ISSN: 1436-4646
    Keywords: Nondifferentiable minimization ; convex programming ; numerical methods ; descent methods
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract. Proximal bundle methods for minimizing a convex function $f$ generate a sequence $\{x^k\}$ by taking $x^{k+1}$ to be the minimizer of $\hat f^k(x) + u^k|x - x^k|^2/2$, where $\hat f^k$ is a sufficiently accurate polyhedral approximation to $f$ and $u^k > 0$. The usual choice of $u^k = 1$ may yield very slow convergence. A technique is given for choosing $\{u^k\}$ adaptively that eliminates sensitivity to objective scaling. Some encouraging numerical experience is reported. (A minimal sketch of this proximal bundle subproblem appears after this list.)
    Type of Medium: Electronic Resource
  • 7
    Electronic Resource
    Springer
    Mathematical programming 34 (1986), S. 175-187 
    ISSN: 1436-4646
    Keywords: Primary 65K05 ; Secondary 90C25 ; Nonsmooth Optimization ; Nondifferentiable Programming ; Linearly Constrained Minimization ; Descent Methods
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract. A readily implementable algorithm is given for minimizing a (possibly nondifferentiable and nonconvex) locally Lipschitz continuous function f subject to linear constraints. At each iteration a polyhedral approximation to f is constructed from a few previously computed subgradients and an aggregate subgradient, which accumulates the past subgradient information. This approximation and the linear constraints generate constraints in the search direction finding subproblem, which is a quadratic programming problem. Then a stepsize is found by an approximate line search. All the algorithm's accumulation points are stationary. Moreover, the algorithm converges when f happens to be convex.
    Type of Medium: Electronic Resource
  • 8
    Electronic Resource
    Springer
    Mathematical programming 85 (1999), S. 207-211 
    ISSN: 1436-4646
    Keywords: nondifferentiable optimization ; subgradient optimization ; Mathematics Subject Classification (1991): 90C25, 90C06, 90C30
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Type of Medium: Electronic Resource
  • 9
    Electronic Resource
    Springer
    Mathematical programming 52 (1991), S. 285-302 
    ISSN: 1436-4646
    Keywords: Nondifferentiable minimization ; convex programming ; exact penalty functions ; numerical methods ; descent methods ; proximal bundle methods ; Primary 65K05 ; Secondary 90C25
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract. This paper presents new versions of proximal bundle methods for solving convex constrained nondifferentiable minimization problems. The methods employ $\ell_1$ or $\ell_\infty$ exact penalty functions with new penalty updates that limit unnecessary penalty growth. In contrast to other methods, some of them are insensitive to problem function scaling. Global convergence of the methods is established, as well as finite termination for polyhedral problems. Some encouraging numerical experience is reported. The ideas presented may also be used in variable metric methods for smooth nonlinear programming. (The standard exact penalty form is recalled after this list.)
    Type of Medium: Electronic Resource
  • 10
    Electronic Resource
    Springer
    Applied mathematics & optimization 18 (1988), S. 163-180 
    ISSN: 1432-0606
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract. A descent method is given for minimizing a nondifferentiable function which can be locally approximated by pointwise minima of convex functions. At each iterate the algorithm finds several directions by solving several linear or quadratic programming subproblems. These directions are then used in an Armijo-like search for the next iterate. A feasible direction extension to inequality constrained minimization problems is also presented. The algorithms converge to points satisfying necessary optimality conditions which are sharper than the ones involved in convergence results for algorithms based on the Clarke subdifferential.
    Type of Medium: Electronic Resource
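
The proximal bundle subproblem described in entry 4 and, in Bregman or constrained variants, in entries 5, 6 and 9, can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not any of the cited implementations: the weight u is kept fixed rather than updated adaptively, the serious-step test and stopping rule are the simplest standard ones, and the quadratic subproblem is solved with SciPy's general-purpose SLSQP routine instead of a specialized QP solver.

```python
import numpy as np
from scipy.optimize import minimize

def proximal_bundle(f, subgrad, x0, u=1.0, tol=1e-6, max_iter=100):
    # Sketch of one proximal bundle loop: x_{k+1} approximately minimizes
    # fhat_k(x) + (u/2)|x - x_k|^2, with fhat_k the cutting-plane model
    # built from the subgradient linearizations collected so far.
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = f(x)
    bundle = [(x.copy(), fx, subgrad(x))]          # cuts: (point, value, subgradient)
    for _ in range(max_iter):
        # Subproblem in z = (d, v): minimize v + (u/2)|d|^2
        # subject to v >= f(y_i) + g_i.(x + d - y_i) for every stored cut.
        obj = lambda z: z[n] + 0.5 * u * z[:n].dot(z[:n])
        cons = [{'type': 'ineq',
                 'fun': (lambda z, y=y, fy=fy, g=g:
                         z[n] - (fy + g.dot(x + z[:n] - y)))}
                for (y, fy, g) in bundle]
        res = minimize(obj, np.zeros(n + 1), method='SLSQP', constraints=cons)
        d, v = res.x[:n], res.x[n]
        if fx - v < tol:                           # model predicts negligible decrease
            return x
        y = x + d
        fy, gy = f(y), subgrad(y)
        if fx - fy >= 0.5 * (fx - v):              # serious step: enough actual decrease
            x, fx = y, fy
        bundle.append((y, fy, gy))                 # serious or null step: enrich the model
    return x

# Tiny usage example on the nonsmooth convex function f(x) = |x_1| + 2|x_2|.
f = lambda x: abs(x[0]) + 2 * abs(x[1])
g = lambda x: np.array([np.sign(x[0]), 2 * np.sign(x[1])])
print(proximal_bundle(f, g, x0=[3.0, -2.0]))       # should approach the minimizer (0, 0)
```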
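
Entry 4's level methods replace the proximal term by a projection onto a level set of the cutting-plane model. The sketch below is again only an illustration under simplifying assumptions: the feasible set is a box, the level parameter is fixed at 0.5, and none of the bounded-storage or saddle-point extensions from the paper are included; generic SciPy LP and SLSQP calls stand in for specialized solvers.

```python
import numpy as np
from scipy.optimize import linprog, minimize

def level_method(f, subgrad, x0, lo, hi, lam=0.5, tol=1e-6, max_iter=100):
    # Sketch of a level method on the box X = [lo, hi]^n: build a
    # cutting-plane model, bracket the optimal value, then project the
    # current point onto a level set of the model.
    x = np.asarray(x0, dtype=float)
    n = x.size
    cuts = []                                      # (point, value, subgradient)
    f_best = np.inf
    for _ in range(max_iter):
        fx, gx = f(x), subgrad(x)
        f_best = min(f_best, fx)
        cuts.append((x.copy(), fx, gx))
        # Lower bound: minimize the model over X via an LP in (x, v).
        c = np.r_[np.zeros(n), 1.0]                # objective: minimize v
        A = np.array([np.r_[g, -1.0] for (y, fy, g) in cuts])
        b = np.array([g.dot(y) - fy for (y, fy, g) in cuts])
        lp = linprog(c, A_ub=A, b_ub=b,
                     bounds=[(lo, hi)] * n + [(None, None)])
        f_low = lp.fun
        if f_best - f_low < tol:                   # upper and lower bounds have met
            return x
        level = f_low + lam * (f_best - f_low)     # target level between the bounds
        # Next iterate: project x onto {z in X : model(z) <= level}.
        cons = [{'type': 'ineq',
                 'fun': (lambda z, y=y, fy=fy, g=g:
                         level - (fy + g.dot(z - y)))}
                for (y, fy, g) in cuts]
        res = minimize(lambda z: 0.5 * (z - x).dot(z - x), x,
                       method='SLSQP', bounds=[(lo, hi)] * n, constraints=cons)
        x = res.x
    return x

# Tiny usage example: minimize f(x) = |x_1 - 1| + |x_2 + 1| over [-5, 5]^2.
f = lambda x: abs(x[0] - 1) + abs(x[1] + 1)
g = lambda x: np.array([np.sign(x[0] - 1), np.sign(x[1] + 1)])
print(level_method(f, g, x0=[4.0, 4.0], lo=-5.0, hi=5.0))  # should approach (1, -1)
```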
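
Entries 1 and 9 handle constraints through exact penalty functions with automatic limitation of penalty growth. The records do not reproduce the papers' specific penalty updates; for orientation only, the standard exact ($\ell_1$) penalty for a convex problem $\min f(x)$ subject to $g_j(x) \le 0$, $j = 1, \dots, m$, has the form

$$P_c(x) = f(x) + c \sum_{j=1}^{m} \max\{0,\, g_j(x)\}, \qquad c > 0,$$

and, under standard assumptions (e.g., Slater's condition and bounded multipliers), its unconstrained minimizers solve the constrained problem once $c$ is sufficiently large but finite, which is why such methods monitor and limit the growth of $c$.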