ISSN:
1572-9036
Keywords:
60F05; 60F17; 62L20; Random search methods; Markovian sequences; speed of convergence
Source:
Springer Online Journal Archives 1860-2000
Topics:
Mathematics
Notes:
Abstract: The essence of this article lies in a demonstration of the fact that for some random search methods (r.s.m.) of global optimization, the number of objective function evaluations required to reach a given accuracy may have very slow (logarithmic) growth to infinity as the accuracy tends to zero. Several inequalities of this kind are derived for some typical Markovian monotone r.s.m. in metric spaces, including the d-dimensional Euclidean space ℝ^d and its compact subsets. In the compact case, one of the main results may be briefly outlined as a constructive existence theorem: if τ_ɛ is the first moment of approaching a 'good' subset of the ɛ-neighbourhood of x_0 = arg max f by some random search sequence (r.s.s.), then the parameters of this r.s.s. may be chosen so that E τ_ɛ ⩽ c(f) ln² ɛ. Certainly, some restrictions on the metric space and the function f are required.
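As a rough illustration of the kind of Markovian monotone random search the abstract refers to, the sketch below maximizes an objective by proposing random perturbations of the current point and accepting only improvements (the monotonicity property). All parameter choices here (Gaussian proposals, geometric radius shrinking) are illustrative assumptions, not the paper's construction.

```python
import random

def monotone_random_search(f, x0, n_iters=2000, radius=1.0, shrink=0.999, seed=0):
    """Illustrative monotone random search (maximization).

    At each step, propose a Gaussian perturbation of the current point and
    accept it only if it improves f; the search radius slowly shrinks.
    This is a sketch under assumed parameters, not the article's r.s.m.
    """
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(n_iters):
        y = [xi + radius * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy > fx:          # monotonicity: only improvements are accepted
            x, fx = y, fy
        radius *= shrink     # assumed cooling schedule for the step size
    return x, fx

# Example: maximize f(x) = -||x - (1, 2)||^2, whose maximizer is (1, 2).
f = lambda x: -((x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2)
x_best, f_best = monotone_random_search(f, [0.0, 0.0])
```

The resulting sequence of accepted points is Markovian (each state depends only on the current one) and monotone in f; the article's results concern how the expected hitting time of an ɛ-neighbourhood of the maximizer can be made to grow only logarithmically in 1/ɛ for suitably tuned sequences of this general type.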
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1007/BF00995496