Convergence properties of simulated annealing for continuous global optimization

1996 ◽  
Vol 33 (4) ◽  
pp. 1127-1140 ◽  
Author(s):  
M. Locatelli

In this paper, conditions for the convergence of a class of simulated annealing algorithms for continuous global optimization are given. The previous literature on the subject provides convergence results for algorithms in which the next candidate point is generated according to a probability distribution whose support is the whole feasible set. A class of possible cooling schedules is introduced in order to remove this restriction.
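The setting the abstract describes can be made concrete with a minimal sketch: simulated annealing on a box-constrained one-dimensional problem where the candidate point comes from a local Gaussian proposal (so its support is a neighborhood of the current point, not the whole feasible set). The geometric cooling schedule, proposal width, and test function below are illustrative assumptions, not the paper's conditions.

```python
import math
import random

def simulated_annealing(f, x0, lower, upper, sigma=0.5, t0=1.0,
                        alpha=0.99, n_iter=5000):
    """Minimize f over [lower, upper] using a local Gaussian proposal.

    The proposal support is a neighborhood of the current point rather
    than the whole feasible set; a simple geometric cooling schedule
    stands in for the schedules analyzed in the paper.
    """
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(n_iter):
        # Local candidate: clip a Gaussian step back into the feasible box.
        y = min(upper, max(lower, x + random.gauss(0.0, sigma)))
        fy = f(y)
        # Metropolis rule: always accept improvements, otherwise accept
        # with probability exp(-(fy - fx) / t).
        if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = x, fx
        t *= alpha  # geometric cooling (illustrative choice)
    return best_x, best_f

# Multimodal test function; the global minimum lies near x = -0.31.
random.seed(0)
xm, fm = simulated_annealing(
    lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0,
    x0=3.0, lower=-5.0, upper=5.0)
```

With a support-restricted proposal like this, convergence hinges on the cooling schedule, which is exactly the gap the paper's class of schedules addresses.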


2000 ◽  
Vol 32 (2) ◽  
pp. 480-498 ◽  
Author(s):  
G. Yin

This work develops a class of stochastic global optimization algorithms: Kiefer-Wolfowitz (KW) type procedures with an added perturbing noise and partial step-size restarting. The motivation stems from the wide use of KW-type procedures and Monte Carlo versions of simulated annealing algorithms in applications. Using weak convergence methods, we prove the convergence of the underlying algorithms under general noise processes.
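The core recursion can be sketched as a KW finite-difference iteration with an injected, slowly decaying perturbing noise, in the spirit of x_{n+1} = x_n - a_n g_n + b_n w_n. The particular gain sequences a_n, c_n, b_n below are illustrative choices, not the paper's conditions, and the restarting component is omitted for brevity.

```python
import math
import random

def kw_with_perturbation(f_noisy, x0, a=1.0, c=0.5, b=1.0, n_iter=2000):
    """KW-type iteration with an added perturbing noise term.

    g_n is a two-sided finite-difference gradient estimate built from
    noisy function evaluations; w_n is injected Gaussian noise whose
    scale b_n decays slowly, mimicking an annealing effect.
    """
    x = x0
    for n in range(1, n_iter + 1):
        a_n = a / n                                # step size
        c_n = c / n ** 0.25                        # finite-difference width
        b_n = b / math.sqrt(n * math.log(n + 1))   # perturbation scale
        g = (f_noisy(x + c_n) - f_noisy(x - c_n)) / (2.0 * c_n)
        x = x - a_n * g + b_n * random.gauss(0.0, 1.0)
    return x

# Noisy observations of (x - 2)^2; the minimizer is x = 2.
random.seed(7)
x_star = kw_with_perturbation(
    lambda x: (x - 2.0) ** 2 + random.gauss(0.0, 0.1), x0=0.0)
```

The perturbing noise term is what distinguishes this from a plain KW scheme: its slow decay lets the iterate escape poor local minima early on while still permitting convergence as b_n vanishes.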


2000 ◽  
Vol 9 (1) ◽  
pp. 3-25 ◽  
Author(s):  
BENJAMIN W. WAH ◽  
TAO WANG

This paper studies various strategies in constrained simulated annealing (CSA), a global optimization algorithm that achieves asymptotic convergence to constrained global minima (CGM) with probability one for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for discrete constrained local minima (CLM) in the theory of discrete Lagrange multipliers and its extensions to continuous and mixed-integer constrained NLPs. The strategies studied include adaptive neighborhoods, distributions to control sampling, acceptance probabilities, and cooling schedules. We report solutions of considerably better quality than the best-known solutions in the literature on two sets of continuous benchmarks and their discretized versions.
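The basic CSA mechanism can be sketched as annealing over a Lagrangian L(x, λ) = f(x) + λ|h(x)|: trial moves perform probabilistic descent in the variable space and probabilistic ascent in the multiplier space. This toy version uses a single continuous constraint and fixed move probabilities; the adaptive neighborhoods, sampling distributions, and tuned schedules studied in the paper are replaced by illustrative assumptions.

```python
import math
import random

def csa(f, h, x0, t0=10.0, alpha=0.95, n_iter=8000,
        lam_step=0.5, x_step=0.3):
    """Constrained SA sketch on L(x, lam) = f(x) + lam * |h(x)|.

    Moves perturb either x (accepted when the Lagrangian decreases, in
    the Metropolis sense: descent) or lam (accepted when the Lagrangian
    increases: ascent). Single equality constraint h(x) = 0.
    """
    def L(x, lam):
        return f(x) + lam * abs(h(x))

    x, lam, t = x0, 0.0, t0
    for _ in range(n_iter):
        if random.random() < 0.8:
            # Descent move in variable space.
            y = x + random.gauss(0.0, x_step)
            d = L(y, lam) - L(x, lam)
            if d <= 0 or random.random() < math.exp(-d / t):
                x = y
        else:
            # Ascent move in multiplier space; lam stays nonnegative.
            mu = max(0.0, lam + random.uniform(-lam_step, lam_step))
            d = L(x, mu) - L(x, lam)
            if d >= 0 or random.random() < math.exp(d / t):
                lam = mu
        t *= alpha  # illustrative geometric cooling
    return x, lam

# Minimize x^2 subject to x - 1 = 0; the constrained minimum is x = 1.
random.seed(3)
x_star, lam_star = csa(lambda x: x * x, lambda x: x - 1.0, x0=-2.0)
```

The ascent moves drive the multiplier up while the constraint is violated, so over time the penalty term forces the variable moves toward the feasible region, which is the intuition behind the convergence-to-CGM result the abstract states.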

