

SIMULATED ANNEALING
ARAVIND SUDHEESAN
ROLL NO: 1

BIJU B
ROLL NO: 2

ANNEALING: MEANING IN METALLURGY
At high temperature the movement of atoms increases (molten state), and vice versa
At low temperature the atoms become ordered and crystals develop with minimum energy
If the temperature is reduced at a faster rate, the crystalline state does not occur; instead a polycrystalline state (a higher-energy state) occurs
If the temperature is reduced at a slower rate, the absolute minimum-energy state is reached

SIMULATED ANNEALING
Resembles the cooling process of molten metal through annealing
The cooling phenomenon is simulated by controlling the temperature parameter T in the Boltzmann probability distribution
For a system in thermal equilibrium at temperature T, the probability of being at energy state E can be represented as P(E) = e^(−E/kT), where k is the Boltzmann constant
A system at high temperature has a nearly uniform probability of being at any energy state
A system at low temperature has only a small probability of being at a higher energy state
Convergence of the algorithm can be controlled by controlling T and assuming that the search process follows the Boltzmann probability distribution
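The effect of temperature on this probability can be illustrated with a short sketch (a minimal illustration, assuming k = 1 and treating E as the energy gap above the minimum-energy state):

```python
import math

def boltzmann_prob(E, T, k=1.0):
    """P(E) = exp(-E / (k*T)): relative probability of occupying an
    energy state E above the minimum, at temperature T."""
    return math.exp(-E / (k * T))

# High temperature: even a large energy gap is almost as probable as E = 0.
print(boltzmann_prob(E=10.0, T=1000.0))
# Low temperature: the same gap becomes very improbable.
print(boltzmann_prob(E=10.0, T=1.0))
```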

METROPOLIS METHOD
Metropolis introduced a method to implement the Boltzmann distribution
Consider an instant where the current point is x(t) and the function value at that point is E(t) = f(x(t))
Using the Metropolis algorithm, the probability of the next point being x(t+1) depends on the difference between the function values of the two points,
ΔE = E(t+1) − E(t)
and is calculated using the Boltzmann probability distribution:
P(E(t+1)) = min[1, e^(−ΔE/kT)]

If ΔE ≤ 0, the probability is 1 and the point x(t+1) is accepted

If ΔE > 0, the point x(t+1) is worse than x(t)
The point x(t+1) is then treated in a special way:
A random number r is drawn in the range (0, 1)
If r ≤ e^(−ΔE/kT), the point is accepted and we set t = t+1. If not, the point x(t+1) is rejected and
we create a new point for analysis
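The acceptance rule above can be written as a small helper (a sketch; the function name is illustrative, and the constant k is absorbed into T, as the worked example later does):

```python
import math
import random

def metropolis_accept(delta_E, T, rng=random.random):
    """Always accept an improvement (delta_E <= 0); accept a worse
    point only with probability exp(-delta_E / T)."""
    if delta_E <= 0:
        return True
    r = rng()  # random number in (0, 1)
    return r <= math.exp(-delta_E / T)
```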

ALGORITHM
1. Choose an initial point x0 and a termination parameter ε. Set T to a
sufficiently high value, let the number of iterations per temperature be n, and set t = 0
2. Create a neighbourhood point x(t+1) = N(x(t)); usually a random point in
the neighbourhood is selected
3. If ΔE = E(x(t+1)) − E(x(t)) < 0, set t = t+1;
Else create a random number r in the range (0, 1). If r ≤ e^(−ΔE/kT), set t = t+1;
Else go to step 2
4. If |x(t+1) − x(t)| < ε and T is small, terminate;
Else lower T according to a cooling schedule and go to step 2
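The four steps can be sketched in Python (a minimal illustration, not a definitive implementation: the neighbourhood move, step size, per-temperature iteration count, and stopping temperature are all assumed values):

```python
import math
import random

def simulated_annealing(f, x0, T0, cooling=0.5, step=0.5,
                        n_per_temp=50, T_min=1e-3, seed=0):
    """Steps 1-4 above: random neighbour, Metropolis acceptance,
    geometric cooling until T is small."""
    rng = random.Random(seed)
    x = list(x0)
    fx, T = f(x), T0
    while T > T_min:
        for _ in range(n_per_temp):
            # Step 2: random point in the neighbourhood of x
            x_new = [xi + rng.uniform(-step, step) for xi in x]
            dE = f(x_new) - fx
            # Step 3: accept improvements always, worse points with
            # probability exp(-dE/T)
            if dE < 0 or rng.random() <= math.exp(-dE / T):
                x, fx = x_new, fx + dE
        # Step 4: cooling schedule
        T *= cooling
    return x, fx
```

With a high initial T almost every move is accepted, so the search wanders broadly; as T shrinks the loop behaves more and more like pure descent.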

EXAMPLE 1

Minimize the following problem using the simulated annealing method

minimize f(x1, x2) = (x1^2 + x2 − 11)^2 + (x1 + x2^2 − 7)^2
Step 1 - Iteration 1
Choose an initial point x0 = (2.5, 2.5)^T and a termination factor ε = 10^−3
To find the initial temperature T, we take the average of the function values at the points
(0,0), (0,5), (5,0), (5,5). So T = 405. Set the initial iteration counter to t = 0
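This choice of initial temperature can be checked directly (a small verification sketch, using the objective function from the problem statement):

```python
def f(x1, x2):
    return (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2

# Average of the function values at the four corner points
corners = [(0, 0), (0, 5), (5, 0), (5, 5)]
T0 = sum(f(x1, x2) for x1, x2 in corners) / len(corners)
print(T0)  # 405.0
```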

Step 2
We create a point in the neighbourhood of x0
We assume the random increments are Δx1 = 0.037 and Δx2 = −0.086
The new point is x1 = (2.537, 2.414)^T with a function value f(x1) = 6.482. Initially f(x0) = 8.125
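Both function values can be verified numerically (a quick check using the objective function from the problem statement):

```python
def f(x1, x2):
    return (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2

print(round(f(2.5, 2.5), 3))      # 8.125
print(round(f(2.537, 2.414), 3))  # 6.482
```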

Step 3
Now ΔE = f(x1) − f(x0) = 6.482 − 8.125 = −1.643
Since ΔE < 0 we accept the new point
We increment the counter to t = 1 and proceed to step 4

Step 4
Since x0 and x1 are not close enough for termination, we cannot terminate
One iteration is complete here
To limit the number of iterations we reduce T by half
So the new T = 0.5 × 405 = 202.5
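The halving used here is a geometric cooling schedule; the first few temperatures of the example can be listed with a short sketch:

```python
# Geometric cooling with factor 0.5, starting from the initial T = 405
T, schedule = 405.0, []
for _ in range(5):
    schedule.append(T)
    T *= 0.5
print(schedule)  # [405.0, 202.5, 101.25, 50.625, 25.3125]
```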

Step 2 - Iteration 2
A new point is now created in the neighbourhood of x1
We assume the increments are Δx1 = −0.426 and Δx2 = −1.810
The new point is x2 = (2.072, 0.604)^T and f(x2) = 58.067

Step 3
Now ΔE = f(x2) − f(x1) = 58.067 − 6.482 = 51.585
Since this quantity is positive, we use the Metropolis criterion to decide whether to
accept or reject the point
Assume a random number r = 0.649
The probability of accepting the new point is e^(−51.585/202.5) = 0.775
Since r < 0.775, the point is accepted
We set t = 2 and proceed
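This acceptance test can be reproduced directly (a check of the numbers above, with k absorbed into T):

```python
import math

p = math.exp(-51.585 / 202.5)  # acceptance probability at T = 202.5
print(round(p, 3))  # 0.775
r = 0.649
print(r < p)        # True, so the worse point x2 is accepted
```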

Step 4
The termination criterion is not met
The iteration is complete
So T = 0.5 × 202.5 = 101.25

Step 2 - Iteration 3
The next point is found in the vicinity of the current point
Here Δx1 = −0.103 and Δx2 = −2.812; x3 = (2.397, −0.312)^T and f(x3) = 51.287


Step 3
Now ΔE = f(x3) − f(x2) = 51.287 − 58.067 = −6.780
It is negative, so we accept the point and set t = 3

Step 4
The termination criterion is not satisfied
T = 0.5 × 101.25 = 50.625

Step 2 - Iteration 4
Here Δx1 = −1.103 and Δx2 = −0.779
As above, x4 = (1.397, 1.721)^T and f(x4) = 60.666

Step 3
ΔE = f(x4) − f(x3) = 60.666 − 51.287 = 9.379
Since this quantity is positive, we use the Metropolis criterion to decide whether to accept or
reject the point
Assume a random number r = 0.746 (drawn from the range 0 to 1)
The probability of accepting the new point is e^(−9.379/50.625) = 0.831
Since r < 0.831, we accept this point and set t = 4

Step 4
The termination criterion is not satisfied

Now T = 0.5 × 50.625 = 25.313. The iteration is complete

Step 2 - Iteration 5
Here Δx1 = −1.707 and Δx2 = −0.550
So x5 = (0.793, 1.950)^T and f(x5) = 76.697

Step 3
Here ΔE = f(x5) − f(x4) = 76.697 − 60.666 = 16.031
Since this quantity is positive, we use the Metropolis criterion
r = 0.793
e^(−16.031/25.313) = 0.531. Here r > 0.531, so we do not accept the point
We must now create a new point

Step 2 - Iteration 5 (repeated)
Here Δx1 = −0.809 and Δx2 = −0.411
So x6 = (1.691, 2.089)^T and f(x6) = 37.514

Step 3
Here ΔE = f(x6) − f(x4) = 37.514 − 60.666 = −23.152 (x5 was rejected, so the current point is still x4)
Since ΔE is negative, the point is accepted and t = 5

Step 4
T = 0.5 × 25.313 = 12.656

The quality of the final solution is not affected by the initial guess, except
that the computational effort increases with a poor starting point
This process continues until T is reduced to a small value
In the early stages of SA, almost any point is equally likely to be accepted
So the search space is investigated thoroughly before the algorithm converges to an
optimum solution
With sufficiently many iterations at each temperature and a sufficiently small cooling
rate, the algorithm guarantees convergence to the globally optimal
solution


Thank You
