Tutorial Session
Gaussian Mixture Model
Hidden Markov Model
Artificial Neural Network
Particle Swarm Optimization
GMM I
mu1 = rand(4, 1)
mu2 = rand(4, 1)
Sigma1 = rand(4)
Sigma2 = rand(4)
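Note that rand(4) will almost never produce a valid covariance matrix: it is neither symmetric nor positive definite, so the very first E-step can fail. A common fix is to build Σ as A·Aᵀ plus a diagonal boost. A NumPy sketch of that idea, for reference only (the function name is mine, not from the tutorial):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_covariance(d):
    """A random symmetric positive-definite matrix, usable as an initial covariance."""
    A = rng.random((d, d))
    return A @ A.T + d * np.eye(d)  # A*A' is PSD; the added diagonal keeps it well-conditioned

Sigma1 = random_covariance(4)
Sigma2 = random_covariance(4)
```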
GMM II
pi2 = rand
GMM III
GMM IV
M-Step:

µ_k^new = (1/N_k) Σ_{n=1}^{N} γ(z_nk) x_n

Σ_k^new = (1/N_k) Σ_{n=1}^{N} γ(z_nk) (x_n − µ_k^new)(x_n − µ_k^new)^T

π_k^new = N_k / N,  where N_k = Σ_{n=1}^{N} γ(z_nk)
Pseudo Code:
N1 = 0; N2 = 0;
for n = 1 : 1 : N
    N1 = N1 + gamma(n, 1);
    N2 = N2 + gamma(n, 2);
end
mu1_new = zeros(4, 1); mu2_new = zeros(4, 1);
Sigma1_new = zeros(4); Sigma2_new = zeros(4);
pi1_new = N1/N;
pi2_new = N2/N;
for n = 1 : 1 : N
    mu1_new = mu1_new + (1/N1) * gamma(n, 1) * X(n, :)';
    mu2_new = mu2_new + (1/N2) * gamma(n, 2) * X(n, :)';
end
% the covariances use the completed new means, so accumulate them in a second loop
for n = 1 : 1 : N
    Sigma1_new = Sigma1_new + (1/N1) * gamma(n, 1) * ...
        (X(n, :)' - mu1_new) * (X(n, :)' - mu1_new)';
    Sigma2_new = Sigma2_new + (1/N2) * gamma(n, 2) * ...
        (X(n, :)' - mu2_new) * (X(n, :)' - mu2_new)';
end
Save the new values for µk , Σk and πk into the old variables.
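The M-step accumulation can be cross-checked against a vectorized version of the same formulas. A NumPy sketch (not part of the tutorial's MATLAB code; the array names mirror the pseudocode and the shapes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
N, d, K = 100, 4, 2
X = rng.random((N, d))                      # one sample per row, like X(n, :)
gamma = rng.random((N, K))
gamma /= gamma.sum(axis=1, keepdims=True)   # responsibilities sum to 1 per sample

Nk = gamma.sum(axis=0)                      # N_k = sum_n gamma(z_nk)
pi_new = Nk / N                             # mixing weights
mu_new = (gamma.T @ X) / Nk[:, None]        # (K, d): responsibility-weighted means

Sigma_new = np.zeros((K, d, d))
for k in range(K):
    Xc = X - mu_new[k]                      # center on the *final* new mean
    Sigma_new[k] = (gamma[:, k, None] * Xc).T @ Xc / Nk[k]
```

Because Σ_k^new is defined through µ_k^new, the means must be fully accumulated before the covariance accumulation starts.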
Evaluate the log-likelihood function

ln P(X | µ, Σ, π) = Σ_{n=1}^{N} ln { Σ_{k=1}^{K} π_k N(x_n | µ_k, Σ_k) }
Pseudo Code:
J = 0; % J is the log-likelihood
for n = 1 : 1 : N
    d1 = X(n, :)' - mu1; % mu1, mu2 are 4x1 columns
    d2 = X(n, :)' - mu2;
    p1 = exp(-d1' * inv(Sigma1) * d1 / 2) / sqrt((2*pi)^4 * det(Sigma1));
    p2 = exp(-d2' * inv(Sigma2) * d2 / 2) / sqrt((2*pi)^4 * det(Sigma2));
    J = J + log(pi1 * p1 + pi2 * p2);
end
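The log-likelihood evaluation can be cross-checked in NumPy. This sketch includes the full Gaussian normalization constant 1/((2π)^{d/2} |Σ|^{1/2}), which is easy to drop by accident (the function name and shapes below are mine, not from the tutorial):

```python
import numpy as np

def log_likelihood(X, mus, Sigmas, pis):
    """ln P(X) = sum_n ln( sum_k pi_k N(x_n | mu_k, Sigma_k) ), constants included."""
    N, d = X.shape
    dens = np.zeros(N)
    for mu, Sigma, pi_k in zip(mus, Sigmas, pis):
        diff = X - mu
        maha = np.sum((diff @ np.linalg.inv(Sigma)) * diff, axis=1)
        norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
        dens += pi_k * np.exp(-maha / 2) / norm
    return float(np.sum(np.log(dens)))

# single standard-normal component as a sanity check: equals -3*log(2*pi) at the mean
ll = log_likelihood(np.zeros((3, 2)), [np.zeros(2)], [np.eye(2)], [1.0])
```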
GMM VII
Repeat the E-step and the M-step for a number of iterations, and check the log-likelihood for convergence; use a for loop for this.
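The convergence check can be as simple as stopping once the log-likelihood gain per iteration falls below a tolerance. A minimal Python sketch of that pattern; the E-step/M-step calls are replaced here by a hypothetical mock sequence of log-likelihood values:

```python
# Mock log-likelihood values standing in for real E-step/M-step iterations
# (hypothetical numbers, only to demonstrate the stopping rule).
mock_lls = [-500.0, -320.0, -250.0, -249.9, -249.89]

tol = 0.05                       # stop when the gain drops below this (assumed value)
prev_ll = float("-inf")
iterations = 0
for ll in mock_lls:              # in real code: gamma = E-step; params, ll = M-step
    iterations += 1
    if ll - prev_ll < tol:       # EM never decreases the log-likelihood
        break
    prev_ll = ll
```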
HMM I
Model 1: p11 = 0.8, p21 = 0.3, h11 = 0.9, h21 = 0.2
Model 2: p11 = 0.2, p21 = 0.7, h11 = 0.1, h21 = 0.8
HMM II
ANN I
ANN II
2. Create the arrays “TDATA1”, “TDATA2”, “VDATA1” and “VDATA2”. Save both “TDATA1” and “TDATA2” into a single array “TDATA”; similarly, combine the validation data into a single array.
ANN III
5. Obtain the hidden layer output, the output layer output and the error vectors:
for i = 1 : 1 : 10
    h(i, :) = 1 ./ (1 + exp(-(b + Tdata1(:, i)' * w))); % sigmoid, matching the h(1-h) terms in step 7
    o(i, :) = a + h(i, :) * v;
    e(i, :) = t(:, i)' - o(i, :);
end
6. SSE = sum(sum(e .* e))
ANN IV
7. Update the weights using the following:
w11(t + 1) = w11(t) + η h1(1 − h1) i1 [e1 v11 + e2 v12]
w12(t + 1) = w12(t) + η h2(1 − h2) i1 [e1 v21 + e2 v22]
w13(t + 1) = w13(t) + η h3(1 − h3) i1 [e1 v31 + e2 v32]
w21(t + 1) = w21(t) + η h1(1 − h1) i2 [e1 v11 + e2 v12]
w22(t + 1) = w22(t) + η h2(1 − h2) i2 [e1 v21 + e2 v22]
w23(t + 1) = w23(t) + η h3(1 − h3) i2 [e1 v31 + e2 v32]
v11(t + 1) = v11(t) + η h1 e1
v12(t + 1) = v12(t) + η h1 e2
v21(t + 1) = v21(t) + η h2 e1
v22(t + 1) = v22(t) + η h2 e2
v31(t + 1) = v31(t) + η h3 e1
v32(t + 1) = v32(t) + η h3 e2
ANN V
a1(t + 1) = a1(t) + η e1
a2(t + 1) = a2(t) + η e2
b1(t + 1) = b1(t) + η h1(1 − h1) [e1 v11 + e2 v12]
b2(t + 1) = b2(t) + η h2(1 − h2) [e1 v21 + e2 v22]
b3(t + 1) = b3(t) + η h3(1 − h3) [e1 v31 + e2 v32]
8. Repeat steps 5 to 7 for 5 epochs and check for the convergence of SSE.
9. Plot number of epochs versus SSE using plot.
10. Find the output using the final updated weights, as obtained in step 5.
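Steps 5 to 7 together are one pass of gradient descent on SSE for the implied 2-input, 3-hidden, 2-output network with a sigmoid hidden layer and linear outputs (the architecture is inferred from the h(1 − h) factors above, not stated explicitly). A NumPy sketch of a single-pattern version, with a hypothetical input x and target t:

```python
import numpy as np

rng = np.random.default_rng(2)
eta = 0.1                    # learning rate (assumed)
w = rng.random((2, 3))       # input-to-hidden weights w_ji
v = rng.random((3, 2))       # hidden-to-output weights v_jk
b = rng.random(3)            # hidden biases b_j
a = rng.random(2)            # output biases a_k

def train_step(x, t):
    """One pattern of steps 5-7: forward pass, error, weight update."""
    global w, v, a, b
    h = 1.0 / (1.0 + np.exp(-(b + x @ w)))   # sigmoid hidden layer output
    o = a + h @ v                            # linear output layer
    e = t - o                                # error vector
    delta_h = h * (1 - h) * (v @ e)          # the h(1-h)[e1 v_j1 + e2 v_j2] terms
    w += eta * np.outer(x, delta_h)          # matches the w_ji updates in step 7
    v += eta * np.outer(h, e)                # matches the v_jk updates
    a += eta * e
    b += eta * delta_h
    return float(np.sum(e * e))              # SSE contribution of this pattern

# hypothetical pattern, repeated to watch SSE fall (the convergence check of step 8)
x, t = np.array([0.5, -0.2]), np.array([1.0, 0.0])
sse = [train_step(x, t) for _ in range(50)]
```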
PSO I
Objective: to obtain (x, y) that minimizes the function (x − 0.2)^2 + (y − 0.3)^2.
1. Create the arrays for the INITIAL and TENTATIVE positions of the particles in the swarm, as given in the booklet.
2. Compute the functional values for all the particles for both
the INITIAL and TENTATIVE positions. Pseudo Code:
for i = 1 : 1 : N
    fval_initial(i) = (INITIAL(1, i) - 0.2)^2 + (INITIAL(2, i) - 0.3)^2;
    fval_tentative(i) = (TENTATIVE(1, i) - 0.2)^2 + (TENTATIVE(2, i) - 0.3)^2;
end
3. Obtain the GLOBAL best TENTATIVE position of the
particles that gives the minimum functional value.
4. Obtain the next positions of the particles using
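The update equation referenced in step 4 is cut off in this copy, so it is left as-is above. For reference, a NumPy sketch of steps 1 to 4 using the standard PSO velocity/position update rather than the booklet's INITIAL/TENTATIVE arrays; the particle count, initial velocities, inertia weight w and acceleration coefficients c1, c2 are assumptions, not values from the booklet:

```python
import numpy as np

rng = np.random.default_rng(3)
Np = 10                                     # number of particles (assumed)
pos = rng.random((2, Np))                   # positions, one column per particle
vel = 0.1 * rng.standard_normal((2, Np))    # initial velocities (assumed)

def f(p):
    """Objective from the slide: (x - 0.2)^2 + (y - 0.3)^2."""
    return (p[0] - 0.2) ** 2 + (p[1] - 0.3) ** 2

pbest = pos.copy()                          # personal best positions
init_best = float(f(pos).min())             # best value before any update
w, c1, c2 = 0.7, 1.5, 1.5                   # standard PSO coefficients (assumed)

for _ in range(100):
    gbest = pbest[:, np.argmin(f(pbest))]   # GLOBAL best position so far
    r1 = rng.random((2, Np))
    r2 = rng.random((2, Np))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest[:, None] - pos)
    pos = pos + vel                         # step 4: next positions
    improved = f(pos) < f(pbest)
    pbest[:, improved] = pos[:, improved]   # keep each particle's best

final_best = float(f(pbest[:, np.argmin(f(pbest))]))
```

With these coefficients the swarm should settle near the minimum at (0.2, 0.3).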
PSO III