
Sequential Monte Carlo Methods: a Survey

Elise Arnaud
Perception team, INRIA Rhône-Alpes, 655 avenue de l'Europe, 38330 Montbonnot, France

ARC Fantastik kick-off meeting, 8 March 2007

Elise Arnaud (elise.arnaud@inrialpes.fr)

Sequential Monte Carlo Methods

1 / 26

Outline

Introduction
Sequential Monte Carlo methods in theory
1. Problem statement
2. Overview of existing solutions
3. Principle of SMCM and the generic particle filter
4. Choice of the optimal importance function
5. Resampling
Conclusion and references


What do we want to do?
Estimate the state of a system from a set of observations.


Applications of Particle filtering

- positioning, navigation and tracking
- target tracking
- computer vision
- mobile robotics
- ambient intelligence
- sensor networks, etc.


Applications of Particle filtering

Among others:
- finance
- data assimilation
- environmental sciences (oceanography, meteorology, atmospheric pollution)
- simulation of rare events


Problem Statement

Dynamic system modeled as a Hidden Markov Chain:

    x_0 → x_1 → … → x_{k-1} → x_k     (hidden states)
           ↓          ↓        ↓
          z_1   …  z_{k-1}    z_k     (observations)

described by:
1. the initial distribution p(x_0)
2. an evolution model p(x_k | x_{0:k-1}, z_{1:k-1}) = p(x_k | x_{k-1})
3. a likelihood p(z_k | x_{0:k-1}, z_{1:k-1}) = p(z_k | x_k)


Problem Statement

- Filtering / tracking: estimation of the state given the past and present measurements, p(x_k | z_{1:k})
- Smoothing: estimation of the state given the past and some future measurements, p(x_k | z_{1:t}), t > k
- Prediction: estimation of a future state given the measurements up to a past time, p(x_k | z_{1:t}), t < k



Problem Statement
Toy example: tracking the white car

- state x_k: position + velocity
- evolution model: the car moves at constant velocity
- observation z_k: detected white cars
- observation model: the tracked car should be one of the detected cars

p(x_k | z_{1:k}): current position of the white car, given all previous and current detected white cars


Problem Statement
Objective: sequential estimation of the filtering distribution p(x_k | z_{1:k}), i.e. estimation of p(x_k | z_{1:k}) knowing p(x_{k-1} | z_{1:k-1})

Optimal Bayesian Filter
1. prediction: p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}
2. update: p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / ∫ p(z_k | x_k) p(x_k | z_{1:k-1}) dx_k

Problem: computation of the two integrals

Overview of existing solutions

Linear Gaussian models:

    x_k = F_k x_{k-1} + w_k,   w_k ~ N(0, Q_k)
    z_k = H_k x_k + v_k,       v_k ~ N(0, R_k)

- the filtering and predictive laws are Gaussian
- analytic expressions of the means and covariances follow from the equations of the optimal Bayesian solution
- Kalman filter [Kalman 60] [Anderson 79]: optimal propagation of the Gaussian probability density p(x_k | z_{1:k})

Overview of existing solutions

Linear Gaussian models: Kalman filter, from p(x_{k-1} | z_{1:k-1}) to p(x_k | z_{1:k})

prediction: x_k | z_{1:k-1} ~ N(x̂_{k|k-1}, Σ_{k|k-1})
    x̂_{k|k-1} = F_k x̂_{k-1|k-1}
    Σ_{k|k-1} = F_k Σ_{k-1|k-1} F_k^T + Q_k

update: x_k | z_{1:k} ~ N(x̂_{k|k}, Σ_{k|k})
    K_k = Σ_{k|k-1} H_k^T (H_k Σ_{k|k-1} H_k^T + R_k)^{-1}
    x̂_{k|k} = x̂_{k|k-1} + K_k [z_k - H_k x̂_{k|k-1}]
    Σ_{k|k} = (Id - K_k H_k) Σ_{k|k-1}
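The prediction/update recursion above fits in a few lines of NumPy. This is a minimal sketch, not part of the original slides; the matrices F, H, Q, R passed in are whatever model the user assumes.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One Kalman recursion: from N(x, P) = p(x_{k-1} | z_{1:k-1}) to p(x_k | z_{1:k})."""
    # prediction
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)    # correct the prediction with the innovation
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

For instance, a constant-velocity model takes F = [[1, 1], [0, 1]] and H = [1, 0], matching the evolution and observation models of the toy example.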


Overview of existing solutions

Non-linear / non-Gaussian case:

Extensions of the Kalman filter
- Extended Kalman filter [Jazwinski 70], Unscented Kalman filter [Julier 97]
- based on a Gaussian approximation of the filtering density
- adapted to weakly non-linear, unimodal cases

Grid-based methods
- discretization of the state space on a grid
- fixed or adaptive grid [Kitagawa 87] [Sorenson 88] [Cai 95]
- computationally heavy, limited to small dimensions (d < 4)

Sequential Monte Carlo methods (particle filters)
- independent of the dimension and of the non-linearity of the system


Principle of SMCM and the Generic Particle filter

- First document on Monte Carlo methods: [Metropolis 49]
- Key article for Sequential Monte Carlo methods: [Gordon 93]
- Principle: approximation of the filtering density by a swarm of N weighted particles {x_k^(i), w_k^(i)}_{i=1:N} (a particle is an element of the state space):

    p(x_k | z_{1:k}) ≈ Σ_{i=1:N} w_k^(i) δ_{x_k^(i)}(x_k)

A particle is a possible solution; its weight represents its quality.


Principle of SMCM and the Generic Particle filter

Objective: obtaining p(x_k | z_{1:k}) from p(x_{k-1} | z_{1:k-1}), i.e. obtaining {x_k^(i), w_k^(i)}_{i=1:N} from {x_{k-1}^(i), w_{k-1}^(i)}_{i=1:N}

Generic Particle filter
1. exploration of the state space
2. evaluation of the particles' quality with respect to the observations
3. mutation/selection of the particles


Principle of SMCM and the Generic Particle filter

Objective: obtaining {x_k^(i), w_k^(i)}_{i=1:N} from {x_{k-1}^(i), w_{k-1}^(i)}_{i=1:N}

Generic Particle filter
1. Sampling (exploration):
       x_k^(i) ~ π(x_k | x_{0:k-1}^(i), z_{1:k})
2. Calculation of the weights (evaluation; comes from the importance sampling principle):
       w_k^(i) ∝ w_{k-1}^(i) p(z_k | x_k^(i)) p(x_k^(i) | x_{k-1}^(i)) / π(x_k^(i) | x_{0:k-1}^(i), z_{1:k}),   with Σ_{i=1:N} w_k^(i) = 1
3. Resampling (mutation-selection)
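One step of this recursion can be sketched in NumPy. As an illustration (not the slides' code), the proposal π below is the prior p(x_k | x_{k-1}) — the bootstrap choice discussed next — so the weight ratio reduces to the likelihood; the scalar Gaussian random-walk model and noise levels are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_step(particles, weights, z, sigma_x=1.0, sigma_z=0.5):
    """One SMC recursion for a scalar random-walk state observed in Gaussian noise."""
    N = len(particles)
    # 1. sampling (exploration): pi = p(x_k | x_{k-1}), a Gaussian random walk
    particles = particles + sigma_x * rng.standard_normal(N)
    # 2. weighting (evaluation): with the prior as proposal, the weight update
    #    is w_k ∝ w_{k-1} p(z_k | x_k); work in log-space for numerical safety
    loglik = -0.5 * ((z - particles) / sigma_z) ** 2
    weights = weights * np.exp(loglik - loglik.max())
    weights /= weights.sum()
    # 3. resampling (mutation-selection): multinomial draw, then uniform weights
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)
```

Iterating `particle_step` over the observation sequence propagates the swarm, and the weighted (here, post-resampling uniform) particle mean estimates the filtered state.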


Choice of the importance function π(x_k | x_{0:k-1}, z_{1:k})

Bootstrap filter / condensation algorithm [Gordon 93] [Isard 96]
1. x_k^(i) ~ p(x_k | x_{k-1}^(i))
2. w_k^(i) ∝ p(z_k | x_k^(i))
3. systematic resampling

A very simple algorithm, but the exploration of the state space does not take the measurements into account:
- sample impoverishment
- wrong exploration


Choice of the importance function π(x_k | x_{0:k-1}, z_{1:k})

Optimal Particle Filter [Doucet 00]
1. x_k^(i) ~ p(x_k | x_{k-1}^(i), z_k)
2. w_k^(i) ∝ w_{k-1}^(i) p(z_k | x_{k-1}^(i))
3. adaptive resampling

- this optimal importance function minimizes the variance of the weights
- using this algorithm is rarely possible in practice; it is feasible for partially linear Gaussian models (linear measurement models, additive Gaussian noises) [Doucet 00] [Arnaud 05] [Arnaud 06]

Choice of the importance function π(x_k | x_{0:k-1}, z_{1:k})

Alternative suboptimal solutions to migrate the particles towards high-likelihood regions:
- Extended / Unscented Particle filter: approximation of the optimal importance function by an EKF or a UKF [Doucet 00]
- Auxiliary Particle filter [Pitt 01]
- Likelihood sampling [Fox 01] [Torma 04]
- Introduction of a Monte Carlo step [Musso 01] [Godsill 01]
- Use of image features to construct the importance function [Sullivan 01] [Perez 04]

Resampling

Resampling is a key step of Sequential Monte Carlo approaches. The basic idea of resampling at time k:
- multiply the particles x_k^(i) with high weights w_k^(i) and discard those with small weights
- give the resampled particles equal weights
- keep the number of particles fixed

Resampling

Resampling schemes:
- Multinomial resampling: copy x_k^(i) n_i times, where (n_1, ..., n_N) ~ M(N; w_k^(1), ..., w_k^(N)) [Doucet 00]
- Residual resampling [Liu 98]
- Stratified resampling [Carpenter 99]
- Branching resampling [Crisan 99]
- Mixture resampling [Vermaak 03]
- ...
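The multinomial scheme of the first bullet maps directly onto NumPy; a systematic variant (as used by the bootstrap filter) is sketched alongside for contrast. These are illustrative implementations, not the slides' own.

```python
import numpy as np

def multinomial_resample(particles, weights, rng):
    """Draw (n_1, ..., n_N) ~ M(N; w^(1), ..., w^(N)) and copy particle i n_i times."""
    counts = rng.multinomial(len(particles), weights)
    return np.repeat(particles, counts)

def systematic_resample(particles, weights, rng):
    """One uniform draw, N evenly spaced positions in the CDF: lower variance
    than independent multinomial draws."""
    N = len(particles)
    positions = (rng.random() + np.arange(N)) / N
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx]
```

Both return N particles; heavy particles are duplicated, light ones dropped, after which all weights are reset to 1/N.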

Data fusion

- one of the major advantages of particle filtering
- may be done through the likelihood: denoting z_k^m the measurements associated with the different modalities and p(z_k^m | x_k) their likelihoods, the global likelihood is

    p(z_k | x_k) = Π_m p(z_k^m | x_k)

- alternatively, some cues can be used in the sampling step and the others in the evaluation step
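In log-space the product over modalities becomes a sum, which is how the fused weight is usually computed in practice. A short sketch under assumed per-cue Gaussian likelihood models (the cues and noise levels are illustrative, not from the slides):

```python
import numpy as np

def fused_loglik(particles, cues):
    """Global log-likelihood: sum over modalities m of log p(z^m_k | x_k)."""
    total = np.zeros(len(particles))
    for z, sigma in cues:  # each cue: (measurement, noise std) of a Gaussian model
        total += -0.5 * ((z - particles) / sigma) ** 2 - np.log(sigma)
    return total
```

Exponentiating (after subtracting the maximum) yields the fused weights used in the evaluation step.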

Conclusion: why particle filtering?

- simple management of several tracking hypotheses, even in high-dimensional spaces
- constraint-free modeling:
  - enables the introduction of any a priori information
  - enables a simple fusion of different image cues
- multimodality, non-linearity and non-Gaussian noises are not a problem
- easy to implement


Conclusion

Advantages of particle filtering:
- easy to implement and to extend
- robust to clutter and occlusions
- plenty of theoretical results

Problems of particle filtering:
- jitter of the final estimate
- computational cost
- short-term multimodality

A good algorithm does not fix a weak model, and vice versa.

References

- The book: Sequential Monte Carlo methods in practice, eds. Doucet, de Freitas, Gordon, Springer-Verlag, 2001
- SMCM website: http://www-sigproc.eng.cam.ac.uk/smc/
- Surveys on SMCM: [Arulampalam 02] [Doucet 00]
- Convergence: [Crisan 02] [Del Moral 02]
- Multi-target tracking: [Vermaak 05]
- Target tracking: [Ristic 04]
- Tracking in image sequences: [Isard 98] [Perez 04]

References

[Arulampalam 02] M.S. Arulampalam, S. Maskell, N. Gordon, T. Clapp. A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Transactions on Signal Processing, 50(2):174-188, 2002.
[Crisan 02] D. Crisan, A. Doucet. A survey of convergence results on particle filtering methods for practitioners. IEEE Transactions on Signal Processing, 50(3):736-746, 2002.
[Del Moral 02] P. Del Moral, L. Miclo. Asymptotic stability of nonlinear semigroups of Feynman-Kac type. Annales de la Faculté des Sciences de Toulouse, 2002.
[Doucet 01] A. Doucet, N. de Freitas, N. Gordon, editors. Sequential Monte Carlo methods in practice. New York: Springer-Verlag, Series Statistics for Engineering and Information Science, 2001.

References

[Isard 98] M. Isard, A. Blake. Condensation - conditional density propagation for visual tracking. International Journal of Computer Vision, 29(1):5-28, 1998.
[Perez 04] P. Pérez, J. Vermaak, A. Blake. Data fusion for visual tracking with particles. Proceedings of the IEEE (special issue on State Estimation), 92(3):495-513, 2004.
[Ristic 04] B. Ristic, S. Arulampalam, N. Gordon. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House Publishers, 2004.
[Vermaak 02] J. Vermaak, C. Andrieu, A. Doucet, S.J. Godsill. Particle methods for Bayesian modeling and enhancement of speech signals. IEEE Transactions on Speech and Audio Processing, 10(3):173-185, 2002.
