Elise Arnaud
Perception team, Inria Rhône-Alpes, 655 avenue de l'Europe, 38330 Montbonnot, France
What do we want to do?

Compute the state of a system from a set of observations. This problem arises in positioning, navigation and tracking:
target tracking
computer vision
mobile robotics
ambient intelligence
sensor networks, etc.
Problem Statement

Dynamic system modeled as a hidden Markov chain:

x_0 → x_1 → ... → x_{k-1} → x_k    (hidden states)
      z_1 → ... → z_{k-1} → z_k    (observations)

described by
1. the initial distribution p(x_0)
2. an evolution model p(x_k | x_{0:k-1}, z_{1:k-1}) = p(x_k | x_{k-1})
3. a likelihood p(z_k | x_{0:k-1}, z_{1:k-1}) = p(z_k | x_k)
Problem Statement

Filtering/tracking: estimation of the state given the past and present measurements, p(x_k | z_{1:k})
Smoothing: estimation of the state given the past and some future measurements, p(x_k | z_{1:t}), t > k
Prediction: estimation of a future state given the measurements up to a past time, p(x_k | z_{1:t}), t < k
Problem Statement

Toy example: tracking the white car

state x_k: position + velocity
evolution model: the car evolves at constant velocity
observation z_k: the detected white cars
observation model: the tracked car should be one of the detected cars

p(x_k | z_{1:k}): the current position of the white car, knowing all previous and current detected white cars

Elise Arnaud (elise.arnaud@inrialpes.fr), Sequential Monte Carlo Methods
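The toy model above can be written as a linear-Gaussian state-space model. A minimal simulation sketch in Python, assuming a 1-D position for simplicity; the matrices F, Q, H, R and all numeric values are illustrative assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# State x_k = [position, velocity]; constant-velocity evolution model:
#   x_k = F x_{k-1} + v_k,  v_k ~ N(0, Q)
dt = 1.0
F = np.array([[1.0, dt],
              [0.0, 1.0]])
Q = 0.1 * np.eye(2)          # process noise covariance (assumed)

# Observation z_k = H x_k + w_k: a noisy position detection
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])        # measurement noise variance (assumed)

def simulate(n_steps, x0):
    """Draw a state trajectory and its observations from the model."""
    xs, zs = [x0], []
    for _ in range(n_steps):
        x = F @ xs[-1] + rng.multivariate_normal(np.zeros(2), Q)
        z = H @ x + rng.multivariate_normal(np.zeros(1), R)
        xs.append(x)
        zs.append(z)
    return np.array(xs), np.array(zs)

xs, zs = simulate(20, np.array([0.0, 1.0]))
```

Any model of this linear-Gaussian form admits the exact Kalman filter recursion discussed on the following slides.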
Problem Statement

Objective: sequential estimation of the filtering distribution p(x_k | z_{1:k}), i.e. estimation of p(x_k | z_{1:k}) knowing p(x_{k-1} | z_{1:k-1})

prediction: p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}

update: p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / ∫ p(z_k | x_k) p(x_k | z_{1:k-1}) dx_k

When the filtering and predictive laws are Gaussian, analytic expressions of the means and covariances follow from the equations of the optimal Bayesian solution: the Kalman filter [Kalman 60] [Anderson 79], an optimal propagation of the Gaussian probability density p(x_k | z_{1:k}).
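One predict/update cycle of the Kalman filter can be sketched as follows; the matrices below are illustrative assumptions for a 1-D constant-velocity model, not values from the slides:

```python
import numpy as np

# Kalman filter for the linear-Gaussian model
#   x_k = F x_{k-1} + v_k,  v_k ~ N(0, Q)
#   z_k = H x_k + w_k,      w_k ~ N(0, R)

def kalman_step(m, P, z, F, Q, H, R):
    """One predict/update cycle: returns the new posterior mean and covariance."""
    # predict: p(x_k | z_{1:k-1}) = N(m_pred, P_pred)
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q
    # update: p(x_k | z_{1:k}) = N(m_new, P_new)
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    m_new = m_pred + K @ (z - H @ m_pred)
    P_new = (np.eye(len(m)) - K @ H) @ P_pred
    return m_new, P_new

# toy constant-velocity run (all values assumed)
F = np.array([[1.0, 1.0], [0.0, 1.0]])
Q = 0.1 * np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
m, P = np.zeros(2), np.eye(2)
for z in [np.array([1.1]), np.array([1.9]), np.array([3.2])]:
    m, P = kalman_step(m, P, z, F, Q, H, R)
```

The recursion implements exactly the prediction and update equations above, specialized to Gaussians.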
Monte Carlo approximation of the filtering distribution by a weighted particle set:

p(x_k | z_{1:k}) ≈ Σ_{i=1}^{N} w_k^(i) δ_{x_k^(i)}(x_k)
exploration of the state space
evaluation of the particles' quality with respect to the observations
mutation/selection of the particles
The simplest particle filter iterates, for i = 1, ..., N:

Sampling (exploration): draw x_k^(i) ~ p(x_k | x_{k-1}^(i))
Weighting (evaluation): set w_k^(i) ∝ p(z_k | x_k^(i)), normalized so that Σ_i w_k^(i) = 1
Resampling (mutation-selection): duplicate particles with high weights and discard those with low weights
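The three steps above can be sketched as a minimal bootstrap-style particle filter for a 1-D constant-velocity model; all model matrices and noise levels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bootstrap-style particle filter: sampling (exploration),
# weighting (evaluation), resampling (mutation-selection).
F = np.array([[1.0, 1.0], [0.0, 1.0]])  # constant-velocity evolution (assumed)
q_std, r_std = 0.3, 0.7                 # process/measurement noise (assumed)

def bootstrap_step(particles, z):
    n = len(particles)
    # 1. sampling: propagate each particle through the evolution model
    particles = particles @ F.T + q_std * rng.standard_normal((n, 2))
    # 2. weighting: Gaussian likelihood of the observed position
    w = np.exp(-0.5 * ((z - particles[:, 0]) / r_std) ** 2)
    w /= w.sum()                        # normalize so the weights sum to 1
    # 3. resampling: multiply high-weight particles, discard low-weight ones
    idx = rng.choice(n, size=n, p=w)
    return particles[idx]

particles = rng.standard_normal((500, 2))
for z in [0.9, 2.1, 2.8, 4.2]:
    particles = bootstrap_step(particles, z)
estimate = particles[:, 0].mean()       # posterior mean position
```

After resampling, all particles carry equal weight, so the posterior mean is just the particle average.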
This is a very simple algorithm, but the exploration of the state space does not take the measurements into account, which leads to sample impoverishment and wrong exploration.
An alternative is to sample the particles from an importance function that takes the current observation into account. The optimal importance function p(x_k | x_{k-1}^(i), z_k) minimizes the variance of the weights, but using it is rarely possible in practice; it is tractable for partially linear Gaussian models (linear measurement models, additive Gaussian noises) [Doucet 00] [Arnaud 05] [Arnaud 06].
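For a linear measurement model with additive Gaussian noises, the optimal importance density p(x_k | x_{k-1}, z_k) has a closed Gaussian form by standard Gaussian conjugacy. A sketch under those assumptions; all numeric values are illustrative:

```python
import numpy as np

# Model with additive Gaussian noises and a linear measurement model:
#   x_k = f(x_{k-1}) + v_k, v_k ~ N(0, Q);   z_k = H x_k + w_k, w_k ~ N(0, R)
# Then p(x_k | x_{k-1}, z_k) = N(mean, cov) with the closed form below.

def optimal_importance(fx_prev, z, Q, H, R):
    """Mean and covariance of the optimal importance density."""
    Qi = np.linalg.inv(Q)
    Ri = np.linalg.inv(R)
    cov = np.linalg.inv(Qi + H.T @ Ri @ H)
    mean = cov @ (Qi @ fx_prev + H.T @ Ri @ z)
    return mean, cov

# toy check with the constant-velocity model (assumed values)
Q = 0.1 * np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
mean, cov = optimal_importance(np.array([2.0, 1.0]), np.array([2.4]), Q, H, R)
```

The mean lies between the predicted position (2.0) and the observation (2.4), which is how the observation corrects the exploration.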
Resampling

Resampling is a key step of Sequential Monte Carlo approaches. Basic idea of the resampling at time k:
multiply particles x_k^(i) with high weights w_k^(i) and discard those with small weights
give the resampled particles an equal weight
keep the number of particles fixed
Resampling

Resampling schemes:
Multinomial resampling: copy x_k^(i) n_i times, where (n_1, ..., n_N) ~ M(N; w_k^(1), ..., w_k^(N)) [Doucet 00]
Residual resampling [Liu 98]
Stratified resampling [Carpenter 99]
Branching resampling [Crisan 99]
Mixture resampling [Vermaak 03]
...
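Two of the schemes above, multinomial and stratified resampling, can be sketched as index-based routines (a minimal sketch, as commonly implemented):

```python
import numpy as np

rng = np.random.default_rng(0)

def multinomial_resample(weights):
    """Draw (n_1, ..., n_N) ~ Multinomial(N; w) implicitly, by sampling
    N particle indices with probabilities given by the weights."""
    n = len(weights)
    return rng.choice(n, size=n, p=weights)

def stratified_resample(weights):
    """One uniform draw per stratum [i/N, (i+1)/N): lower variance
    than multinomial resampling."""
    n = len(weights)
    positions = (np.arange(n) + rng.random(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

w = np.array([0.1, 0.2, 0.3, 0.4])
idx_m = multinomial_resample(w)
idx_s = stratified_resample(w)
```

Either routine returns the indices of the surviving particles; assigning each survivor weight 1/N keeps the particle count fixed, as required.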
Data fusion

Data fusion is one of the major advantages of particle filtering. It may be done through the likelihood: denoting z_k^l the measurements associated with the different modalities l = 1, ..., L, and p(z_k^l | x_k) their likelihoods, the global likelihood is

p(z_k | x_k) = ∏_{l=1}^{L} p(z_k^l | x_k).

Alternatively, some cues can be used in the sampling step and the others in the evaluation step.
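A minimal sketch of likelihood-based fusion, computed in log-space to avoid numerical underflow; the two Gaussian cue likelihoods below are hypothetical placeholders, not cues from the slides:

```python
import numpy as np

# Fusing L independent cues through the likelihood:
#   p(z_k | x_k) = prod_l p(z_k^l | x_k)
# which in log-space becomes a simple sum of per-cue log-likelihoods.

def fused_log_likelihood(x, cues):
    """Sum the per-cue log-likelihoods evaluated at state x."""
    return sum(loglik(x) for loglik in cues)

# hypothetical placeholder cues (unnormalized Gaussian log-likelihoods)
color_cue = lambda x: -0.5 * ((x - 1.0) / 0.5) ** 2   # e.g. a color-based cue
motion_cue = lambda x: -0.5 * ((x - 1.4) / 1.0) ** 2  # e.g. a motion-based cue

log_w = fused_log_likelihood(1.2, [color_cue, motion_cue])
```

In a particle filter, exponentiating and normalizing these fused log-likelihoods over the particle set yields the weights w_k^(i).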
Simple management of several tracking hypotheses, even in high-dimensional spaces
Constraint-free modeling:
enables the introduction of any a priori information
enables a simple fusion of different image cues
Multimodality, non-linearity and non-Gaussian noises are not a problem
Easy to implement
Conclusion

Advantages of particle filtering: easy to implement and to extend; robust to clutter and occlusions; plenty of theoretical results.
Problems of particle filtering: jitter of the final estimate; computational cost; short-term multimodality.
A good algorithm does not fix a weak model, and vice versa.
References

The book: Sequential Monte Carlo Methods in Practice, ed. Doucet, de Freitas, Gordon, Springer-Verlag, 2001
SMC website: http://www-sigproc.eng.cam.ac.uk/smc/
Surveys on SMC methods: [Arulampalam 02] [Doucet 00]
Convergence: [Crisan 02] [Del Moral 02]
Multi-target tracking: [Vermaak 05]
Target tracking: [Ristic 04]
Tracking in image sequences: [Isard 98] [Perez 04]
References

[Arulampalam 02] M.S. Arulampalam, S. Maskell, N. Gordon, T. Clapp. A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Transactions on Signal Processing, 50(2):174-188, 2002.
[Crisan 02] D. Crisan, A. Doucet. A survey of convergence results on particle filtering methods for practitioners. IEEE Transactions on Signal Processing, 50(3):736-746, 2002.
[Del Moral 02] P. Del Moral and L. Miclo. Asymptotic stability of nonlinear semigroups of Feynman-Kac type. Annales de la Faculté des Sciences de Toulouse, 2002.
[Doucet 01] A. Doucet, N. de Freitas, and N. Gordon, editors. Sequential Monte Carlo Methods in Practice. Springer-Verlag, Series Statistics for Engineering and Information Science, New York, 2001.
References

[Isard 98] M. Isard, A. Blake. Condensation: conditional density propagation for visual tracking. International Journal of Computer Vision, 29(1):5-28, 1998.
[Perez 04] P. Pérez, J. Vermaak, and A. Blake. Data fusion for visual tracking with particles. Proceedings of the IEEE (special issue on State Estimation), 92(3):495-513, 2004.
[Ristic 04] B. Ristic, S. Arulampalam, N. Gordon. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House Publishers, 2004.
[Vermaak 02] J. Vermaak, C. Andrieu, A. Doucet, and S.J. Godsill. Particle methods for Bayesian modeling and enhancement of speech signals. IEEE Transactions on Speech and Audio Processing, 10(3):173-185, 2002.