Continuous-time processes
Thomas Verdebout
Université Libre de Bruxelles
2015
1. A short introduction.
2. Basic probability review.
3. Martingales.
4. Markov chains.
5. Markov processes, Poisson processes.
5.1. Markov processes.
5.2. Poisson processes.
6. Brownian motions.
Markov processes
A Markov process is a continuous-time Markov chain:
Let S be a finite or countable set (indexed by i = 1, 2, ...).
Let (Xt)t≥0 be a SP with Xt : (Ω, A, P) → S for all t.
Definition: (Xt) is a homogeneous Markov process (HMP) on S if
P[Xt+s = j | Xs = i, (Xu)u<s] = P[Xt = j | X0 = i] for all t, s and all i, j.
Proof:
Let fi(s) := P[Wt > s | Xt = i] = P[W0 > s | X0 = i] (by homogeneity).
Then, for all s1, s2 > 0,
fi(s1 + s2) = P[W0 > s1 + s2 | X0 = i] = P[W0 > s1, Ws1 > s2 | X0 = i]
= P[Ws1 > s2 | W0 > s1, X0 = i] P[W0 > s1 | X0 = i]
= P[Ws1 > s2 | Xs1 = i] fi(s1) = fi(s1) fi(s2).
Assume that there exists s0 > 0 such that fi(s0) > 0 (if this is not the case, (i) holds). Then fi(s) > 0 for all s.
Now,
fi′(s) = lim h→0 [fi(s + h) − fi(s)]/h = fi(s) lim h→0 [fi(h) − fi(0)]/h = fi(s) fi′(0).
Writing fi′(0) =: −λi and integrating,
∫0^s fi′(u)/fi(u) du = ∫0^s (−λi) du = −λi s,
so that log fi(s) = −λi s, i.e., fi(s) = e^{−λi s}.
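The functional equation fi(s1 + s2) = fi(s1) fi(s2) and its exponential solution can be checked numerically. The sketch below (the rate λi = 1.5, the evaluation points and the Monte-Carlo size are arbitrary choices, not from the course) compares the empirical survival function of exponential holding times with both sides of the factorization.

```python
import numpy as np

# Monte-Carlo sketch: for exponential holding times W with rate lam,
# the survival function f(s) = P[W > s] satisfies f(s1+s2) = f(s1) f(s2).
rng = np.random.default_rng(0)
lam = 1.5                                     # hypothetical rate lambda_i
w = rng.exponential(1 / lam, size=1_000_000)  # holding times W

def surv(s):
    """Empirical survival function f_i(s) = P[W > s]."""
    return np.mean(w > s)

s1, s2 = 0.4, 0.7
lhs = surv(s1 + s2)        # f_i(s1 + s2)
rhs = surv(s1) * surv(s2)  # f_i(s1) f_i(s2)
print(lhs, rhs, np.exp(-lam * (s1 + s2)))  # all three agree
```

The three printed values coincide up to Monte-Carlo error, illustrating that memorylessness forces the exponential form.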
Theorem: Let i ∈ S. Then either
(i) fi(s) = 0 for all s > 0 (the process leaves state i instantaneously), or
(ii) there exists λi ∈ [0, ∞) such that fi(s) = e^{−λi s} for all s ≥ 0, that is, the holding time in state i is exponentially distributed with rate λi.
If (Xt) is a conservative HMP, we can determine its behaviour from the holding rates λi and the transitions of the embedded jump chain. Does this specify the process completely?
The answer:
Yes, provided that (Xt) is regular, that is, is such that
lim n→∞ Tn = ∞ a.s. (the jump times Tn do not accumulate in finite time).
Poisson processes
The embedded jump chain moves one step up at each jump: its transition matrix is

P =
( 0 1 0 0 ⋯ )
( 0 0 1 0 ⋯ )
( 0 0 0 1 ⋯ )
( ⋮ ⋮ ⋱ ⋱   )

and

P² =
( 0 0 1 0 ⋯ )
( 0 0 0 1 ⋯ )
( ⋮ ⋮ ⋱ ⋱   ).
Remarks: the jump time Tn has cdf
F_Tn(t) = P[Tn ≤ t] = 1 − Σi=0^{n−1} (λt)^i e^{−λt}/i!  if t ≥ 0,
and F_Tn(t) = 0 if t < 0.
(iii) For all t > 0, Nt ∼ P(λt). Indeed,
P[Nt ≤ k] = P[Tk+1 > t] = Σi=0^k (λt)^i e^{−λt}/i!,
so that
P[Nt = k] = P[Nt ≤ k] − P[Nt ≤ k − 1] = (λt)^k e^{−λt}/k!.
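Remark (iii) can be illustrated by simulation: build Nt from i.i.d. Exp(λ) interarrival times and compare the empirical law of Nt with the Poisson(λt) pmf. The rate, the horizon and k below are arbitrary illustrative choices.

```python
import numpy as np
from math import factorial, exp

# Sketch: N_t built from Exp(lambda) interarrivals is Poisson(lambda*t).
rng = np.random.default_rng(1)
lam, t, reps = 2.0, 3.0, 100_000
# 40 interarrivals suffice: P[N_t >= 40] is negligible for lam*t = 6.
inter = rng.exponential(1 / lam, size=(reps, 40))
arrival = np.cumsum(inter, axis=1)          # jump times T_1 < T_2 < ...
n_t = np.sum(arrival <= t, axis=1)          # N_t = #{n : T_n <= t}

k = 4
empirical = np.mean(n_t == k)               # empirical P[N_t = k]
theory = exp(-lam * t) * (lam * t) ** k / factorial(k)
print(empirical, theory)
```

The empirical frequency matches the Poisson(λt) probability up to Monte-Carlo error; the sample mean of Nt is likewise close to λt.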
How to check (ii)? Condition on Tn:
F_Tn+1(t) = 1 − P[Tn+1 > t] = 1 − ∫0^∞ P[Tn+1 > t | Tn = u] f_Tn(u) du
= 1 − ∫0^t P[Tn+1 > t | Tn = u] f_Tn(u) du − ∫t^∞ f_Tn(u) du,
since P[Tn+1 > t | Tn = u] = 1 for u > t.
This leads to increment probabilities of the form
e^{−λh} (λh)^k / k!.
Consider the successive survival times W1, W2, . . ., say.
Theorem: for all t, h and k,
P[Nt+h − Nt = k | Nu, 0 ≤ u ≤ t] = e^{−λh} (λh)^k / k!.
In particular,
P[k events in [t, t + h)] = 1 − λh + o(h) if k = 0,
                          = λh + o(h)     if k = 1,
                          = o(h)          if k ≥ 2.
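The three infinitesimal probabilities can be checked by direct arithmetic on the Poisson(λh) pmf; λ and h below are arbitrary small-step choices.

```python
from math import exp

# Sketch: for small h, the Poisson(lam*h) pmf gives
# P[0 events] ~ 1 - lam*h, P[1 event] ~ lam*h, P[>= 2 events] = o(h).
lam, h = 2.0, 0.01
p0 = exp(-lam * h)              # P[0 events in [t, t+h)]
p1 = lam * h * exp(-lam * h)    # P[1 event]
p2plus = 1 - p0 - p1            # P[>= 2 events]
print(p0, 1 - lam * h)          # both close to 0.98
print(p2plus / h)               # small: here ~ lam^2 * h / 2
```

The ratio p2plus/h is of order λ²h/2, confirming that two or more events in a short interval have probability o(h).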
The corresponding compound Poisson process is
Xt = 0 if Nt = 0, and Xt = Σk=1^{Nt} Yk if Nt ≥ 1.
This shows that if it does not charge high enough premiums, the company will
go bankrupt a.s.
1. A short introduction.
2. Basic probability review.
3. Martingales.
4. Markov chains.
5. Markov processes, Poisson processes.
6. Brownian motions.
6.1. Definition.
6.2. Markov property, martingales.
6.3. Gaussian processes and Brownian bridges.
A heuristic introduction
Consider a symmetric RW (starting from 0) Xn = Σi=1^n Yi,
where the Yi's are i.i.d. with P[Yi = 1] = P[Yi = −1] = 1/2.
Now, assume that, at each Δt units of time, we make a step
with length Δx. Then, writing nt = ⌊t/(Δt)⌋,
Xt = (Δx) Σi=1^{nt} Yi,
so that
Var[Xt] = (Δx)² Var[Σi=1^{nt} Yi] = (Δx)² Σi=1^{nt} Var[Yi] = (Δx)² nt.
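The scaling above suggests taking Δx = √Δt, so that Var[Xt] = (Δx)² nt ≈ t in the limit. A simulation sketch (step size and Monte-Carlo size are arbitrary choices):

```python
import numpy as np

# Sketch: scaled RW with dx = sqrt(dt) has Var[X_t] = dx^2 * n_t = t,
# matching the Brownian limit.
rng = np.random.default_rng(3)
dt = 0.01
dx = np.sqrt(dt)       # step length chosen so that dx^2 / dt = 1
t = 2.0
n_t = int(t / dt)
reps = 20_000
steps = rng.choice([-1.0, 1.0], size=(reps, n_t))  # the Y_i
x_t = dx * steps.sum(axis=1)                       # X_t for each replicate
print(x_t.mean(), x_t.var())                       # ~ 0 and ~ t = 2
```

The empirical mean and variance of Xt match 0 and t, as the heuristic predicts.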
What are the properties of the limiting process (Xt)t≥0?
Xt = lim Δt→0 (Δx) Σi=1^{nt} Yi.
X0 = 0.
For each RW, the increments are stationary (that is, their
distribution in [k, k + n] does not depend on k). This
should also hold in the limit, i.e.
∀ s, t > 0, Xt+s − Xt =D Xs − X0.
This leads to the following definition:
Definition: the SP (Xt)t≥0 is a Brownian motion if
(a) X0 = 0;
(b) its increments are independent;
(c) its increments are stationary: ∀ s, t > 0, Xt+s − Xt =D Xs − X0, with Xt ∼ N(0, σ²t) (the BM is standard if σ = 1).
Given Xt = x, the position Xt+s has conditional density y ↦ (1/√(2πs)) e^{−(y−x)²/(2s)}, so that
P[Xt+s ∈ B | Xt = x] = ∫B (1/√(2πs)) e^{−(y−x)²/(2s)} dy.
BM and Martingales
Continuous-time martingales are defined in a similar way as for
discrete-time ones. More precisely:
The SP (Mt)t≥0 is a martingale w.r.t. the filtration (At)t≥0 if, for all s ≤ t, Mt is At-measurable, integrable, and E[Mt | As] = Ms.
Proposition: if (Xt) is a standard BM, then (a) (Xt)t≥0, (b) (Xt² − t)t≥0 and (c) (e^{θXt − θ²t/2})t≥0 are martingales.
(c)
E[e^{θXt − θ²t/2} | As] = e^{θXs − θ²t/2} E[e^{θ(Xt − Xs)} | As]
= e^{θXs − θ²t/2} E[e^{θ(Xt − Xs)}] = e^{θXs − θ²t/2} E[e^{θ√(t−s) Z}],
where Z ∼ N(0, 1). Now,
E[e^{θ√(t−s) Z}] = ∫R e^{θ√(t−s) z} (1/√(2π)) e^{−z²/2} dz
= e^{θ²(t−s)/2} ∫R (1/√(2π)) e^{−(z − θ√(t−s))²/2} dz = e^{θ²(t−s)/2},
so that E[e^{θXt − θ²t/2} | As] = e^{θXs − θ²t/2} e^{θ²(t−s)/2} = e^{θXs − θ²s/2}.
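Taking s = 0 in the computation above gives E[e^{θXt − θ²t/2}] = 1 for every t, which is easy to check by simulation (θ, t and the Monte-Carlo size below are arbitrary choices):

```python
import numpy as np

# Sketch: for a standard BM, E[exp(theta*X_t - theta^2 * t / 2)] = 1,
# using X_t ~ N(0, t).
rng = np.random.default_rng(4)
theta, t, reps = 0.5, 2.0, 1_000_000
x_t = np.sqrt(t) * rng.standard_normal(reps)       # X_t ~ N(0, t)
m = np.mean(np.exp(theta * x_t - theta ** 2 * t / 2))
print(m)                                           # ~ 1
```

The sample mean is 1 up to Monte-Carlo error, consistent with the martingale property.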
The optional stopping theorem (OST) still holds in this
continuous-time setup, yielding results such as the following:
Proposition: let (Xt) be a standard BM. Fix a, b > 0. Define
Tab := inf{t > 0 : Xt ∉ (−a, b)}. Then
(i) E[X_Tab] = 0,
(ii) P[X_Tab = −a] = b/(a + b), P[X_Tab = b] = a/(a + b), and
(iii) E[Tab] = ab.
Proof: (i) this follows from the OST and the fact that (Xt) is a martingale.
(ii) 0 = E[X_Tab] = (−a) P[X_Tab = −a] + b (1 − P[X_Tab = −a]).
Solving for P[X_Tab = −a] yields the result.
(iii) The OST and the fact that (Xt² − t) is a martingale imply that
E[X²_Tab − Tab] = E[X0² − 0] = 0, which yields
E[Tab] = E[X²_Tab] = (−a)² b/(a + b) + b² a/(a + b) = ab.
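The exit probabilities and E[Tab] can be checked by simulating BM paths on a fine time grid (a, b, the step dt and the number of paths below are arbitrary choices; the grid discretization introduces a small bias of order √dt):

```python
import numpy as np

# Sketch: simulate BM until exit from (-a, b); compare with
# P[X_T = -a] = b/(a+b) and E[T] = a*b from the proposition.
rng = np.random.default_rng(5)
a, b, dt, reps = 1.0, 2.0, 1e-3, 5000
x = np.zeros(reps)               # current positions
t_exit = np.zeros(reps)          # recorded exit times
alive = np.ones(reps, dtype=bool)
t = 0.0
while alive.any():
    t += dt
    x[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
    exited = alive & ((x <= -a) | (x >= b))
    t_exit[exited] = t
    alive &= ~exited             # freeze exited paths
p_minus_a = np.mean(x <= -a)
mean_t = t_exit.mean()
print(p_minus_a, b / (a + b))    # ~ 2/3
print(mean_t, a * b)             # ~ 2
```

Both estimates agree with (ii) and (iii) up to Monte-Carlo and discretization error.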
The martingale (e^{θXt − θ²t/2})t≥0 can be combined with the OST in the same way.
For 0 = t0 < t1 < · · · < tk, the increments satisfy
(Xt1 − X0, Xt2 − Xt1, . . . , Xtk − Xtk−1)′ ∼ N(0, Σ),
where Σ = (Σij) is diagonal with Σii = ti − ti−1.
Hence,
Σi=1^k vi Xti = v′ (Xt1, . . . , Xtk)′ = v′ L (Xt1 − X0, Xt2 − Xt1, . . . , Xtk − Xtk−1)′,
where L is the lower-triangular matrix of ones

L =
( 1 0 ⋯ 0 )
( 1 1 ⋯ 0 )
( ⋮ ⋮ ⋱    )
( 1 1 ⋯ 1 ).

Any such linear combination is therefore Gaussian, so that (Xt) is a Gaussian process.
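The construction above also gives the autocovariance Cov[Xs, Xt] = min(s, t), which a simulation sketch can verify (the grid of times and the Monte-Carlo size are arbitrary choices):

```python
import numpy as np

# Sketch: build (X_{t_1}, ..., X_{t_k}) from independent N(0, t_i - t_{i-1})
# increments and check Cov[X_s, X_t] = min(s, t).
rng = np.random.default_rng(6)
times = np.array([0.2, 0.5, 1.1, 1.7])
reps = 200_000
incr_sd = np.sqrt(np.diff(np.concatenate(([0.0], times))))   # sqrt(t_i - t_{i-1})
incr = incr_sd * rng.standard_normal((reps, len(times)))     # independent increments
x = np.cumsum(incr, axis=1)          # columns: X_{t_1}, ..., X_{t_k}
emp = np.mean(x[:, 1] * x[:, 3])     # empirical Cov[X_{0.5}, X_{1.7}]
print(emp)                           # ~ min(0.5, 1.7) = 0.5
```

The empirical covariance matches min(s, t), confirming the Gaussian-process description of the BM.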
Brownian bridges
Let (Xt)t≥0 be a BM.
Definition: if (Xt) is a BM, (Xt − tX1)0≤t≤1 is a Brownian
bridge.
Alternatively, it can be defined as a Gaussian process (over
(0, 1)) with mean function t ↦ E[Xt] = 0 and autocovariance
function (s, t) ↦ Cov[Xs, Xt] = min(s, t)(1 − max(s, t))
(exercise).
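The equivalence of the two definitions can be illustrated numerically: build Bt = Xt − tX1 from a BM on a grid and check the covariance formula (grid resolution, the pair (s, t) = (0.3, 0.7) and the Monte-Carlo size are arbitrary choices):

```python
import numpy as np

# Sketch: B_t = X_t - t*X_1 has Cov[B_s, B_t] = min(s,t)(1 - max(s,t)).
rng = np.random.default_rng(7)
n_grid, reps = 100, 100_000
dt = 1.0 / n_grid
incr = np.sqrt(dt) * rng.standard_normal((reps, n_grid))
x = np.cumsum(incr, axis=1)              # X_{1/n}, ..., X_1 on the grid
grid = np.arange(1, n_grid + 1) * dt
bridge = x - grid * x[:, [-1]]           # B_t = X_t - t * X_1
s_idx, t_idx = 29, 69                    # grid points s = 0.3, t = 0.7
emp = np.mean(bridge[:, s_idx] * bridge[:, t_idx])
theory = 0.3 * (1 - 0.7)                 # min(s,t)(1 - max(s,t)) = 0.09
print(emp, theory)
```

Note that B1 = 0 by construction: the bridge is pinned at both endpoints.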
Application:
Let X1, . . . , Xn be i.i.d. with cdf F.
Let Fn(x) := (1/n) Σi=1^n I[Xi ≤ x] be the empirical cdf.
Then, for each x, Fn(x) → F(x) a.s. (by the SLLN).
Assume that X1, . . . , Xn are i.i.d. Unif(0, 1)
(F(x) = x I[x ∈ [0,1]] + I[x > 1]).
Let Un(x) := (1/√n) Σi=1^n (I[Xi ≤ x] − x), x ∈ [0, 1].
Then it can be shown that, as n → ∞,
(Un(x))x∈[0,1] converges in distribution to a Brownian bridge.
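A one-point check of this convergence: at a fixed x, Var[Un(x)] = x(1 − x), the Brownian-bridge variance. The sample size n, the point x = 0.3 and the Monte-Carlo size below are arbitrary choices:

```python
import numpy as np

# Sketch: U_n(x) = n^{-1/2} * sum_i (1[X_i <= x] - x) for Unif(0,1) samples
# has mean 0 and variance x(1 - x), matching the Brownian bridge at x.
rng = np.random.default_rng(8)
n, reps, x0 = 200, 50_000, 0.3
samples = rng.random((reps, n))                           # n Unif(0,1) draws per replicate
u_n = (np.sum(samples <= x0, axis=1) - n * x0) / np.sqrt(n)
print(u_n.mean(), u_n.var(), x0 * (1 - x0))               # ~ 0, ~ 0.21, 0.21
```

The empirical variance matches x(1 − x) = min(x, x)(1 − max(x, x)), the diagonal of the Brownian-bridge autocovariance.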