Chapter 4
Multivariate Distributions (k ≥ 2)

All the results derived for the bivariate case can be generalized to n RVs.
RS – 4 – Multivariate Distributions
Properties of a joint pmf $p(x_1,\ldots,x_n)$ (discrete case):

1. $0 \le p(x_1,\ldots,x_n) \le 1$
2. $\sum_{x_1}\cdots\sum_{x_n} p(x_1,\ldots,x_n) = 1$
3. $P[(X_1,\ldots,X_n) \in A] = \sum_{(x_1,\ldots,x_n)\in A} p(x_1,\ldots,x_n)$

Properties of a joint pdf $f(x_1,\ldots,x_n)$ (continuous case):

1. $f(x_1,\ldots,x_n) \ge 0$
2. $\int\cdots\int f(x_1,\ldots,x_n)\,dx_1\cdots dx_n = 1$
3. $P[(X_1,\ldots,X_n) \in A] = \int\cdots\int_A f(x_1,\ldots,x_n)\,dx_1\cdots dx_n$
The multinomial distribution:

$p(x_1,\ldots,x_k) = \frac{n!}{x_1!\,x_2!\cdots x_k!}\, p_1^{x_1} p_2^{x_2}\cdots p_k^{x_k}, \qquad \frac{n!}{x_1!\,x_2!\cdots x_k!} = \binom{n}{x_1\ x_2\ \cdots\ x_k}$
$p(x,y,z) = \frac{4!}{x!\,y!\,z!}\,(0.30)^x (0.50)^y (0.20)^z, \qquad x + y + z = 4$
Table: p(x, y, z), columns indexed by z

x y | z=0    z=1    z=2    z=3    z=4
0 0 | 0      0      0      0      0.0016
0 1 | 0      0      0      0.0160 0
0 2 | 0      0      0.0600 0      0
0 3 | 0      0.1000 0      0      0
0 4 | 0.0625 0      0      0      0
1 0 | 0      0      0      0.0096 0
1 1 | 0      0      0.0720 0      0
1 2 | 0      0.1800 0      0      0
1 3 | 0.1500 0      0      0      0
1 4 | 0      0      0      0      0
2 0 | 0      0      0.0216 0      0
2 1 | 0      0.1080 0      0      0
2 2 | 0.1350 0      0      0      0
2 3 | 0      0      0      0      0
2 4 | 0      0      0      0      0
3 0 | 0      0.0216 0      0      0
3 1 | 0.0540 0      0      0      0
3 2 | 0      0      0      0      0
3 3 | 0      0      0      0      0
3 4 | 0      0      0      0      0
4 0 | 0.0081 0      0      0      0
4 1 | 0      0      0      0      0
4 2 | 0      0      0      0      0
4 3 | 0      0      0      0      0
4 4 | 0      0      0      0      0

(Cells with x + y + z ≠ 4 have probability 0.)
From the table, P[X + Y ≥ Z] is obtained by summing p(x, y, z) over all cells with x + y ≥ z. Since z = 4 − x − y, the condition x + y ≥ z is equivalent to z ≤ 2, so

$P[X + Y \ge Z] = 1 - p(0,0,4) - p(1,0,3) - p(0,1,3) = 1 - 0.0016 - 0.0096 - 0.0160 = 0.9728$
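The table entries and the probability above are easy to check numerically; here is a minimal sketch in Python (function and variable names are mine):

```python
from math import factorial

# Trinomial pmf: p(x, y, z) = 4!/(x! y! z!) * 0.30^x * 0.50^y * 0.20^z with x+y+z = 4
def p(x, y, z, n=4, px=0.30, py=0.50, pz=0.20):
    if x + y + z != n:
        return 0.0
    return factorial(n) / (factorial(x) * factorial(y) * factorial(z)) * px**x * py**y * pz**z

# Spot-check two table entries
print(round(p(0, 0, 4), 4))   # 0.0016
print(round(p(2, 2, 0), 4))   # 0.135

# P[X + Y >= Z]: sum over all cells of the table with x + y >= z
prob = sum(p(x, y, 4 - x - y)
           for x in range(5) for y in range(5 - x)
           if x + y >= 4 - x - y)
print(round(prob, 4))         # 0.9728
```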
Recall the univariate normal density,

$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$

and the bivariate normal distribution,

$f(x,y) = \frac{1}{2\pi\,\sigma_x\sigma_y\sqrt{1-\rho^2}}\, e^{-\frac{1}{2(1-\rho^2)}\left[\left(\frac{x-\mu_x}{\sigma_x}\right)^2 - 2\rho\left(\frac{x-\mu_x}{\sigma_x}\right)\left(\frac{y-\mu_y}{\sigma_y}\right) + \left(\frac{y-\mu_y}{\sigma_y}\right)^2\right]}$
These generalize to the multivariate normal distribution:

$f(x_1,\ldots,x_k) = f(\mathbf{x}) = \frac{1}{(2\pi)^{k/2}\,|\Sigma|^{1/2}}\, e^{-\frac{1}{2}(\mathbf{x}-\boldsymbol{\mu})'\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})}$

where

$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix}, \qquad \boldsymbol{\mu} = \begin{pmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_k \end{pmatrix}, \qquad \Sigma = \begin{pmatrix} \sigma_{11} & \sigma_{12} & \cdots & \sigma_{1k} \\ \sigma_{12} & \sigma_{22} & \cdots & \sigma_{2k} \\ \vdots & & & \vdots \\ \sigma_{1k} & \sigma_{2k} & \cdots & \sigma_{kk} \end{pmatrix}$
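As a sanity check, the density formula can be coded directly from the definition; with k = 1 it collapses to the univariate normal (the function name is my own):

```python
import numpy as np

# Multivariate normal density at a point x (length-k array),
# with mean vector mu and covariance matrix Sigma
def mvn_pdf(x, mu, Sigma):
    k = len(mu)
    d = x - mu
    quad = d @ np.linalg.inv(Sigma) @ d                   # (x-mu)' Sigma^{-1} (x-mu)
    const = (2 * np.pi) ** (k / 2) * np.linalg.det(Sigma) ** 0.5
    return np.exp(-0.5 * quad) / const

# With k = 1 and Sigma = [[sigma^2]], this is the univariate normal density
x0, mu0, s = 1.0, 0.0, 2.0
uni = np.exp(-0.5 * ((x0 - mu0) / s) ** 2) / (s * np.sqrt(2 * np.pi))
multi = mvn_pdf(np.array([x0]), np.array([mu0]), np.array([[s ** 2]]))
print(np.isclose(uni, multi))   # True
```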
Marginal distributions: the joint marginal pmf of $X_1,\ldots,X_q$ is obtained by summing out the remaining variables,

$p_{1\cdots q}(x_1,\ldots,x_q) = \sum_{x_{q+1}}\cdots\sum_{x_n} p(x_1,\ldots,x_n)$
Independence: $X_1,\ldots,X_q$ and $X_{q+1},\ldots,X_k$ are independent sets of RVs if and only if

$f(x_1,\ldots,x_k) = f_{1\cdots q}(x_1,\ldots,x_q)\, f_{q+1\cdots k}(x_{q+1},\ldots,x_k)$

and $X_1,\ldots,X_k$ are mutually independent if and only if

$f(x_1,\ldots,x_k) = f_1(x_1)\, f_2(x_2)\cdots f_k(x_k)$
Example: let X, Y, Z have joint density

$f(x,y,z) = \begin{cases} K(x^2 + yz) & 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1 \\ 0 & \text{otherwise} \end{cases}$

Find K. The density must integrate to 1:

$\int_0^1\!\int_0^1\!\int_0^1 K(x^2 + yz)\,dx\,dy\,dz = \int_0^1\!\int_0^1 K\left[\frac{x^3}{3} + xyz\right]_{x=0}^{x=1} dy\,dz = \int_0^1\!\int_0^1 K\left(\frac{1}{3} + yz\right) dy\,dz$

$= \int_0^1 K\left[\frac{y}{3} + \frac{y^2}{2}z\right]_{y=0}^{y=1} dz = \int_0^1 K\left(\frac{1}{3} + \frac{z}{2}\right) dz = K\left[\frac{z}{3} + \frac{z^2}{4}\right]_0^1 = K\left(\frac{1}{3} + \frac{1}{4}\right) = \frac{7}{12}K = 1$

if $K = \frac{12}{7}$.
The marginal distribution of X:

$f_1(x) = \int_0^1\!\int_0^1 f(x,y,z)\,dy\,dz = \frac{12}{7}\int_0^1\!\int_0^1 (x^2 + yz)\,dy\,dz = \frac{12}{7}\int_0^1 \left[x^2 y + \frac{y^2}{2}z\right]_{y=0}^{y=1} dz$

$= \frac{12}{7}\int_0^1 \left(x^2 + \frac{z}{2}\right) dz = \frac{12}{7}\left[x^2 z + \frac{z^2}{4}\right]_{z=0}^{z=1} = \frac{12}{7}\left(x^2 + \frac{1}{4}\right) \quad \text{for } 0 \le x \le 1$

The joint marginal distribution of X and Y:

$f_{12}(x,y) = \int_0^1 f(x,y,z)\,dz = \frac{12}{7}\left[x^2 z + y\frac{z^2}{2}\right]_{z=0}^{z=1} = \frac{12}{7}\left(x^2 + \frac{1}{2}y\right) \quad \text{for } 0 \le x \le 1,\ 0 \le y \le 1$
The conditional distribution of Z given X = x, Y = y:

$f_{3|12}(z \mid x, y) = \frac{f(x,y,z)}{f_{12}(x,y)} = \frac{\frac{12}{7}(x^2 + yz)}{\frac{12}{7}\left(x^2 + \frac{1}{2}y\right)} = \frac{x^2 + yz}{x^2 + \frac{1}{2}y} \quad \text{for } 0 \le z \le 1$
Using the marginal $f_1(x) = \frac{12}{7}\left(x^2 + \frac{1}{4}\right)$ for $0 \le x \le 1$, the conditional distribution of Y and Z given X = x is

$f_{23|1}(y, z \mid x) = \frac{f(x,y,z)}{f_1(x)} = \frac{\frac{12}{7}(x^2 + yz)}{\frac{12}{7}\left(x^2 + \frac{1}{4}\right)} = \frac{x^2 + yz}{x^2 + \frac{1}{4}} \quad \text{for } 0 \le y \le 1,\ 0 \le z \le 1$
Definition: Expectation
Let X1, X2, …, Xn denote n jointly distributed random variables with joint density function f(x1, x2, …, xn). Then

$E[g(X_1,\ldots,X_n)] = \int\cdots\int g(x_1,\ldots,x_n)\, f(x_1,\ldots,x_n)\, dx_1\cdots dx_n$
Example: let X, Y, Z have joint density

$f(x,y,z) = \begin{cases} \frac{12}{7}(x^2 + yz) & 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1 \\ 0 & \text{otherwise} \end{cases}$

Determine E[XYZ].

Solution:

$E[XYZ] = \int_0^1\!\int_0^1\!\int_0^1 xyz \cdot \frac{12}{7}(x^2 + yz)\,dx\,dy\,dz = \frac{12}{7}\int_0^1\!\int_0^1\!\int_0^1 (x^3 yz + x y^2 z^2)\,dx\,dy\,dz$

$= \frac{12}{7}\int_0^1\!\int_0^1 \left[\frac{x^4}{4}yz + \frac{x^2}{2}y^2 z^2\right]_{x=0}^{x=1} dy\,dz = \frac{12}{7}\int_0^1\!\int_0^1 \left(\frac{yz}{4} + \frac{y^2 z^2}{2}\right) dy\,dz = \frac{3}{7}\int_0^1\!\int_0^1 (yz + 2y^2 z^2)\,dy\,dz$

$= \frac{3}{7}\int_0^1 \left[\frac{y^2}{2}z + \frac{2y^3}{3}z^2\right]_{y=0}^{y=1} dz = \frac{3}{7}\int_0^1 \left(\frac{z}{2} + \frac{2z^2}{3}\right) dz = \frac{3}{7}\left[\frac{z^2}{4} + \frac{2z^3}{9}\right]_0^1 = \frac{3}{7}\left(\frac{1}{4} + \frac{2}{9}\right) = \frac{3}{7}\cdot\frac{17}{36} = \frac{17}{84}$
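The same midpoint-rule integration verifies E[XYZ] = 17/84 ≈ 0.2024:

```python
import numpy as np

# Midpoint-rule check of E[XYZ] for f(x,y,z) = (12/7)(x^2 + yz) on the unit cube
n = 100
g = (np.arange(n) + 0.5) / n
x, y, z = np.meshgrid(g, g, g, indexing="ij")
e_xyz = np.sum(x * y * z * (12 / 7) * (x**2 + y * z)) / n**3
print(round(e_xyz, 4), round(17 / 84, 4))    # 0.2024 0.2024
```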
Thus you can calculate E[Xi] either from the joint distribution of
X1, … , Xn or the marginal distribution of Xi.
Proof:
$E[X_i] = \int\cdots\int x_i\, f(x_1,\ldots,x_n)\,dx_1\cdots dx_n$
$= \int x_i \left[\int\cdots\int f(x_1,\ldots,x_n)\,dx_1\cdots dx_{i-1}\,dx_{i+1}\cdots dx_n\right] dx_i$
$= \int x_i\, f_i(x_i)\,dx_i$

2. $E[a_1 X_1 + \cdots + a_n X_n] = a_1 E[X_1] + \cdots + a_n E[X_n]$

This property is called the linearity property.

Proof:
$E[a_1 X_1 + \cdots + a_n X_n] = \int\cdots\int (a_1 x_1 + \cdots + a_n x_n)\, f(x_1,\ldots,x_n)\,dx_1\cdots dx_n$
$= a_1 \int\cdots\int x_1\, f(x_1,\ldots,x_n)\,dx_1\cdots dx_n + \cdots + a_n \int\cdots\int x_n\, f(x_1,\ldots,x_n)\,dx_1\cdots dx_n$
$= a_1 E[X_1] + \cdots + a_n E[X_n]$
3. If $X_1,\ldots,X_q$ and $X_{q+1},\ldots,X_k$ are independent, then

$E\left[g(X_1,\ldots,X_q)\, h(X_{q+1},\ldots,X_k)\right] = E\left[g(X_1,\ldots,X_q)\right]\, E\left[h(X_{q+1},\ldots,X_k)\right]$

In the simplest case, if X and Y are independent, $E[XY] = E[X]\,E[Y]$.

Proof:
$E\left[g\, h\right] = \int\cdots\int g(x_1,\ldots,x_q)\, h(x_{q+1},\ldots,x_k)\, f_1(x_1,\ldots,x_q)\, f_2(x_{q+1},\ldots,x_k)\,dx_1\cdots dx_q\,dx_{q+1}\cdots dx_k$
$= \left[\int\cdots\int g(x_1,\ldots,x_q)\, f_1(x_1,\ldots,x_q)\,dx_1\cdots dx_q\right]\left[\int\cdots\int h(x_{q+1},\ldots,x_k)\, f_2(x_{q+1},\ldots,x_k)\,dx_{q+1}\cdots dx_k\right]$
$= E\left[g(X_1,\ldots,X_q)\right]\, E\left[h(X_{q+1},\ldots,X_k)\right]$
4. $\text{Var}[X + Y] = \text{Var}[X] + 2\,\text{Cov}[X,Y] + \text{Var}[Y]$

Proof:
$\text{Var}[X + Y] = E\left[(X + Y - \mu_{X+Y})^2\right]$, where $\mu_{X+Y} = E[X + Y] = \mu_X + \mu_Y$.

Thus,
$\text{Var}[X + Y] = E\left[\left((X - \mu_X) + (Y - \mu_Y)\right)^2\right]$
$= E\left[(X - \mu_X)^2 + 2(X - \mu_X)(Y - \mu_Y) + (Y - \mu_Y)^2\right]$
$= \text{Var}[X] + 2\,\text{Cov}[X,Y] + \text{Var}[Y]$
Note: if X and Y are independent, then

$\text{Cov}[X,Y] = E[(X - \mu_X)(Y - \mu_Y)] = E[X - \mu_X]\, E[Y - \mu_Y] = E[X - \mu_X]\cdot 0 = 0$

The correlation coefficient is

$\rho_{XY} = \frac{\text{Cov}[X,Y]}{\sqrt{\text{Var}[X]}\sqrt{\text{Var}[Y]}} = \frac{\text{Cov}[X,Y]}{\sigma_X \sigma_Y}$

Thus $\text{Cov}[X,Y] = \rho_{XY}\,\sigma_X \sigma_Y$, and

$\text{Var}[X + Y] = \sigma_X^2 + \sigma_Y^2 + 2\rho_{XY}\,\sigma_X \sigma_Y = \sigma_X^2 + \sigma_Y^2$

if X and Y are independent.
The converse is not necessarily true. That is, XY = 0 does not imply
that X and Y are independent.
Example:

        x = 6   x = 8   x = 10   fY(y)
y = 1    .2      0       .2       .4
y = 2    0       .2      0        .2
y = 3    .2      0       .2       .4
fX(x)    .4      .2      .4       1

E(X) = 8, E(Y) = 2, E(XY) = 16, so Cov(X,Y) = 16 − 8·2 = 0.
However, P(X=6, Y=2) = 0 ≠ P(X=6)·P(Y=2) = .4 × .2 = .08, so X and Y are not independent.
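The zero-covariance-but-dependent example is easy to verify from the table (the array layout is mine: rows are y = 1, 2, 3; columns are x = 6, 8, 10):

```python
import numpy as np

# Joint pmf from the table: rows y = 1, 2, 3; columns x = 6, 8, 10
p = np.array([[0.2, 0.0, 0.2],
              [0.0, 0.2, 0.0],
              [0.2, 0.0, 0.2]])
xs = np.array([6.0, 8.0, 10.0])
ys = np.array([1.0, 2.0, 3.0])

fx = p.sum(axis=0)            # marginal of X: [0.4, 0.2, 0.4]
fy = p.sum(axis=1)            # marginal of Y: [0.4, 0.2, 0.4]
EX, EY = xs @ fx, ys @ fy
EXY = ys @ p @ xs
cov = EXY - EX * EY
print(round(EX, 6), round(EY, 6), round(EXY, 6))   # 8.0 2.0 16.0
print(abs(round(cov, 9)))                          # 0.0
# Yet X and Y are dependent: P(X=6, Y=2) = 0, while P(X=6) P(Y=2) = 0.4 * 0.2 = 0.08
print(p[1, 0], round(fx[0] * fy[1], 6))            # 0.0 0.08
```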
Bounding the correlation: for any two RVs U and V with finite second moments, let

$g(b) = E\left[(V - bU)^2\right] \ge 0 \quad \text{for all } b.$

We will pick b to minimize g(b).

$g(b) = E\left[(V - bU)^2\right] = E\left[V^2 - 2bVU + b^2 U^2\right] = E[V^2] - 2b\,E[VU] + b^2 E[U^2]$

$g'(b) = -2E[VU] + 2b\,E[U^2] = 0 \implies b = b_{\min} = \frac{E[VU]}{E[U^2]}$

Substituting back,

$g(b_{\min}) = E[V^2] - \frac{(E[VU])^2}{E[U^2]} \ge 0 \implies \frac{(E[VU])^2}{E[U^2]\,E[V^2]} \le 1$

Thus, taking $U = X - \mu_X$ and $V = Y - \mu_Y$,

$\rho_{XY}^2 = \frac{\left(E[(X - \mu_X)(Y - \mu_Y)]\right)^2}{E[(X - \mu_X)^2]\, E[(Y - \mu_Y)^2]} \le 1 \implies -1 \le \rho_{XY} \le 1$
Note: $\rho_{XY}^2 = 1$ if and only if $E\left[(V - b_{\min}U)^2\right] = 0$, i.e.,

$P[V = b_{\min}U] = 1 \iff P[Y = bX + a] = 1$

where

$b = b_{\min} = \frac{E[(X - \mu_X)(Y - \mu_Y)]}{E[(X - \mu_X)^2]} = \frac{\text{Cov}[X,Y]}{\text{Var}[X]} = \frac{\rho_{XY}\,\sigma_X\sigma_Y}{\sigma_X^2} = \rho_{XY}\frac{\sigma_Y}{\sigma_X}$

and $a = \mu_Y - b_{\min}\,\mu_X = \mu_Y - \rho_{XY}\frac{\sigma_Y}{\sigma_X}\mu_X$.
5. $\text{Var}[aX + bY] = a^2\,\text{Var}[X] + 2ab\,\text{Cov}[X,Y] + b^2\,\text{Var}[Y]$

Proof:
$\text{Var}[aX + bY] = E\left[(aX + bY - \mu_{aX+bY})^2\right]$ with $\mu_{aX+bY} = E[aX + bY] = a\mu_X + b\mu_Y$.

Thus,
$\text{Var}[aX + bY] = E\left[\left(a(X - \mu_X) + b(Y - \mu_Y)\right)^2\right]$
$= E\left[a^2(X - \mu_X)^2 + 2ab(X - \mu_X)(Y - \mu_Y) + b^2(Y - \mu_Y)^2\right]$
$= a^2\,\text{Var}[X] + 2ab\,\text{Cov}[X,Y] + b^2\,\text{Var}[Y]$

More generally,

$\text{Var}\left[\sum_{i=1}^n a_i X_i\right] = \sum_{i=1}^n a_i^2\,\text{Var}[X_i] + 2\sum_{i<j} a_i a_j\,\text{Cov}[X_i, X_j] = \sum_{i=1}^n a_i^2\,\text{Var}[X_i] \quad \text{if } X_1,\ldots,X_n \text{ are mutually independent.}$
Example (Binomial): let $X = X_1 + \cdots + X_n$, where the $X_i$ are independent Bernoulli(p) RVs and $q = 1 - p$. Then

$E[X_i] = 1\cdot p + 0\cdot q = p$
$\text{Var}[X_i] = (1-p)^2 p + (0-p)^2 q = (1-p)^2 p + p^2(1-p) = (1-p)(p - p^2 + p^2) = qp$
$\mu_X = E[X_1] + \cdots + E[X_n] = p + \cdots + p = np$
$\sigma_X^2 = \text{Var}[X_1] + \cdots + \text{Var}[X_n] = pq + \cdots + pq = npq$
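A direct check of np and npq from the binomial pmf (n and p are arbitrary choices):

```python
from math import comb

# Mean and variance of Binomial(n, p) computed from the pmf, vs np and npq
n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum(k**2 * pk for k, pk in enumerate(pmf)) - mean**2
print(round(mean, 6), round(var, 6))   # 3.0 2.1
```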
Conditional Expectation

Recall the conditional density of $X_1,\ldots,X_q$ given $X_{q+1},\ldots,X_k$:

$f_{1\cdots q \mid q+1\cdots k}(x_1,\ldots,x_q \mid x_{q+1},\ldots,x_k) = \frac{f(x_1,\ldots,x_k)}{f_{q+1\cdots k}(x_{q+1},\ldots,x_k)}$
The conditional expectation of $U = h(X_1,\ldots,X_k)$ given $X_{q+1} = x_{q+1},\ldots,X_k = x_k$ is

$E[U \mid x_{q+1},\ldots,x_k] = \int\cdots\int h(x_1,\ldots,x_k)\, f_{1\cdots q \mid q+1\cdots k}(x_1,\ldots,x_q \mid x_{q+1},\ldots,x_k)\,dx_1\cdots dx_q$
Example: let

$f(x,y,z) = \begin{cases} \frac{12}{7}(x^2 + yz) & 0 \le x \le 1,\ 0 \le y \le 1,\ 0 \le z \le 1 \\ 0 & \text{otherwise} \end{cases}$

Determine the conditional expectation of U = X² + Y + Z given X = x, Y = y.

We computed earlier that

$f_{12}(x,y) = \frac{12}{7}\left(x^2 + \frac{1}{2}y\right) \quad \text{for } 0 \le x \le 1,\ 0 \le y \le 1$
The conditional density of Z given X = x, Y = y is

$f_{3|12}(z \mid x, y) = \frac{f(x,y,z)}{f_{12}(x,y)} = \frac{x^2 + yz}{x^2 + \frac{1}{2}y} \quad \text{for } 0 \le z \le 1$

Hence,

$E[U \mid x, y] = \int_0^1 (x^2 + y + z)\,\frac{x^2 + yz}{x^2 + \frac{1}{2}y}\,dz = \frac{1}{x^2 + \frac{1}{2}y}\int_0^1 \left[(x^2 + y)(x^2 + yz) + z(x^2 + yz)\right] dz$

$= \frac{1}{x^2 + \frac{1}{2}y}\left[(x^2 + y)\left(x^2 z + \frac{y z^2}{2}\right) + \frac{x^2 z^2}{2} + \frac{y z^3}{3}\right]_{z=0}^{z=1}$

$= \frac{(x^2 + y)\left(x^2 + \frac{1}{2}y\right) + \frac{1}{2}x^2 + \frac{1}{3}y}{x^2 + \frac{1}{2}y} = x^2 + y + \frac{\frac{1}{2}x^2 + \frac{1}{3}y}{x^2 + \frac{1}{2}y}$
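The closed-form conditional expectation can be checked against a direct numerical integral at an arbitrary point, say x = y = 0.5:

```python
import numpy as np

# Check E[X^2 + Y + Z | x, y] = x^2 + y + (x^2/2 + y/3) / (x^2 + y/2) numerically
x, y = 0.5, 0.5
n = 100_000
z = (np.arange(n) + 0.5) / n                    # midpoint grid on (0, 1)
f_cond = (x**2 + y * z) / (x**2 + y / 2)        # conditional density f(z | x, y)
numeric = np.sum((x**2 + y + z) * f_cond) / n
closed = x**2 + y + (x**2 / 2 + y / 3) / (x**2 + y / 2)
print(round(numeric, 6), round(closed, 6))      # 1.333333 1.333333
```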
Theorem (Law of Iterated Expectations)
Let (x1, x2, …, xq, y1, y2, …, ym) = (x, y) denote q + m RVs.
Let U = g(x1, x2, …, xq, y1, y2, …, ym) = g(x, y). Then

$E[U] = E_y\left[E[U \mid y]\right]$
Proof: with $U = g(X, Y)$,

$E[U] = \int\!\!\int g(x,y)\, f(x,y)\,dx\,dy$

$E[U \mid Y = y] = \int g(x,y)\, f_{X|Y}(x \mid y)\,dx = \int g(x,y)\,\frac{f(x,y)}{f_Y(y)}\,dx$

hence

$E_Y\left[E[U \mid Y]\right] = \int E[U \mid y]\, f_Y(y)\,dy = \int\left[\int g(x,y)\,\frac{f(x,y)}{f_Y(y)}\,dx\right] f_Y(y)\,dy$
$= \int\!\!\int g(x,y)\, f(x,y)\,dx\,dy = E[U]$
Similarly, for the variance:

$\text{Var}[U] = E[U^2] - (E[U])^2 = E_Y\left[E[U^2 \mid Y]\right] - \left(E_Y\left[E[U \mid Y]\right]\right)^2$
$= E_Y\left[\text{Var}[U \mid Y] + (E[U \mid Y])^2\right] - \left(E_Y\left[E[U \mid Y]\right]\right)^2$
$= E_Y\left[\text{Var}[U \mid Y]\right] + E_Y\left[(E[U \mid Y])^2\right] - \left(E_Y\left[E[U \mid Y]\right]\right)^2$
$= E_Y\left[\text{Var}[U \mid Y]\right] + \text{Var}_Y\left[E[U \mid Y]\right]$
Example:
Suppose that a rectangle is constructed by first choosing its length, X, and then choosing its width, Y.
Its length X is selected from an exponential distribution with mean μ = 1/λ = 5. Once the length has been chosen, its width, Y, is selected from a uniform distribution from 0 to half its length.
Find the mean and variance of the area of the rectangle, A = XY.
Solution:

$f_X(x) = \tfrac{1}{5} e^{-x/5} \quad \text{for } x \ge 0$

$f_{Y|X}(y \mid x) = \frac{1}{x/2} \quad \text{for } 0 \le y \le x/2$

$f(x,y) = f_X(x)\, f_{Y|X}(y \mid x) = \tfrac{1}{5} e^{-x/5}\cdot\frac{1}{x/2} = \frac{2}{5x} e^{-x/5} \quad \text{for } 0 \le y \le x/2,\ x \ge 0$

Then

$E[A] = E[XY] = \int_0^\infty\!\int_0^{x/2} xy\cdot\frac{2}{5x} e^{-x/5}\,dy\,dx = \int_0^\infty\!\int_0^{x/2} \tfrac{2}{5}\, y\, e^{-x/5}\,dy\,dx$

$E[A^2] = E[X^2 Y^2] = \int_0^\infty\!\int_0^{x/2} x^2 y^2\cdot\frac{2}{5x} e^{-x/5}\,dy\,dx = \int_0^\infty\!\int_0^{x/2} \tfrac{2}{5}\, x y^2\, e^{-x/5}\,dy\,dx$

and $\text{Var}[A] = E[A^2] - (E[A])^2$.
$E[A] = \int_0^\infty \tfrac{2}{5}\left[\frac{y^2}{2}\right]_0^{x/2} e^{-x/5}\,dx = \int_0^\infty \tfrac{2}{5}\cdot\frac{x^2}{8}\, e^{-x/5}\,dx = \frac{1}{20}\int_0^\infty x^2 e^{-x/5}\,dx$

Using $\int_0^\infty x^k e^{-x/5}\,dx = k!\,5^{k+1}$,

$E[A] = \frac{1}{20}\cdot 2!\,5^3 = \frac{250}{20} = 12.5$

$E[A^2] = \int_0^\infty \tfrac{2}{5}\, x\left[\frac{y^3}{3}\right]_0^{x/2} e^{-x/5}\,dx = \int_0^\infty \tfrac{2}{5}\cdot\frac{x^4}{24}\, e^{-x/5}\,dx = \frac{1}{60}\int_0^\infty x^4 e^{-x/5}\,dx = \frac{1}{60}\cdot 4!\,5^5 = \frac{75000}{60} = 1250$

Thus $\text{Var}[A] = E[A^2] - (E[A])^2 = 1250 - 12.5^2 = 1093.75$.
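Both moments can be confirmed by one-dimensional numerical integration over x, after integrating out y analytically (E[A | X] = X²/4 and E[A² | X] = X⁴/12):

```python
import numpy as np

# Numeric check of E[A] = 12.5, E[A^2] = 1250, Var[A] = 1093.75
h = 0.001
x = np.arange(h / 2, 200, h)          # midpoint grid; tail truncated at x = 200
w = np.exp(-x / 5) / 5                # exponential density with mean 5
EA = np.sum(x**2 / 4 * w) * h         # E[A]   = E[X^2]/4
EA2 = np.sum(x**4 / 12 * w) * h       # E[A^2] = E[X^4]/12
print(round(EA, 2), round(EA2, 2), round(EA2 - EA**2, 2))   # 12.5 1250.0 1093.75
```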
Alternatively, using the law of iterated expectations:

$E[A] = E[XY] = E_X\left[E[XY \mid X]\right]$

Now $E[XY \mid X] = X\, E[Y \mid X] = X\cdot\frac{X}{4} = \frac{X^2}{4}$, since $Y \mid X \sim \text{Uniform}(0, X/2)$ has mean $X/4$.

Thus $E[A] = E\left[\tfrac{1}{4}X^2\right] = \tfrac{1}{4} E[X^2] = \tfrac{1}{4}\cdot\frac{2!}{\lambda^2}$

Note: $E[X^k] = \frac{k!}{\lambda^k}$ for the exponential distribution.

Thus $E[A] = \tfrac{1}{4}\cdot 2!\cdot 5^2 = \frac{2\cdot 25}{4} = 12.5$
$\text{Var}[A] = E_X\left[\text{Var}[XY \mid X]\right] + \text{Var}_X\left[E[XY \mid X]\right]$

Since $Y \mid X \sim \text{Uniform}(0, X/2)$, $\text{Var}[Y \mid X] = \frac{(X/2)^2}{12} = \frac{X^2}{48}$, so

$E_X\left[\text{Var}[XY \mid X]\right] = E_X\left[X^2\,\text{Var}[Y \mid X]\right] = E\left[\frac{X^4}{48}\right] = \frac{4!\,5^4}{48} = \frac{15000}{48} = 312.5$

$\text{Var}_X\left[E[XY \mid X]\right] = \text{Var}\left[\tfrac{1}{4}X^2\right] = \tfrac{1}{16}\left(E[X^4] - (E[X^2])^2\right) = \tfrac{1}{16}\left(4!\,5^4 - (2!\,5^2)^2\right) = \tfrac{1}{16}(15000 - 2500) = 781.25$

$\text{Var}[A] = 312.5 + 781.25 = 1093.75$
The joint moment generating function of $X = (X_1, X_2, \ldots, X_q)'$ is

$m_X(t) = E_X[\exp(t'X)]$

where t′ = (t1, t2, …, tq).

For the multivariate normal, $X \sim N(\mu, \Sigma)$,

$m_X(t) = \exp\left(t'\mu + \tfrac{1}{2}t'\Sigma t\right)$

where t = (t1, t2, …, tq)′, X = (X1, X2, …, Xq)′ and μ = (μ1, μ2, …, μq)′.
Recall the transformation formula for a single RV, U = h(X) with inverse X = h⁻¹(U):

$g(u) = f\left(h^{-1}(u)\right)\left|\frac{d\,h^{-1}(u)}{du}\right| = f(x)\left|\frac{dx}{du}\right|$
For n RVs, if $(U_1,\ldots,U_n)$ is a one-to-one transformation of $(X_1,\ldots,X_n)$,

$g(u_1,\ldots,u_n) = f(x_1,\ldots,x_n)\left|\frac{\partial(x_1,\ldots,x_n)}{\partial(u_1,\ldots,u_n)}\right| = f(x_1,\ldots,x_n)\,|J|$

where

$J = \det\begin{pmatrix} \dfrac{\partial x_1}{\partial u_1} & \cdots & \dfrac{\partial x_1}{\partial u_n} \\ \vdots & & \vdots \\ \dfrac{\partial x_n}{\partial u_1} & \cdots & \dfrac{\partial x_n}{\partial u_n} \end{pmatrix}$

is the Jacobian of the transformation.
Example: the distribution of a sum. Let $U_1 = X_1 + X_2$ and $U_2 = X_1 - X_2$, so that $x_1 = \frac{u_1+u_2}{2}$, $x_2 = \frac{u_1-u_2}{2}$ and $|J| = \frac{1}{2}$. If $X_1$ and $X_2$ are independent with densities $f_1$ and $f_2$, the joint density of $(U_1, U_2)$ is

$g(u_1, u_2) = f_1\!\left(\frac{u_1+u_2}{2}\right) f_2\!\left(\frac{u_1-u_2}{2}\right)\frac{1}{2}$

Put $v = \frac{u_1-u_2}{2}$; then $\frac{u_1+u_2}{2} = u_1 - v$ and $\left|\frac{dv}{du_2}\right| = \frac{1}{2}$.
Hence

$g_1(u_1) = \int f_1\!\left(\frac{u_1+u_2}{2}\right) f_2\!\left(\frac{u_1-u_2}{2}\right)\frac{1}{2}\,du_2 = \int f_1(u_1 - v)\, f_2(v)\,dv$

the convolution of $f_1$ and $f_2$.
Example: let X and Y be independent exponential(λ) RVs and U = X + Y. Then

$g_U(u) = \int_0^u f(u - y)\, f_Y(y)\,dy = \int_0^u \lambda e^{-\lambda(u-y)}\,\lambda e^{-\lambda y}\,dy = \lambda^2 e^{-\lambda u}\int_0^u dy = \lambda^2 u\, e^{-\lambda u}$

This is the gamma distribution when α = 2.
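The convolution can be checked numerically against the Gamma(2, λ) density λ²u e^{−λu} (λ and u are arbitrary choices):

```python
import numpy as np

# Numeric convolution of two Exp(lam) densities at a point u, vs lam^2 u e^{-lam u}
lam, u = 1.0, 3.0
h = 1e-4
y = np.arange(h / 2, u, h)                           # midpoints on (0, u)
conv = np.sum(lam * np.exp(-lam * (u - y)) * lam * np.exp(-lam * y)) * h
gamma2 = lam**2 * u * np.exp(-lam * u)
print(round(conv, 6), round(gamma2, 6))              # 0.149361 0.149361
```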
Example: let X be exponential(λ) and Y ~ N(0, σ²), independent, and let U = X + Y, where

$f_2(y) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{y^2}{2\sigma^2}}$

Then

$g(u) = \int_0^\infty \lambda e^{-\lambda v}\,\frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(u-v)^2}{2\sigma^2}}\,dv$
or

$g(u) = \frac{\lambda}{\sqrt{2\pi}\,\sigma}\int_0^\infty e^{-\frac{(u-v)^2 + 2\sigma^2\lambda v}{2\sigma^2}}\,dv = \frac{\lambda}{\sqrt{2\pi}\,\sigma}\int_0^\infty e^{-\frac{v^2 - 2uv + u^2 + 2\sigma^2\lambda v}{2\sigma^2}}\,dv = \frac{\lambda}{\sqrt{2\pi}\,\sigma}\int_0^\infty e^{-\frac{v^2 - 2(u - \lambda\sigma^2)v + u^2}{2\sigma^2}}\,dv$

Completing the square in v,

$v^2 - 2(u - \lambda\sigma^2)v + u^2 = \left(v - (u - \lambda\sigma^2)\right)^2 + 2u\lambda\sigma^2 - \lambda^2\sigma^4$

so

$g(u) = \lambda\, e^{\frac{\lambda^2\sigma^2}{2} - \lambda u}\int_0^\infty \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{\left(v - (u - \lambda\sigma^2)\right)^2}{2\sigma^2}}\,dv = \lambda\, e^{\frac{\lambda^2\sigma^2}{2} - \lambda u}\, P[V \ge 0]$
where V is normally distributed with mean $u - \lambda\sigma^2$ and variance $\sigma^2$. That is,

$g(u) = \lambda\, e^{\frac{\lambda^2\sigma^2}{2} - \lambda u}\,\Phi\!\left(\frac{u - \lambda\sigma^2}{\sigma}\right)$

[Figure: plot of g(u) for 0 ≤ u ≤ 30]
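The closed form can be validated against a direct numerical evaluation of the convolution integral (parameter values are arbitrary; Φ is built from math.erf):

```python
import numpy as np
from math import erf, sqrt, pi

# Density of U = X + Y with X ~ Exp(lam), Y ~ N(0, sigma^2):
# closed form lam * exp(lam^2 sigma^2 / 2 - lam u) * Phi((u - lam sigma^2) / sigma)
lam, sigma, u = 1.0, 2.0, 5.0

Phi = lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0)))     # standard normal cdf
closed = lam * np.exp(lam**2 * sigma**2 / 2 - lam * u) * Phi((u - lam * sigma**2) / sigma)

h = 1e-3
v = np.arange(h / 2, 60, h)                          # x-support, tail truncated
normal = np.exp(-(u - v)**2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
numeric = np.sum(lam * np.exp(-lam * v) * normal) * h
print(round(closed, 6), round(numeric, 6))           # both ≈ 0.034426
```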
Theorem 7.2. Let the n×1 vector y ~ N(0, In). Then y′y ~ χ²n.

Theorem 7.3. Let the n×1 vector y ~ N(0, σ² In) and M be a symmetric idempotent matrix of rank m. Then

y′My/σ² ~ χ²tr(M) = χ²m
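A Monte Carlo sketch of Theorem 7.3, using a least-squares projection matrix as M (idempotent of rank m):

```python
import numpy as np

# y ~ N(0, sigma^2 I_n), M idempotent of rank m  =>  y'My / sigma^2 ~ chi^2_m,
# which has mean m and variance 2m
rng = np.random.default_rng(0)
n, m, sigma = 8, 3, 2.0
X = rng.standard_normal((n, m))
M = X @ np.linalg.inv(X.T @ X) @ X.T          # projection onto col(X): MM = M, tr(M) = m
assert np.allclose(M @ M, M)

y = sigma * rng.standard_normal((100_000, n))
q = np.einsum("ij,jk,ik->i", y, M, y) / sigma**2
print(round(np.trace(M), 4))                  # 3.0
print(round(q.mean(), 1), round(q.var(), 1))  # close to 3.0 and 6.0
```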
Theorem 7.5
Let the n×1 vector y ~ N(0, I) and M be an n×n symmetric matrix. Then the characteristic function of y′My is |I − 2itM|⁻¹ᐟ².

Proof:
$\varphi_{y'My}(t) = E_y\left[e^{it\,y'My}\right] = \frac{1}{(2\pi)^{n/2}}\int e^{it\,y'My}\, e^{-y'y/2}\,dy = \frac{1}{(2\pi)^{n/2}}\int e^{-y'(I - 2itM)y/2}\,dy = |I - 2itM|^{-1/2}$
Theorem 7.6
Let the n×1 vector y ~ N(0, I), M be an n×n idempotent matrix of rank m, let L be an n×n idempotent matrix of rank s, and suppose ML = 0. Then y′My and y′Ly are independently distributed variables.

Proof:
By Theorem 7.3, both quadratic forms are χ² distributed variables. We only need to prove independence. From Theorem 7.5, we have

$\varphi_{y'My}(t) = E_y\left[e^{it\,y'My}\right] = |I - 2itM|^{-1/2}$
$\varphi_{y'Ly}(t) = E_y\left[e^{it\,y'Ly}\right] = |I - 2itL|^{-1/2}$

The forms will be independently distributed if $\varphi_{y'(M+L)y} = \varphi_{y'My}\,\varphi_{y'Ly}$. That is,

$\varphi_{y'(M+L)y}(t) = E_y\left[e^{it\,y'(M+L)y}\right] = |I - 2it(M+L)|^{-1/2} = |I - 2itM|^{-1/2}\,|I - 2itL|^{-1/2}$

Since $|I - 2itM|\,|I - 2itL| = |(I - 2itM)(I - 2itL)| = |I - 2it(M+L) - 4t^2 ML|$, the above result holds when ML = 0.
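A companion Monte Carlo sketch of Theorem 7.6, using the complementary projections M and L = I − M (so ML = 0): the two quadratic forms are then uncorrelated across simulations, consistent with independence.

```python
import numpy as np

# With L = I - M and M a projection, ML = 0, so y'My and y'Ly are independent
rng = np.random.default_rng(0)
n, m = 8, 3
X = rng.standard_normal((n, m))
M = X @ np.linalg.inv(X.T @ X) @ X.T
L = np.eye(n) - M                            # idempotent of rank n - m, and ML = 0
assert np.allclose(M @ L, np.zeros((n, n)))

y = rng.standard_normal((100_000, n))
qM = np.einsum("ij,jk,ik->i", y, M, y)       # ~ chi^2_m
qL = np.einsum("ij,jk,ik->i", y, L, y)       # ~ chi^2_{n-m}
corr = np.corrcoef(qM, qL)[0, 1]
print(abs(corr) < 0.02)                      # True: correlation is sampling noise
```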