
S. van de Geer        Fundamentals of Mathematical Statistics        AS 2017

Series 2

1. Suppose that $X_1$ and $X_2$ are i.i.d. copies of an exponentially distributed random variable $X$ with
parameter $\lambda > 0$. The density of $X$ is then given by

$$ f_X(x) = \begin{cases} \lambda e^{-\lambda x}, & \text{if } x \geq 0, \\ 0, & \text{otherwise.} \end{cases} $$

a) Compute the density of the sum S = X1 + X2 using the convolution formula.


b) Compute the density of the sum S = X1 + X2 using characteristic functions.
c) Give an explicit expression for the conditional density $f_{X_1, X_2}(x_1, x_2 \mid S = s)$.
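(As a pointer for parts a) and b), independent of the intended solution path: for independent random variables $X_1$ and $X_2$ with densities $f_{X_1}$ and $f_{X_2}$, the convolution formula and the characteristic-function approach rest on

$$ f_S(s) = \int_{-\infty}^{\infty} f_{X_1}(x)\, f_{X_2}(s - x)\, dx \qquad \text{and} \qquad \varphi_S(t) = E\big[e^{itS}\big] = \varphi_{X_1}(t)\, \varphi_{X_2}(t), $$

respectively.)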

2. Suppose that $X_1, \dots, X_n$ are i.i.d. copies of a Bernoulli random variable $X$ with parameter $\theta$, $0 < \theta < 1$.
The distribution of $X$ is then given by

$$ P(X = 1) = 1 - P(X = 0) = \theta. $$

a) What is the distribution of the sum $S = \sum_{i=1}^n X_i$?
b) Give an explicit expression for the conditional distribution $P(X_1 = x_1, \dots, X_n = x_n \mid S = s)$.
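(A pointer for part b), phrased generally and not tied to a particular solution: by the definition of conditional probability, for $x_1, \dots, x_n \in \{0, 1\}$ with $\sum_{i=1}^n x_i = s$ one has

$$ P(X_1 = x_1, \dots, X_n = x_n \mid S = s) = \frac{P(X_1 = x_1, \dots, X_n = x_n)}{P(S = s)}, $$

since the event $\{X_1 = x_1, \dots, X_n = x_n\}$ is then contained in $\{S = s\}$.)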

3. (Fisher's theorem) Let $X_1, \dots, X_n \sim N(0, 1)$ be i.i.d. random variables and write $\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i$ for their sample mean. Prove the following:

a) $\sqrt{n}\, \bar{X}_n \sim N(0, 1)$,

b) $\bar{X}_n$ and $\sum_{i=1}^n (X_i - \bar{X}_n)^2$ are independent,

c) $\sum_{i=1}^n (X_i - \bar{X}_n)^2 \sim \chi^2_{n-1}$.

Hint: Start with the vector $e_1 = (1/\sqrt{n}, \dots, 1/\sqrt{n})$, $e_1 \in \mathbb{R}^n$, so that the Euclidean norm of $e_1$
is $\|e_1\| = 1$. Recall that by Gram-Schmidt orthogonalization it is always possible to find vectors
$e_2, \dots, e_n \in \mathbb{R}^n$ such that the matrix

$$ W = \begin{pmatrix} e_1 \\ e_2 \\ \vdots \\ e_n \end{pmatrix} $$

is orthogonal, i.e., $W^T W = I_n$. Note that to solve this exercise, you do not need to explicitly
calculate $e_2, \dots, e_n$. Next, let

$$ X = \begin{pmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{pmatrix} \quad \text{and} \quad Y = W X = \begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix}; $$

then $X \sim N(0, I_n)$, where $0^T = (0, 0, \dots, 0)$. What is the distribution and the Euclidean norm
of $Y$? Show that $\sum_{i=1}^n (X_i - \bar{X}_n)^2 = \|Y\|^2 - Y_1^2$. Use the previous questions to prove (a)-(c).
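(Two facts that may help when using this hint, stated as a sketch rather than a full argument: since $W$ is orthogonal, $\|Y\|^2 = Y^T Y = X^T W^T W X = \|X\|^2$, and the first coordinate of $Y$ is $Y_1 = e_1 X = \frac{1}{\sqrt{n}} \sum_{i=1}^n X_i = \sqrt{n}\, \bar{X}_n$.)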
Preliminary discussion: Tuesday, October 03.
