
Carnegie Mellon University Department of Statistics

36-705 Intermediate Statistics Homework #4


Solutions
October 6, 2016

Problem 1 [30 pts.]


(a) [20 pts.] Find a minimal sufficient statistic.
A statistic $T$ is minimal sufficient if $\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}$ does not depend on $(\mu,\Sigma)$ if and only if $T(y^n) = T(x^n)$. Suppose $X_1,\dots,X_n \sim N(\mu,\Sigma)$ and $Y_1,\dots,Y_n \sim N(\mu,\Sigma)$ are $d$-dimensional. We will show that $(\bar{X}, S_X)$ is minimal sufficient, where
$$S_X = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X})(X_i - \bar{X})^T.$$
Let us start by deriving an expression for $\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}$. We see that
$$\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}
= \frac{\prod_{i=1}^n (2\pi)^{-d/2}|\Sigma|^{-1/2}\exp\{-\frac{1}{2}(y_i-\mu)^T\Sigma^{-1}(y_i-\mu)\}}{\prod_{i=1}^n (2\pi)^{-d/2}|\Sigma|^{-1/2}\exp\{-\frac{1}{2}(x_i-\mu)^T\Sigma^{-1}(x_i-\mu)\}}
= \frac{\exp\{-\sum_{i=1}^n \frac{1}{2}(y_i-\mu)^T\Sigma^{-1}(y_i-\mu)\}}{\exp\{-\sum_{i=1}^n \frac{1}{2}(x_i-\mu)^T\Sigma^{-1}(x_i-\mu)\}}$$
$$= \exp\left\{-\frac{1}{2}\sum_{i=1}^n (y_i-\mu)^T\Sigma^{-1}(y_i-\mu) + \frac{1}{2}\sum_{i=1}^n (x_i-\mu)^T\Sigma^{-1}(x_i-\mu)\right\}.$$
Note that $(y_i-\mu)^T$ is $1\times d$, $\Sigma^{-1}$ is $d\times d$, and $(y_i-\mu)$ is $d\times 1$. So $(y_i-\mu)^T\Sigma^{-1}(y_i-\mu)$ is a scalar and will equal its own trace. Let us rewrite $\sum_{i=1}^n (y_i-\mu)^T\Sigma^{-1}(y_i-\mu)$.
$$\sum_{i=1}^n (y_i-\mu)^T\Sigma^{-1}(y_i-\mu) = \sum_{i=1}^n (y_i-\bar{y}+\bar{y}-\mu)^T\Sigma^{-1}(y_i-\bar{y}+\bar{y}-\mu)$$
$$= \sum_{i=1}^n \mathrm{tr}\left[(y_i-\bar{y}+\bar{y}-\mu)^T\Sigma^{-1}(y_i-\bar{y}+\bar{y}-\mu)\right]$$
$$= \mathrm{tr}\left[\Sigma^{-1}\sum_{i=1}^n (y_i-\bar{y}+\bar{y}-\mu)(y_i-\bar{y}+\bar{y}-\mu)^T\right]$$
$$= \mathrm{tr}\left[\Sigma^{-1}\sum_{i=1}^n \left\{(y_i-\bar{y})(y_i-\bar{y})^T + 2(y_i-\bar{y})(\bar{y}-\mu)^T + (\bar{y}-\mu)(\bar{y}-\mu)^T\right\}\right]$$
$$= \mathrm{tr}\left[\Sigma^{-1}\sum_{i=1}^n (y_i-\bar{y})(y_i-\bar{y})^T\right] + \mathrm{tr}\left[2\Sigma^{-1}\sum_{i=1}^n (y_i-\bar{y})(\bar{y}-\mu)^T\right] + \mathrm{tr}\left[\Sigma^{-1}\sum_{i=1}^n (\bar{y}-\mu)(\bar{y}-\mu)^T\right]$$
$$= \mathrm{tr}\left[\Sigma^{-1}\sum_{i=1}^n (y_i-\bar{y})(y_i-\bar{y})^T\right] + \mathrm{tr}\left[(\bar{y}-\mu)^T\, 2\Sigma^{-1}\sum_{i=1}^n (y_i-\bar{y})\right] + \sum_{i=1}^n \mathrm{tr}\left[(\bar{y}-\mu)^T\Sigma^{-1}(\bar{y}-\mu)\right]$$
$$= \mathrm{tr}\{\Sigma^{-1}\, n S_y\} + 0 + n(\bar{y}-\mu)^T\Sigma^{-1}(\bar{y}-\mu)$$
$$= n\,\mathrm{tr}\{\Sigma^{-1} S_y\} + n(\bar{y}-\mu)^T\Sigma^{-1}(\bar{y}-\mu),$$
where the middle term vanishes because $\sum_{i=1}^n (y_i - \bar{y}) = 0$.

Now we see that
$$\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)} = \exp\left\{-\frac{1}{2}\left[n\,\mathrm{tr}\{\Sigma^{-1}S_y\} + n(\bar{y}-\mu)^T\Sigma^{-1}(\bar{y}-\mu) - n\,\mathrm{tr}\{\Sigma^{-1}S_x\} - n(\bar{x}-\mu)^T\Sigma^{-1}(\bar{x}-\mu)\right]\right\}.$$


Now we can show that $\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}$ does not depend on $(\mu,\Sigma)$ if and only if $\bar{x} = \bar{y}$ and $S_x = S_y$.

($\Rightarrow$) Suppose that $\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}$ does not depend on $(\mu,\Sigma)$. Define
$$A = -\frac{1}{2}\left[n\,\mathrm{tr}\{\Sigma^{-1}S_y\} + n(\bar{y}-\mu)^T\Sigma^{-1}(\bar{y}-\mu) - n\,\mathrm{tr}\{\Sigma^{-1}S_x\} - n(\bar{x}-\mu)^T\Sigma^{-1}(\bar{x}-\mu)\right].$$
Since $\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}$ does not depend on $\mu$, we know that
$$0 = \frac{\partial}{\partial\mu}\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)} = \exp[A]\,\frac{\partial}{\partial\mu}A.$$
Since $\exp[A] > 0$, that means that
$$0 = \frac{\partial}{\partial\mu}A$$
$$= -\frac{1}{2}\,n\,\frac{\partial}{\partial\mu}(\bar{y}-\mu)^T\Sigma^{-1}(\bar{y}-\mu) + \frac{1}{2}\,n\,\frac{\partial}{\partial\mu}(\bar{x}-\mu)^T\Sigma^{-1}(\bar{x}-\mu)$$
$$= -\frac{1}{2}\,n\,\frac{\partial}{\partial\mu}(\mu-\bar{y})^T\Sigma^{-1}(\mu-\bar{y}) + \frac{1}{2}\,n\,\frac{\partial}{\partial\mu}(\mu-\bar{x})^T\Sigma^{-1}(\mu-\bar{x})$$
$$= -\frac{n}{2}\left[\Sigma^{-1} + (\Sigma^{-1})^T\right](\mu-\bar{y}) + \frac{n}{2}\left[\Sigma^{-1} + (\Sigma^{-1})^T\right](\mu-\bar{x})$$
$$= -\frac{n}{2}\left[\Sigma^{-1}(\mu-\bar{y}) + \Sigma^{-1}(\mu-\bar{y})\right] + \frac{n}{2}\left[\Sigma^{-1}(\mu-\bar{x}) + \Sigma^{-1}(\mu-\bar{x})\right] \qquad (\Sigma^{-1}\ \text{is symmetric})$$
$$= -n\Sigma^{-1}(\mu-\bar{y}) + n\Sigma^{-1}(\mu-\bar{x}).$$
Thus, we see that
$$n\Sigma^{-1}(\mu-\bar{y}) = n\Sigma^{-1}(\mu-\bar{x})$$
$$\Sigma\,\Sigma^{-1}(\mu-\bar{y}) = \Sigma\,\Sigma^{-1}(\mu-\bar{x})$$
$$\mu-\bar{y} = \mu-\bar{x}$$
$$\bar{y} = \bar{x}.$$

Also, since $\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}$ does not depend on $\Sigma$, we know that
$$0 = \frac{\partial}{\partial\Sigma}\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)} = \exp[A]\,\frac{\partial}{\partial\Sigma}A.$$
That means that (using $\bar{y} = \bar{x}$ from above, so the quadratic terms in $A$ cancel, and $\frac{\partial}{\partial\Sigma}\mathrm{tr}\{\Sigma^{-1}S\} = -\Sigma^{-1}S\Sigma^{-1}$)
$$0 = \frac{\partial}{\partial\Sigma}A$$
$$= -\frac{1}{2}\,n\,\frac{\partial}{\partial\Sigma}\mathrm{tr}\{\Sigma^{-1}S_y\} + \frac{1}{2}\,n\,\frac{\partial}{\partial\Sigma}\mathrm{tr}\{\Sigma^{-1}S_x\}$$
$$= \frac{n}{2}\{\Sigma^{-1}S_y\Sigma^{-1}\} - \frac{n}{2}\{\Sigma^{-1}S_x\Sigma^{-1}\}.$$
So
$$\frac{n}{2}\{\Sigma^{-1}S_x\Sigma^{-1}\} = \frac{n}{2}\{\Sigma^{-1}S_y\Sigma^{-1}\}$$
$$\Sigma^{-1}S_x\Sigma^{-1} = \Sigma^{-1}S_y\Sigma^{-1}$$
$$\Sigma\,\Sigma^{-1}S_x\Sigma^{-1}\,\Sigma = \Sigma\,\Sigma^{-1}S_y\Sigma^{-1}\,\Sigma$$
$$S_x = S_y.$$

Thus, we have shown that $\bar{x} = \bar{y}$ and $S_x = S_y$.


($\Leftarrow$) Suppose that $\bar{x} = \bar{y}$ and $S_x = S_y$. Then
$$\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)} = \exp\left\{-\frac{1}{2}\left[n\,\mathrm{tr}\{\Sigma^{-1}S_y\} + n(\bar{y}-\mu)^T\Sigma^{-1}(\bar{y}-\mu) - n\,\mathrm{tr}\{\Sigma^{-1}S_x\} - n(\bar{x}-\mu)^T\Sigma^{-1}(\bar{x}-\mu)\right]\right\}$$
$$= \exp\left\{-\frac{1}{2}\left[n\,\mathrm{tr}\{\Sigma^{-1}S_x\} + n(\bar{x}-\mu)^T\Sigma^{-1}(\bar{x}-\mu) - n\,\mathrm{tr}\{\Sigma^{-1}S_x\} - n(\bar{x}-\mu)^T\Sigma^{-1}(\bar{x}-\mu)\right]\right\}$$
$$= \exp(0) = 1.$$
So $\frac{p(y^n;\mu,\Sigma)}{p(x^n;\mu,\Sigma)}$ does not depend on $(\mu,\Sigma)$.

We conclude that $(\bar{X}, S_X)$ is a minimal sufficient statistic.
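As a numeric sanity check (not part of the solution), the ($\Leftarrow$) direction says that two different samples with the same sample mean and sample covariance have likelihood ratio identically $1$. The sketch below, assuming only `numpy`, builds such a pair by reflecting each point through the sample mean, which preserves both $\bar{x}$ and $S_x$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 3

# Reflecting each point through the sample mean, y_i = 2*xbar - x_i, gives a
# different sample with the same mean and the same S = (1/n) sum (x_i-xbar)(x_i-xbar)^T.
x = rng.normal(size=(n, d))
xbar = x.mean(axis=0)
y = 2 * xbar - x

Sx = (x - xbar).T @ (x - xbar) / n
Sy = (y - y.mean(axis=0)).T @ (y - y.mean(axis=0)) / n
assert np.allclose(y.mean(axis=0), xbar) and np.allclose(Sx, Sy)

def loglik(data, mu, Sigma):
    """Sum of N(mu, Sigma) log-densities over the rows of `data`."""
    diff = data - mu
    quad = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)
    _, logdet = np.linalg.slogdet(Sigma)
    return -0.5 * (quad.sum() + data.shape[0] * (logdet + d * np.log(2 * np.pi)))

# The log-likelihood ratio should be 0 at every (mu, Sigma), even though x != y.
for _ in range(5):
    mu = rng.normal(size=d)
    A = rng.normal(size=(d, d))
    Sigma = A @ A.T + d * np.eye(d)   # a random positive-definite matrix
    assert abs(loglik(y, mu, Sigma) - loglik(x, mu, Sigma)) < 1e-8
```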

(b) [10 pts.] Show that $X_1 + X_2$ is not a sufficient statistic.

Since $(\bar{X}, S_X)$ is minimal sufficient for $(\mu, \Sigma)$, if $X_1 + X_2$ were sufficient there would be a function $f$ such that $(\bar{X}, S_X) = f(X_1 + X_2)$, which is clearly impossible.
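To see concretely why no such $f$ can exist, here is a hypothetical pair of datasets (illustrative values only, not from the problem) that agree on $X_1 + X_2$ but have different sample means:

```python
import numpy as np

# Two datasets with identical first two rows (so X1 + X2 agrees) but a
# different third observation, hence different (Xbar, S_X): no function of
# X1 + X2 can recover the minimal sufficient statistic.
x = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
y = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])

assert np.allclose(x[0] + x[1], y[0] + y[1])              # X1 + X2 agrees
assert not np.allclose(x.mean(axis=0), y.mean(axis=0))    # sample means differ
```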

Problem 2 [30 pts.]


(a) [20 pts.] Find the distribution of $(X_1, X_2)$ given $T$, where $X_1, X_2 \sim \mathrm{Uniform}(0,\theta)$ are independent and $T = \max\{X_1, X_2\}$.

Since $X_1$ and $X_2$ are independent, we see that
$$f_{X_1,X_2}(x_1,x_2) = f_{X_1}(x_1)\,f_{X_2}(x_2) = \left(\frac{1}{\theta}\right) I(X_1 \le \theta)\left(\frac{1}{\theta}\right) I(X_2 \le \theta) = \left(\frac{1}{\theta}\right)^2 I(\max\{X_1,X_2\} \le \theta) = \left(\frac{1}{\theta}\right)^2 I(T \le \theta).$$
Also,
$$f_{X_1,X_2,T}(x_1,x_2,t) = \begin{cases} 0 & \max\{x_1,x_2\} \ne t \\ f_{X_1,X_2}(x_1,x_2) & \max\{x_1,x_2\} = t \end{cases}
= \begin{cases} 0 & \max\{x_1,x_2\} \ne t \\ \left(\frac{1}{\theta}\right)^2 I(t \le \theta) & \max\{x_1,x_2\} = t. \end{cases}$$

Next, let us solve for $f_T(t)$. We determine
$$F_T(t) = P(T \le t) = P(X_1 \le t,\ X_2 \le t) = P(X_1 \le t)\,P(X_2 \le t).$$
We know that
$$P(X_1 \le t) = \begin{cases} 0 & t < 0 \\ t/\theta & 0 \le t \le \theta \\ 1 & t > \theta. \end{cases}$$
So
$$F_T(t) = \begin{cases} 0 & t < 0 \\ (t/\theta)^2 & 0 \le t \le \theta \\ 1 & t > \theta. \end{cases}$$
Differentiating with respect to $t$, we see that $f_T(t)$ will be non-zero on the interval $0 \le t \le \theta$. Specifically,
$$f_T(t) = \frac{2t}{\theta^2}, \qquad 0 \le t \le \theta.$$
Hence $f_{X_1,X_2\mid T}(x_1,x_2\mid t)$ is defined where $t = \max\{x_1,x_2\}$ (and hence $t$ is in the interval $0 \le t \le \theta$). For $0 \le x_1 \le \theta$, $0 \le x_2 \le \theta$, and $t = \max\{x_1,x_2\}$,
$$f_{X_1,X_2\mid T}(x_1,x_2\mid t) = \frac{f_{X_1,X_2,T}(x_1,x_2,t)}{f_T(t)} = \frac{(1/\theta)^2}{2t/\theta^2} = \frac{1}{2t}.$$
We conclude that
$$f_{X_1,X_2\mid T}(x_1,x_2\mid t) = \begin{cases} \dfrac{1}{2t} & 0 \le x_1 \le \theta,\ 0 \le x_2 \le \theta,\ t = \max\{x_1,x_2\} \\ 0 & \text{else.} \end{cases}$$
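A quick Monte Carlo sketch (illustrative only, assuming `numpy`) can confirm $f_T(t) = 2t/\theta^2$ through its implied mean and CDF, and that the non-maximum coordinate given $T \approx t$ is roughly uniform on $[0, t]$:

```python
import numpy as np

# Under f_T(t) = 2t/theta^2 on [0, theta]: E[T] = 2*theta/3 and F_T(t) = (t/theta)^2.
rng = np.random.default_rng(1)
theta = 2.0
x = rng.uniform(0, theta, size=(1_000_000, 2))
T = x.max(axis=1)

assert abs(T.mean() - 2 * theta / 3) < 0.01               # E[T] = 2θ/3
t0 = 1.5
assert abs((T <= t0).mean() - (t0 / theta) ** 2) < 0.01   # F_T(t0) = (t0/θ)²

# Given T ≈ t0, the conditional density 1/(2t) says the other coordinate is
# Uniform(0, t0), so its mean should be about t0/2.
other = x.min(axis=1)[np.abs(T - t0) < 0.01]
assert abs(other.mean() - t0 / 2) < 0.05
```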

(b) [10 pts.] Show that $X_1 + X_2$ is not sufficient.

Consider the ratio
$$R(x^n, y^n; \theta) = \frac{p(y_1,y_2;\theta)}{p(x_1,x_2;\theta)}
= \frac{\left(\frac{1}{\theta}\right)^2 1\{\theta \ge \max\{Y_1,Y_2\}\}}{\left(\frac{1}{\theta}\right)^2 1\{\theta \ge \max\{X_1,X_2\}\}}
= \frac{1\{\theta \ge \max\{Y_1,Y_2\}\}}{1\{\theta \ge \max\{X_1,X_2\}\}}. \qquad (1)$$
(1) is independent of $\theta$ if and only if $\max\{X_1,X_2\} = \max\{Y_1,Y_2\}$, so $\max\{X_1,X_2\}$ is a minimal sufficient statistic for $\theta$. If $X_1 + X_2$ were sufficient, there would be a function $f$ such that $\max\{X_1,X_2\} = f(X_1+X_2)$, which is clearly impossible.
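A concrete counterexample makes the last step tangible: the hypothetical pairs $(0.2, 0.8)$ and $(0.5, 0.5)$ below (illustrative values only) share the same sum but have different maxima, so their likelihoods disagree at some $\theta$:

```python
def lik(x1, x2, theta):
    """Uniform(0, theta) likelihood of the pair (x1, x2)."""
    return (1 / theta) ** 2 if max(x1, x2) <= theta else 0.0

# Both pairs have sum 1.0, but their maxima differ (0.8 vs 0.5).
assert abs((0.2 + 0.8) - (0.5 + 0.5)) < 1e-12
assert lik(0.2, 0.8, 0.6) == 0.0   # theta = 0.6 < max = 0.8: likelihood is 0
assert lik(0.5, 0.5, 0.6) > 0.0    # theta = 0.6 >= max = 0.5: positive
# So the ratio of the two likelihoods depends on theta even though X1 + X2
# agrees; X1 + X2 cannot be sufficient.
```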


Problem 3
$$L(\theta) = \prod_{i=1}^n \left(\frac{1}{3\theta}\right) 1\{-\theta \le X_i \le 2\theta\}
= \left(\frac{1}{3\theta}\right)^n 1\{-X_{(1)} \le \theta,\ X_{(n)}/2 \le \theta\}
= \left(\frac{1}{3\theta}\right)^n 1\{\theta \ge \max\{-X_{(1)},\ X_{(n)}/2\}\}.$$
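The likelihood can also be checked numerically: since $(1/(3\theta))^n$ is decreasing in $\theta$, $L(\theta)$ is maximized at the smallest $\theta$ the indicator allows, i.e. at $\hat\theta = \max\{-X_{(1)}, X_{(n)}/2\}$. A small grid-search sketch, assuming $X_i \sim \mathrm{Uniform}(-\theta, 2\theta)$ (which is what the density $1/(3\theta)$ on $[-\theta, 2\theta]$ suggests):

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true = 1.5
x = rng.uniform(-theta_true, 2 * theta_true, size=50)

def L(theta):
    # Likelihood is zero unless -theta <= all X_i <= 2*theta.
    if theta <= 0 or x.min() < -theta or x.max() > 2 * theta:
        return 0.0
    return (1 / (3 * theta)) ** len(x)

theta_hat = max(-x.min(), x.max() / 2)
grid = np.linspace(0.01, 3, 5000)
best = grid[np.argmax([L(t) for t in grid])]
assert abs(best - theta_hat) < 1e-2   # grid maximizer sits at theta_hat
```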

Problem 4
By linearity of expectation, with $\hat\theta = \frac{1}{n}\sum_{i=1}^n X_i$, $E[X_i] = \theta$, and $V(X_i) = \sigma^2$, we have that
$$E[\hat\theta] = \frac{1}{n}\sum_{i=1}^n E[X_i] = \frac{1}{n}\, n\theta = \theta,$$
so $\mathrm{bias} = 0$. Moreover, since the $X_i$'s are independent,
$$V(\hat\theta) = \frac{1}{n^2}\sum_{i=1}^n V(X_i) = \frac{1}{n^2}\, n\sigma^2 = \frac{\sigma^2}{n},$$
so that $\mathrm{se} = \frac{\sigma}{\sqrt{n}}$ and $\mathrm{MSE}(\hat\theta) = 0^2 + \frac{\sigma^2}{n} = \frac{\sigma^2}{n}$.
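A simulation sketch confirms that the sample mean is unbiased with variance $\sigma^2/n$ (the Exponential distribution here is an arbitrary illustration, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 25, 200_000
theta, sigma2 = 2.0, 4.0   # Exponential(scale=2): mean 2, variance 4
est = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)

assert abs(est.mean() - theta) < 0.01        # bias ~ 0
assert abs(est.var() - sigma2 / n) < 0.01    # V(theta_hat) = sigma^2 / n = 0.16
```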

Problem 5 [10 pts.]

Here $X_1,\dots,X_n \sim \mathrm{Uniform}(0,\theta)$ and $\hat\theta = 2\bar{X}_n$. Then
$$E[\hat\theta] = E[2\bar{X}_n] = 2E[\bar{X}_n] = 2\cdot\frac{\theta}{2} = \theta,$$
so $\mathrm{Bias}(\hat\theta) = E[\hat\theta] - \theta = 0$. Also,
$$\mathrm{se}(\hat\theta) = \sqrt{V(2\bar{X}_n)} = \sqrt{4V(\bar{X}_n)} = \sqrt{4\cdot\frac{\theta^2}{12n}} = \frac{\theta}{\sqrt{3n}}.$$
Thus,
$$E[(\hat\theta - \theta)^2] = \mathrm{Bias}^2(\hat\theta) + \mathrm{se}^2(\hat\theta) = \frac{\theta^2}{3n}.$$
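The MSE formula can be verified by simulation (a sketch, assuming `numpy`):

```python
import numpy as np

# theta_hat = 2 * Xbar for X_i ~ Uniform(0, theta) should have MSE = theta^2/(3n).
rng = np.random.default_rng(4)
theta, n, reps = 3.0, 10, 200_000
theta_hat = 2 * rng.uniform(0, theta, size=(reps, n)).mean(axis=1)

mse = np.mean((theta_hat - theta) ** 2)
assert abs(theta_hat.mean() - theta) < 0.01       # unbiased
assert abs(mse - theta ** 2 / (3 * n)) < 0.01     # MSE = 9/30 = 0.3
```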

Problem 6 [30 pts.]

(a) [10 pts.] The first and second moments of $X_1 \sim \mathrm{Uniform}(a,b)$ are:
$$E[X_1] = \frac{a+b}{2}$$
$$E[X_1^2] = \int_a^b \frac{1}{b-a}\, x^2\, dx = \frac{a^2+ab+b^2}{3}.$$
Let $\hat\alpha_1 = \frac{1}{n}\sum_{i=1}^n X_i$ and $\hat\alpha_2 = \frac{1}{n}\sum_{i=1}^n X_i^2$ be the first and second sample moments. By solving the following system of equations
$$\begin{cases} \hat\alpha_1 = \dfrac{b+a}{2} \\ \hat\alpha_2 = \dfrac{a^2+ab+b^2}{3} \end{cases}$$
we obtain the following estimators:
$$\hat{a} = \hat\alpha_1 - \sqrt{3(\hat\alpha_2 - \hat\alpha_1^2)}$$
$$\hat{b} = \hat\alpha_1 + \sqrt{3(\hat\alpha_2 - \hat\alpha_1^2)}.$$
To obtain $\hat{a}$, for example, note that the first equation gives us that $2\hat\alpha_1 - \hat{a} = \hat{b}$; replacing this in the second equation:
$$\frac{\hat{a}^2 + \hat{a}(2\hat\alpha_1 - \hat{a}) + (2\hat\alpha_1 - \hat{a})^2}{3} = \hat\alpha_2.$$
Expanding the square, cancelling and regrouping we obtain:
$$(\hat{a} - \hat\alpha_1)^2 = 3(\hat\alpha_2 - \hat\alpha_1^2),$$
so that $\hat{a} = \hat\alpha_1 - \sqrt{3(\hat\alpha_2 - \hat\alpha_1^2)}$ (taking the negative root, since $\hat{a} \le \hat\alpha_1$). $\hat{b}$ is obtained similarly.
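The method-of-moments formulas translate directly to code; this sketch checks them on simulated $\mathrm{Uniform}(-1, 4)$ data (the endpoints are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(5)
a_true, b_true = -1.0, 4.0
x = rng.uniform(a_true, b_true, size=100_000)

# Method-of-moments estimators derived above.
m1, m2 = x.mean(), (x ** 2).mean()
half_width = np.sqrt(3 * (m2 - m1 ** 2))
a_hat, b_hat = m1 - half_width, m1 + half_width

assert abs(a_hat - a_true) < 0.05
assert abs(b_hat - b_true) < 0.05
```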

(b) [10 pts.] The likelihood function of $x = (x_1, \dots, x_n)$ is:
$$\mathcal{L}(a,b) = \prod_{i=1}^n (b-a)^{-1} I_{(a,b)}(x_i) = (b-a)^{-n}\, I_{(-\infty,\, x_{(1)})}(a)\, I_{(x_{(n)},\, \infty)}(b).$$
By inspection this is maximized when $b - a$ is as small as possible while also satisfying $a \le x_{(1)} \le \cdots \le x_{(n)} \le b$. So the MLE estimators are $\hat{a} = X_{(1)}$, $\hat{b} = X_{(n)}$.

(c) [10 pts.] We have $E[X_1] = \mu = \frac{a+b}{2}$. By the equivariance property of the MLE, the MLE of $\mu$ is:
$$\hat\mu = \frac{\hat{a} + \hat{b}}{2} = \frac{X_{(1)} + X_{(n)}}{2}.$$
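The MLEs and the plug-in estimator of the midpoint are one-liners; a small sketch (arbitrary endpoints, assuming `numpy`):

```python
import numpy as np

rng = np.random.default_rng(6)
a_true, b_true = -1.0, 4.0
x = rng.uniform(a_true, b_true, size=10_000)

# MLEs from part (b) and the equivariant MLE of mu from part (c).
a_mle, b_mle = x.min(), x.max()
mu_mle = (a_mle + b_mle) / 2

assert a_true <= a_mle < b_mle <= b_true     # MLE lies inside the true support
assert abs(mu_mle - (a_true + b_true) / 2) < 0.01
```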
