
(a)

Let $\eta = (\theta, \lambda)$, where $\theta$ is the parameter vector of interest and $\lambda$ is a vector of nuisance parameters. The conditional likelihood can be obtained as follows:

1. Find the complete sufficient statistic $S$ for $\lambda$.

2. Construct the conditional log-likelihood
$$ l_c = \log f_{Y\mid S}, $$
where $f_{Y\mid S}$ is the conditional distribution of the response $Y = [Y_1, \ldots, Y_n]^t$ given $S$.

Two cases might occur. One is that, for fixed $\theta_0$, $S(\theta_0)$ depends on $\theta_0$. The other is that $S(\lambda) = S$ is independent of $\theta$. The following examples illustrate the use of conditional likelihood.
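Step 2 rests on the factorization of the joint density; spelling it out (a standard identity, added here for clarity, not part of the original notes) shows why conditioning removes the nuisance parameter when $S$ is sufficient for $\lambda$:

```latex
f_Y(y;\theta,\lambda) \;=\; f_{Y\mid S}(y \mid s;\theta)\, f_S(s;\theta,\lambda)
\quad\Longrightarrow\quad
\log f_Y(y;\theta,\lambda) \;=\; l_c + \log f_S(s;\theta,\lambda),
```

so sufficiency of $S$ for $\lambda$ makes $f_{Y\mid S}$, and hence $l_c$, free of $\lambda$.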


Example 1: $Y_1, Y_2, \ldots, Y_n$ i.i.d. $N(\mu, \sigma^2)$,
$$
f(y) = \left(2\pi\sigma^2\right)^{-n/2}\exp\!\left(-\sum_{i=1}^n\frac{(y_i-\mu)^2}{2\sigma^2}\right)
= \left(2\pi\sigma^2\right)^{-n/2}\exp\!\left(-\sum_{i=1}^n\frac{y_i^2}{2\sigma^2}+\frac{\mu\sum_{i=1}^n y_i}{\sigma^2}-\frac{n\mu^2}{2\sigma^2}\right).
$$

$(S_1, S_2) = \left(\sum_{i=1}^n Y_i,\ \sum_{i=1}^n Y_i^2\right)$ are complete sufficient statistics for $(\mu, \sigma^2)$. Therefore, if $\sigma^2$ is the parameter of interest and $\mu$ is the nuisance parameter, the conditional density function of $Y_1, Y_2, \ldots, Y_n$ given $S = \sum_{i=1}^n Y_i$ is
$$
f_{Y\mid S} = \frac{f_Y(y)}{f_S(t)}
= \frac{\left(2\pi\sigma^2\right)^{-n/2}\exp\!\left(-\sum_{i=1}^n\frac{y_i^2}{2\sigma^2}+\frac{\mu t}{\sigma^2}-\frac{n\mu^2}{2\sigma^2}\right)}
{\left(2\pi n\sigma^2\right)^{-1/2}\exp\!\left(-\frac{(t-n\mu)^2}{2n\sigma^2}\right)},
$$
where $t = \sum_{i=1}^n y_i$ and $S \sim N(n\mu, n\sigma^2)$. Expanding $(t-n\mu)^2/(2n\sigma^2) = t^2/(2n\sigma^2) - \mu t/\sigma^2 + n\mu^2/(2\sigma^2)$ cancels every term involving $\mu$, so
$$
f_{Y\mid S}
= C\,(\sigma^2)^{-(n-1)/2}\exp\!\left(-\sum_{i=1}^n\frac{y_i^2}{2\sigma^2}+\frac{t^2}{2n\sigma^2}\right)
= C\,(\sigma^2)^{-(n-1)/2}\exp\!\left(-\frac{\sum_{i=1}^n(y_i-\bar y)^2}{2\sigma^2}\right),
$$
since $\sum_{i=1}^n y_i^2 - t^2/n = \sum_{i=1}^n(y_i-\bar y)^2$, where $C = n^{1/2}(2\pi)^{-(n-1)/2}$ is free of both parameters.

Therefore, the conditional log-likelihood is
$$
l_c(\sigma^2) = \log C - \frac{n-1}{2}\log(\sigma^2) - \frac{\sum_{i=1}^n(y_i-\bar y)^2}{2\sigma^2},
$$
which only depends on $\sigma^2$. Thus, we can conduct statistical inference for $\sigma^2$ based on the conditional log-likelihood. Note that the sufficient statistic for $\mu$, $S = \sum_{i=1}^n Y_i$, is independent of $\sigma^2$, as in case (I) of the general approach below.
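As a quick numerical check of Example 1 (a sketch, not part of the original notes; the sample size, seed, and true parameter values below are illustrative assumptions), maximizing $l_c(\sigma^2)$ over a grid recovers the closed-form maximizer $\sum(y_i-\bar y)^2/(n-1)$, with divisor $n-1$ rather than the full-likelihood divisor $n$:

```python
import numpy as np

# Illustrative data; n, mu = 3, sigma = 2 are assumptions, not from the notes
rng = np.random.default_rng(0)
n = 50
y = rng.normal(loc=3.0, scale=2.0, size=n)
ss = float(np.sum((y - y.mean()) ** 2))  # sum of squared deviations

def l_c(sigma2):
    # Conditional log-likelihood from Example 1, constant log C dropped
    return -(n - 1) / 2 * np.log(sigma2) - ss / (2 * sigma2)

grid = np.linspace(0.1, 20.0, 400_001)
sigma2_hat = float(grid[np.argmax(l_c(grid))])

# Setting d l_c / d sigma^2 = 0 gives sigma^2 = ss / (n - 1)
closed_form = ss / (n - 1)
full_mle = ss / n  # maximizer of the full log-likelihood, for contrast
```

The grid maximizer agrees with the stationary point $ss/(n-1)$; the contrast with $ss/n$ shows how conditioning on $S = \sum Y_i$ removes the effect of estimating the nuisance mean.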

Example 2: $Y_1 \sim N(\mu_1, 1)$ and $Y_2 \sim N(\mu_2, 1)$ are independent. Suppose $\theta = \mu_2/\mu_1$ is the parameter of interest and $\mu_1$ is the nuisance parameter. Then, the sufficient statistic for the nuisance parameter given $\theta = \theta_0$ is
$$
S(\theta_0) = Y_1 + \theta_0 Y_2 \sim N\!\left(\mu_1 + \theta_0\mu_2,\ 1 + \theta_0^2\right).
$$

The conditional density is
$$
f_{Y\mid S}
= \frac{\dfrac{1}{2\pi}\exp\!\left(-\dfrac{(y_1-\mu_1)^2 + (y_2-\mu_2)^2}{2}\right)}
{\dfrac{1}{\sqrt{2\pi\left(1+\theta_0^2\right)}}\exp\!\left(-\dfrac{\left(y_1+\theta_0 y_2-\mu_1-\theta_0\mu_2\right)^2}{2\left(1+\theta_0^2\right)}\right)}
= c(\theta_0)\exp\!\left(-\frac{\left(y_2-\theta_0 y_1-(\theta-\theta_0)\mu_1\right)^2}{2\left(1+\theta_0^2\right)}\right),
$$
where $c(\theta_0) = \sqrt{\left(1+\theta_0^2\right)/(2\pi)}$. Thus,
$$
l_c(\theta, \mu_1, \theta_0) = -\frac{\left(y_2-\theta_0 y_1-(\theta-\theta_0)\mu_1\right)^2}{2\left(1+\theta_0^2\right)} + \log\left[c(\theta_0)\right].
$$

Thus, we can make statistical inference for $\theta$ based on $l_c(\theta, \mu_1, \theta_0)$. For example, to find the maximum conditional likelihood estimate, we can solve the score equation
$$
U = \left.\frac{\partial l_c(\theta, \mu_1, \theta_0)}{\partial\theta}\right|_{\theta_0=\theta}
= \left.\frac{\mu_1\left(y_2-\theta_0 y_1-(\theta-\theta_0)\mu_1\right)}{1+\theta_0^2}\right|_{\theta_0=\theta}
= \frac{\mu_1\left(y_2-\theta y_1\right)}{1+\theta^2} = 0
\;\Longrightarrow\;
\hat\theta = \frac{y_2}{y_1}
$$
(intuitively, it is a sensible estimate, since $E[Y_2]/E[Y_1] = \mu_2/\mu_1 = \theta$).
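A small sketch (the observations below are simulated, and the true values of $\mu_1$ and $\theta$ are purely illustrative assumptions) confirms that the score $U$ from Example 2 vanishes exactly at $\hat\theta = y_2/y_1$, whatever value of the nuisance parameter is plugged in:

```python
import numpy as np

# Illustrative simulated observations; mu1 and theta are assumed true values
rng = np.random.default_rng(1)
mu1, theta = 2.0, 1.5                      # nuisance and parameter of interest
y1 = float(rng.normal(mu1, 1.0))           # Y1 ~ N(mu1, 1)
y2 = float(rng.normal(theta * mu1, 1.0))   # Y2 ~ N(mu2, 1), mu2 = theta * mu1

def score(th, mu1_val):
    # U = mu1 * (y2 - th * y1) / (1 + th^2): Example 2 score at theta_0 = th
    return mu1_val * (y2 - th * y1) / (1 + th ** 2)

theta_hat = y2 / y1  # root of U; it does not depend on the mu1 value used
```

The root is $y_2/y_1$ regardless of `mu1_val`, because $U = 0$ reduces to $y_2 - \theta y_1 = 0$.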

General Conditional Likelihood Approach:

(I) For $S(\lambda) = S$ independent of $\theta$, the conditional log-likelihood (which only depends on $\theta$) can be obtained,
$$
l_c(\theta) = \log f_{Y\mid S} = \log\left[f_Y(y)\right] - \log\left[f_S(s)\right].
$$
Then, $\hat\theta$ maximizing $l_c(\theta)$ is the maximum conditional likelihood estimate. To estimate the variance of $\hat\theta$, the conditional Fisher information can be used,
$$
I_{\theta\mid S} = E\!\left[-\frac{\partial^2 l_c(\theta)}{\partial\theta\,\partial\theta^t}\right].
$$
Note that both $\hat\theta$ and $I_{\theta\mid S}$ are, in general, different from those derived from the full likelihood.

(II) For $S(\theta_0)$ dependent on $\theta_0$, the conditional log-likelihood (which depends on $\theta$, $\lambda$, and $\theta_0$) can be obtained,
$$
l_c(\theta, \lambda, \theta_0) = \log f_{Y\mid S(\theta_0)} = \log\left[f_Y(y)\right] - \log\left[f_{S(\theta_0)}(s)\right].
$$
Then, $\hat\theta$ is the solution of
$$
U = \left.\frac{\partial l_c(\theta, \lambda, \theta_0)}{\partial\theta}\right|_{\theta_0=\theta,\;\lambda=\hat\lambda(\theta)} = 0.
$$
The asymptotic variance of $\hat\theta$ is the inverse of
$$
\left.E\!\left[-\frac{\partial^2 l_c(\theta, \lambda, \theta_0)}{\partial\theta\,\partial\theta^t}\right]\right|_{\theta_0=\theta,\;\lambda=\hat\lambda(\theta)}.
$$
Note: $l^*(\theta, \lambda) \equiv l_c(\theta, \lambda, \theta)$ is not the logarithm of a density and does not ordinarily have the properties of a log-likelihood function.
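To illustrate the remark under (I) that conditional quantities differ from their full-likelihood counterparts, the sketch below (an illustration under assumed values of $n$ and $\sigma^2$, not part of the original notes) recovers the conditional Fisher information for $\sigma^2$ in Example 1, $(n-1)/(2\sigma^4)$, by finite differences of the expected conditional log-likelihood, and contrasts it with the full-likelihood information $n/(2\sigma^4)$:

```python
import numpy as np

n = 30            # assumed sample size
sigma2_0 = 2.0    # assumed true value of sigma^2
B = (n - 1) * sigma2_0  # E[sum (y_i - ybar)^2] under the true parameters

def expected_lc(x):
    # E[l_c(x)] from Example 1 up to a constant: ss replaced by its mean B
    return -(n - 1) / 2 * np.log(x) - B / (2 * x)

# Central finite-difference second derivative at the true sigma^2
h = 1e-4
second = (expected_lc(sigma2_0 + h) - 2 * expected_lc(sigma2_0)
          + expected_lc(sigma2_0 - h)) / h ** 2

info_conditional = -second                       # numeric I_{sigma^2 | S}
info_analytic = (n - 1) / (2 * sigma2_0 ** 2)    # (n - 1) / (2 sigma^4)
info_full = n / (2 * sigma2_0 ** 2)              # full-likelihood information
```

The conditional information is smaller by the factor $(n-1)/n$: conditioning on $S$ discards whatever information about $\sigma^2$ the marginal distribution of $S$ carries.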
