
8. φk,k = Partial Autocorrelation Coefficient of AR(2) Process:

$$
\begin{pmatrix}
1 & \rho(1) & \cdots & \rho(k-2) & \rho(k-1)\\
\rho(1) & 1 & \cdots & \rho(k-3) & \rho(k-2)\\
\vdots & \vdots & \ddots & \vdots & \vdots\\
\rho(k-1) & \rho(k-2) & \cdots & \rho(1) & 1
\end{pmatrix}
\begin{pmatrix}
\phi_{k,1}\\ \phi_{k,2}\\ \vdots\\ \phi_{k,k-1}\\ \phi_{k,k}
\end{pmatrix}
=
\begin{pmatrix}
\rho(1)\\ \rho(2)\\ \vdots\\ \rho(k)
\end{pmatrix},
$$

for k = 1, 2, · · ·.

$$
\phi_{k,k} =
\frac{\begin{vmatrix}
1 & \rho(1) & \cdots & \rho(k-2) & \rho(1)\\
\rho(1) & 1 & \cdots & \rho(k-3) & \rho(2)\\
\vdots & \vdots & \ddots & \vdots & \vdots\\
\rho(k-1) & \rho(k-2) & \cdots & \rho(1) & \rho(k)
\end{vmatrix}}
{\begin{vmatrix}
1 & \rho(1) & \cdots & \rho(k-2) & \rho(k-1)\\
\rho(1) & 1 & \cdots & \rho(k-3) & \rho(k-2)\\
\vdots & \vdots & \ddots & \vdots & \vdots\\
\rho(k-1) & \rho(k-2) & \cdots & \rho(1) & 1
\end{vmatrix}}
$$
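As a numerical illustration (not from the original notes), the sketch below solves the k × k system directly with numpy instead of evaluating the two determinants; the helper name and the example ρ values are ours.

```python
# A minimal sketch: compute the partial autocorrelation phi_{k,k} by solving
# the k x k Yule-Walker-type system above (rho values are illustrative).
import numpy as np

def pacf_kk(rho, k):
    """rho[j] = rho(j) for j = 0, 1, ..., k (with rho[0] = 1)."""
    R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])  # correlation matrix
    r = np.array([rho[j] for j in range(1, k + 1)])                        # right-hand side
    phi = np.linalg.solve(R, r)                                            # (phi_{k,1}, ..., phi_{k,k})
    return phi[-1]

# Example with hypothetical values rho(1) = 0.5, rho(2) = 0.4, rho(3) = 0.3:
rho = [1.0, 0.5, 0.4, 0.3]
print([pacf_kk(rho, k) for k in (1, 2, 3)])
```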
Autocovariance Functions:

γ(1) = φ1 γ(0) + φ2 γ(1),

γ(2) = φ1 γ(1) + φ2 γ(0),

γ(τ) = φ1 γ(τ − 1) + φ2 γ(τ − 2), for τ = 3, 4, · · ·.

Autocorrelation Functions:
ρ(1) = φ1 + φ2 ρ(1) = φ1 / (1 − φ2),

ρ(2) = φ1 ρ(1) + φ2 = φ1² / (1 − φ2) + φ2,

ρ(τ) = φ1 ρ(τ − 1) + φ2 ρ(τ − 2), for τ = 3, 4, · · ·.
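A minimal sketch of this recursion, with illustrative coefficients φ1 = 0.5, φ2 = 0.3 (our choice, satisfying stationarity):

```python
# Sketch: theoretical autocorrelations of an AR(2) process from the recursion above.
# The coefficient values are illustrative only.
phi1, phi2 = 0.5, 0.3

rho = [1.0]
rho.append(phi1 / (1.0 - phi2))            # rho(1) = phi1 / (1 - phi2)
for tau in range(2, 11):                   # rho(tau) = phi1*rho(tau-1) + phi2*rho(tau-2)
    rho.append(phi1 * rho[tau - 1] + phi2 * rho[tau - 2])

print([round(r, 4) for r in rho])
```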
$$
\phi_{1,1} = \rho(1) = \frac{\phi_1}{1-\phi_2}
$$

$$
\phi_{2,2} =
\frac{\begin{vmatrix} 1 & \rho(1)\\ \rho(1) & \rho(2) \end{vmatrix}}
{\begin{vmatrix} 1 & \rho(1)\\ \rho(1) & 1 \end{vmatrix}}
= \frac{\rho(2) - \rho(1)^2}{1 - \rho(1)^2} = \phi_2
$$

$$
\phi_{3,3} =
\frac{\begin{vmatrix} 1 & \rho(1) & \rho(1)\\ \rho(1) & 1 & \rho(2)\\ \rho(2) & \rho(1) & \rho(3) \end{vmatrix}}
{\begin{vmatrix} 1 & \rho(1) & \rho(2)\\ \rho(1) & 1 & \rho(1)\\ \rho(2) & \rho(1) & 1 \end{vmatrix}}
= \frac{(\rho(3) - \rho(1)\rho(2)) - \rho(1)^2(\rho(3) - \rho(1)) + \rho(2)\rho(1)(\rho(2) - 1)}
{(1 - \rho(1)^2) - \rho(1)^2(1 - \rho(2)) + \rho(2)(\rho(1)^2 - \rho(2))} = 0.
$$
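A quick numerical check of the two determinant ratios above, again with the illustrative values φ1 = 0.5, φ2 = 0.3:

```python
# Sketch: verify numerically that phi_{2,2} = phi2 and phi_{3,3} = 0 for an AR(2)
# process, using the determinant ratios above (coefficients are illustrative).
import numpy as np

phi1, phi2 = 0.5, 0.3
r1 = phi1 / (1.0 - phi2)                 # rho(1)
r2 = phi1 * r1 + phi2                    # rho(2)
r3 = phi1 * r2 + phi2 * r1               # rho(3)

num2 = np.linalg.det(np.array([[1.0, r1], [r1, r2]]))
den2 = np.linalg.det(np.array([[1.0, r1], [r1, 1.0]]))
print(num2 / den2, phi2)                 # phi_{2,2} equals phi2

num3 = np.linalg.det(np.array([[1.0, r1, r1], [r1, 1.0, r2], [r2, r1, r3]]))
den3 = np.linalg.det(np.array([[1.0, r1, r2], [r1, 1.0, r1], [r2, r1, 1.0]]))
print(num3 / den3)                       # phi_{3,3} is (numerically) zero
```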
9. AR(2)+drift: yt = µ + φ1 yt−1 + φ2 yt−2 + εt

Mean:

Rewriting the AR(2)+drift model,

φ(L)yt = µ + εt

where φ(L) = 1 − φ1 L − φ2 L².

Under the stationarity assumption, we can rewrite the AR(2)+drift model as follows:

yt = φ(L)⁻¹ µ + φ(L)⁻¹ εt .

Therefore,

E(yt ) = φ(L)⁻¹ µ + φ(L)⁻¹ E(εt ) = φ(1)⁻¹ µ = µ / (1 − φ1 − φ2).

Example: AR(p) model: Consider yt = φ1 yt−1 + φ2 yt−2 + · · · + φp yt−p + εt .

1. Variance of AR(p) Process:

Under the stationarity condition (i.e., the p solutions of x from φ(x) = 0 are
outside the unit circle),

γ(0) = σ² / (1 − φ1 ρ(1) − · · · − φp ρ(p)).
Note that γ(τ) = ρ(τ)γ(0).

Solve the following simultaneous equations for τ = 0, 1, · · · , p:

$$
\gamma(\tau) = \mathrm{E}\big((y_t-\mu)(y_{t-\tau}-\mu)\big) = \mathrm{E}(y_t y_{t-\tau}) =
\begin{cases}
\phi_1\gamma(\tau-1) + \phi_2\gamma(\tau-2) + \cdots + \phi_p\gamma(\tau-p), & \text{for } \tau \neq 0,\\[4pt]
\phi_1\gamma(\tau-1) + \phi_2\gamma(\tau-2) + \cdots + \phi_p\gamma(\tau-p) + \sigma^2, & \text{for } \tau = 0.
\end{cases}
$$
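For the AR(2) case these simultaneous equations reduce to a 3 × 3 linear system; a sketch with the illustrative values φ1 = 0.5, φ2 = 0.3, σ² = 1 (our choice):

```python
# Sketch: solve the simultaneous equations for gamma(0), gamma(1), gamma(2) of an
# AR(2) process (illustrative phi1, phi2, sigma2; uses gamma(-tau) = gamma(tau)).
import numpy as np

phi1, phi2, sigma2 = 0.5, 0.3, 1.0

# gamma(0) = phi1*gamma(1) + phi2*gamma(2) + sigma2
# gamma(1) = phi1*gamma(0) + phi2*gamma(1)
# gamma(2) = phi1*gamma(1) + phi2*gamma(0)
A = np.array([[1.0,   -phi1,       -phi2],
              [-phi1, 1.0 - phi2,  0.0],
              [-phi2, -phi1,       1.0]])
b = np.array([sigma2, 0.0, 0.0])
gamma0, gamma1, gamma2 = np.linalg.solve(A, b)
print(gamma0, gamma1, gamma2)

# Cross-check with gamma(0) = sigma2 / (1 - phi1*rho(1) - phi2*rho(2)):
rho1, rho2 = gamma1 / gamma0, gamma2 / gamma0
print(sigma2 / (1.0 - phi1 * rho1 - phi2 * rho2))
```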
2. Estimation of AR(p) Model:

1. OLS:

$$
\min_{\phi_1,\cdots,\phi_p}\ \sum_{t=p+1}^{T} (y_t - \phi_1 y_{t-1} - \phi_2 y_{t-2} - \cdots - \phi_p y_{t-p})^2
$$
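A minimal sketch of this least-squares fit (the simulated AR(2) series and its parameter values are purely illustrative):

```python
# Sketch: OLS estimation of an AR(p) model by regressing y_t on its p lags.
# The simulated AR(2) series and its parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
phi_true, sigma, T, p = np.array([0.5, 0.3]), 1.0, 2000, 2

y = np.zeros(T)
for t in range(p, T):                       # y_t = 0.5 y_{t-1} + 0.3 y_{t-2} + e_t
    y[t] = phi_true @ y[t - p:t][::-1] + sigma * rng.standard_normal()

X = np.column_stack([y[p - j:T - j] for j in range(1, p + 1)])  # columns: y_{t-1}, ..., y_{t-p}
target = y[p:T]
phi_ols, *_ = np.linalg.lstsq(X, target, rcond=None)
print(phi_ols)                              # should be close to (0.5, 0.3)
```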

2. MLE:

$$
\max_{\phi_1,\cdots,\phi_p}\ \log f(y_T,\cdots,y_1)
$$

where

$$
\log f(y_T,\cdots,y_1) = \log f(y_p,\cdots,y_2,y_1) + \sum_{t=p+1}^{T}\log f(y_t \mid y_{t-1},\cdots,y_1),
$$

$$
f(y_p,\cdots,y_2,y_1) = (2\pi)^{-p/2}\,|V|^{-1/2}\exp\!\left(-\frac{1}{2}\,(y_1\ y_2\ \cdots\ y_p)\,V^{-1}\!\begin{pmatrix} y_1\\ y_2\\ \vdots\\ y_p\end{pmatrix}\right),
$$

$$
V = \gamma(0)\begin{pmatrix}
1 & \rho(1) & \cdots & \rho(p-2) & \rho(p-1)\\
\rho(1) & 1 & \cdots & \rho(p-3) & \rho(p-2)\\
\vdots & \vdots & \ddots & \vdots & \vdots\\
\rho(p-1) & \rho(p-2) & \cdots & \rho(1) & 1
\end{pmatrix},
$$

$$
f(y_t \mid y_{t-1},\cdots,y_1) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\left(-\frac{1}{2\sigma^2}\,(y_t - \phi_1 y_{t-1} - \phi_2 y_{t-2} - \cdots - \phi_p y_{t-p})^2\right).
$$
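The sketch below maximizes only the conditional part of this log-likelihood, i.e. the sum of log f(yt | yt−1, · · · , y1), and drops the log f(yp, · · · , y1) term; this conditional simplification is a common shortcut and is our assumption, not something stated in the notes. The simulated data and starting values are illustrative.

```python
# Sketch: conditional Gaussian (quasi-)MLE for an AR(2) model, maximizing only the
# sum of log f(y_t | y_{t-1}, ..., y_1) terms and ignoring log f(y_p, ..., y_1).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
phi_true, sigma_true, T, p = np.array([0.5, 0.3]), 1.0, 1000, 2
y = np.zeros(T)
for t in range(p, T):
    y[t] = phi_true @ y[t - p:t][::-1] + sigma_true * rng.standard_normal()

def neg_cond_loglik(params):
    phi, log_sigma2 = params[:p], params[p]
    sigma2 = np.exp(log_sigma2)                       # keep sigma^2 positive
    resid = y[p:] - sum(phi[j] * y[p - 1 - j:T - 1 - j] for j in range(p))
    return 0.5 * ((T - p) * np.log(2 * np.pi * sigma2) + resid @ resid / sigma2)

res = minimize(neg_cond_loglik, x0=np.zeros(p + 1), method="L-BFGS-B")
print(res.x[:p], np.exp(res.x[p]))                    # estimates of (phi1, phi2) and sigma^2
```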
3. Yule-Walker Equation:

Multiply both sides of yt = φ1 yt−1 + φ2 yt−2 + · · · + φp yt−p + εt by yt−1, yt−2, · · ·, yt−p, take expectations for each case, and divide by the sample variance γ̂(0).

$$
\begin{pmatrix}
1 & \hat\rho(1) & \cdots & \hat\rho(p-2) & \hat\rho(p-1)\\
\hat\rho(1) & 1 & \cdots & \hat\rho(p-3) & \hat\rho(p-2)\\
\vdots & \vdots & \ddots & \vdots & \vdots\\
\hat\rho(p-1) & \hat\rho(p-2) & \cdots & \hat\rho(1) & 1
\end{pmatrix}
\begin{pmatrix}
\phi_1\\ \phi_2\\ \vdots\\ \phi_{p-1}\\ \phi_p
\end{pmatrix}
=
\begin{pmatrix}
\hat\rho(1)\\ \hat\rho(2)\\ \vdots\\ \hat\rho(p)
\end{pmatrix}
$$
where

$$
\hat\gamma(\tau) = \frac{1}{T}\sum_{t=\tau+1}^{T}(y_t - \hat\mu)(y_{t-\tau} - \hat\mu), \qquad
\hat\rho(\tau) = \frac{\hat\gamma(\tau)}{\hat\gamma(0)}.
$$
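A sketch of the Yule-Walker estimator built from these sample moments (the simulated AR(2) data are illustrative; scipy's toeplitz is used only to form the ρ̂ matrix):

```python
# Sketch: Yule-Walker estimation of an AR(p) model from data, following the
# hat-rho system above (the simulated AR(2) series is illustrative only).
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(2)
phi_true, T, p = np.array([0.5, 0.3]), 5000, 2
y = np.zeros(T)
for t in range(p, T):
    y[t] = phi_true @ y[t - p:t][::-1] + rng.standard_normal()

mu_hat = y.mean()
gamma_hat = np.array([np.sum((y[tau:] - mu_hat) * (y[:T - tau] - mu_hat)) / T
                      for tau in range(p + 1)])       # gamma_hat(0), ..., gamma_hat(p)
rho_hat = gamma_hat / gamma_hat[0]

R = toeplitz(rho_hat[:p])                             # p x p matrix of hat-rho's
phi_hat = np.linalg.solve(R, rho_hat[1:p + 1])
print(phi_hat)                                        # close to (0.5, 0.3)
```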

3. AR(p)+drift: yt = µ + φ1 yt−1 + φ2 yt−2 + · · · + φp yt−p + εt

Mean:
φ(L)yt = µ + εt

where φ(L) = 1 − φ1 L − φ2 L² − · · · − φp L^p.

yt = φ(L)⁻¹ µ + φ(L)⁻¹ εt

Taking the expectation on both sides,

E(yt ) = φ(L)⁻¹ µ + φ(L)⁻¹ E(εt ) = φ(1)⁻¹ µ = µ / (1 − φ1 − φ2 − · · · − φp).

4. Partial Autocorrelation of AR( p) Process:

φk,k = 0 for k = p + 1, p + 2, · · ·.

1.3 MA Model

MA (Moving Average) Model:

1. MA(q):

yt = εt + θ1 εt−1 + θ2 εt−2 + · · · + θq εt−q ,

which is rewritten as:

yt = θ(L)εt ,

where

θ(L) = 1 + θ1 L + θ2 L² + · · · + θq L^q.
2. Invertibility:

The q solutions of x from θ(x) = 1 + θ1 x + θ2 x² + · · · + θq x^q = 0 are outside the unit circle.

=⇒ The MA(q) model is rewritten as an AR(∞) model.

Example: MA(1) Model: yt = εt + θ1 εt−1

1. Mean of MA(1) Process:

E(yt ) = E(εt + θ1 εt−1 ) = E(εt ) + θ1 E(εt−1 ) = 0
2. Autocovariance Function of MA(1) Process:

$$
\begin{aligned}
\gamma(0) &= \mathrm{E}(y_t^2) = \mathrm{E}\big((\varepsilon_t + \theta_1\varepsilon_{t-1})^2\big) = \mathrm{E}(\varepsilon_t^2 + 2\theta_1\varepsilon_t\varepsilon_{t-1} + \theta_1^2\varepsilon_{t-1}^2)\\
&= \mathrm{E}(\varepsilon_t^2) + 2\theta_1\mathrm{E}(\varepsilon_t\varepsilon_{t-1}) + \theta_1^2\mathrm{E}(\varepsilon_{t-1}^2) = (1 + \theta_1^2)\sigma^2\\
\gamma(1) &= \mathrm{E}(y_t y_{t-1}) = \mathrm{E}\big((\varepsilon_t + \theta_1\varepsilon_{t-1})(\varepsilon_{t-1} + \theta_1\varepsilon_{t-2})\big) = \theta_1\sigma^2\\
\gamma(2) &= \mathrm{E}(y_t y_{t-2}) = \mathrm{E}\big((\varepsilon_t + \theta_1\varepsilon_{t-1})(\varepsilon_{t-2} + \theta_1\varepsilon_{t-3})\big) = 0
\end{aligned}
$$
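A short simulation check of these formulas (θ1, σ and the sample size are illustrative):

```python
# Sketch: compare the theoretical MA(1) autocovariances above with sample
# autocovariances from a simulated series (theta1 and sigma are illustrative).
import numpy as np

rng = np.random.default_rng(3)
theta1, sigma, T = 0.6, 1.0, 100000
eps = sigma * rng.standard_normal(T + 1)
y = eps[1:] + theta1 * eps[:-1]                 # y_t = eps_t + theta1*eps_{t-1}

def sample_gamma(y, tau):
    yc = y - y.mean()
    return np.sum(yc[tau:] * yc[:len(y) - tau]) / len(y)

print([round(sample_gamma(y, tau), 3) for tau in range(3)])
print([(1 + theta1**2) * sigma**2, theta1 * sigma**2, 0.0])   # theoretical gamma(0..2)
```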
3. Autocorrelation Function of MA(1) Process:

$$
\rho(\tau) = \frac{\gamma(\tau)}{\gamma(0)} =
\begin{cases}
\dfrac{\theta_1}{1 + \theta_1^2}, & \text{for } \tau = 1,\\[6pt]
0, & \text{for } \tau = 2, 3, \cdots.
\end{cases}
$$

Let x be ρ(1). Then

$$
\frac{\theta_1}{1 + \theta_1^2} = x, \quad\text{i.e.,}\quad x\theta_1^2 - \theta_1 + x = 0.
$$

θ1 should be a real number, so the discriminant has to satisfy

$$
1 - 4x^2 \geq 0, \quad\text{i.e.,}\quad -\frac{1}{2} \leq \rho(1) \leq \frac{1}{2}.
$$
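A sketch that recovers θ1 from a given ρ(1) = x by solving this quadratic and keeping the root with |θ1| ≤ 1 (the value x = 0.4 is illustrative):

```python
# Sketch: recover theta1 from a given rho(1) = x by solving x*theta^2 - theta + x = 0.
# There are two real roots when |x| <= 1/2; the one with |theta1| <= 1 is the
# invertible choice.  The value x = 0.4 is illustrative.
import numpy as np

x = 0.4
disc = 1.0 - 4.0 * x**2                       # discriminant; must be >= 0
roots = np.roots([x, -1.0, x])                # solves x*theta^2 - theta + x = 0
theta1 = roots[np.abs(roots) <= 1.0][0]       # invertible root
print(disc, roots, theta1, theta1 / (1 + theta1**2))   # last value reproduces rho(1) = x
```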
4. Invertibility Condition of MA(1) Process:

$$
\begin{aligned}
\varepsilon_t &= -\theta_1\varepsilon_{t-1} + y_t\\
&= (-\theta_1)^2\varepsilon_{t-2} + y_t + (-\theta_1)y_{t-1}\\
&= (-\theta_1)^3\varepsilon_{t-3} + y_t + (-\theta_1)y_{t-1} + (-\theta_1)^2 y_{t-2}\\
&\;\;\vdots\\
&= (-\theta_1)^s\varepsilon_{t-s} + y_t + (-\theta_1)y_{t-1} + (-\theta_1)^2 y_{t-2} + \cdots + (-\theta_1)^{s-1}y_{t-s+1}
\end{aligned}
$$

When (−θ1)^s εt−s −→ 0 (which holds under |θ1| < 1), the MA(1) model is written as the AR(∞) model, i.e.,

$$
y_t = -(-\theta_1)y_{t-1} - (-\theta_1)^2 y_{t-2} - \cdots + \varepsilon_t.
$$

5. Likelihood Function of MA(1) Process:

The autocovariance functions are: γ(0) = (1 + θ1²)σ², γ(1) = θ1 σ², and γ(τ) = 0 for τ = 2, 3, · · ·.

The joint distribution of y1 , y2 , · · · , yT is:

$$
f(y_1, y_2, \cdots, y_T) = \frac{1}{(2\pi)^{T/2}}\,|V|^{-1/2}\exp\!\left(-\frac{1}{2}\,Y'V^{-1}Y\right)
$$
where

$$
Y = \begin{pmatrix} y_1\\ y_2\\ \vdots\\ y_T\end{pmatrix}, \qquad
V = \sigma^2\begin{pmatrix}
1+\theta_1^2 & \theta_1 & 0 & \cdots & 0\\
\theta_1 & 1+\theta_1^2 & \theta_1 & \ddots & \vdots\\
0 & \theta_1 & \ddots & \ddots & 0\\
\vdots & \ddots & \ddots & 1+\theta_1^2 & \theta_1\\
0 & \cdots & 0 & \theta_1 & 1+\theta_1^2
\end{pmatrix}.
$$
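A sketch that evaluates this exact log-likelihood by building the tridiagonal V directly (θ1, σ², T and the simulated series are illustrative; for long series one would normally exploit the band structure rather than use a dense solve):

```python
# Sketch: evaluate the exact MA(1) log-likelihood log f(y_1, ..., y_T) by building
# the tridiagonal matrix V above (theta1, sigma2 and the data are illustrative).
import numpy as np

rng = np.random.default_rng(4)
theta1, sigma2, T = 0.6, 1.0, 200
eps = np.sqrt(sigma2) * rng.standard_normal(T + 1)
y = eps[1:] + theta1 * eps[:-1]

V = sigma2 * ((1 + theta1**2) * np.eye(T)
              + theta1 * (np.eye(T, k=1) + np.eye(T, k=-1)))
sign, logdet = np.linalg.slogdet(V)
loglik = -0.5 * (T * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(V, y))
print(loglik)
```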
6. MA(1)+drift: yt = µ + εt + θ1 εt−1

Mean of MA(1) Process:

yt = µ + θ(L)εt ,

where θ(L) = 1 + θ1 L.

Taking the expectation,

E(yt ) = µ + θ(L)E(εt ) = µ.
Example: MA(2) Model: yt = εt + θ1 εt−1 + θ2 εt−2

1. Autocovariance Function of MA(2) Process:

$$
\gamma(\tau) =
\begin{cases}
(1 + \theta_1^2 + \theta_2^2)\sigma^2, & \text{for } \tau = 0,\\
(\theta_1 + \theta_1\theta_2)\sigma^2, & \text{for } \tau = 1,\\
\theta_2\sigma^2, & \text{for } \tau = 2,\\
0, & \text{otherwise.}
\end{cases}
$$

2. Let −1/β1 and −1/β2 be the two solutions of x from θ(x) = 0.

For the invertibility condition, both β1 and β2 should be less than one in absolute value.
Then, the MA(2) model is represented as:

yt = εt + θ1 εt−1 + θ2 εt−2

= (1 + θ1 L + θ2 L²)εt

= (1 + β1 L)(1 + β2 L)εt

The AR(∞) representation of the MA(2) model is given by:

$$
\varepsilon_t = \frac{1}{(1+\beta_1 L)(1+\beta_2 L)}\,y_t
= \left(\frac{\beta_1/(\beta_1-\beta_2)}{1+\beta_1 L} + \frac{-\beta_2/(\beta_1-\beta_2)}{1+\beta_2 L}\right)y_t
$$
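A quick numerical check of this partial-fraction expansion (β1 = 0.7, β2 = 0.4 are illustrative, both less than one in absolute value): the series coefficients of 1/((1 + β1 L)(1 + β2 L)) obtained by recursion should match the sum of the two geometric expansions.

```python
# Sketch: verify that the coefficient of L^j in 1/((1+beta1*L)(1+beta2*L)) equals
# beta1/(beta1-beta2)*(-beta1)**j - beta2/(beta1-beta2)*(-beta2)**j,
# as the partial-fraction form above implies (beta values are illustrative).
beta1, beta2 = 0.7, 0.4

# power-series coefficients c_j of 1/(1 + (beta1+beta2)L + beta1*beta2*L^2) by recursion
c = [1.0, -(beta1 + beta2)]
for j in range(2, 10):
    c.append(-(beta1 + beta2) * c[j - 1] - beta1 * beta2 * c[j - 2])

closed = [beta1 / (beta1 - beta2) * (-beta1)**j - beta2 / (beta1 - beta2) * (-beta2)**j
          for j in range(10)]
print(max(abs(a - b) for a, b in zip(c, closed)))   # ~ 0
```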
3. Likelihood Function:

$$
f(y_1, y_2, \cdots, y_T) = \frac{1}{(2\pi)^{T/2}}\,|V|^{-1/2}\exp\!\left(-\frac{1}{2}\,Y'V^{-1}Y\right)
$$

where

$$
Y = \begin{pmatrix} y_1\\ y_2\\ \vdots\\ y_T\end{pmatrix}, \qquad
V = \sigma^2\begin{pmatrix}
1+\theta_1^2+\theta_2^2 & \theta_1+\theta_1\theta_2 & \theta_2 & \cdots & 0\\
\theta_1+\theta_1\theta_2 & 1+\theta_1^2+\theta_2^2 & \theta_1+\theta_1\theta_2 & \ddots & \vdots\\
\theta_2 & \theta_1+\theta_1\theta_2 & \ddots & \ddots & \theta_2\\
\vdots & \ddots & \ddots & 1+\theta_1^2+\theta_2^2 & \theta_1+\theta_1\theta_2\\
0 & \cdots & \theta_2 & \theta_1+\theta_1\theta_2 & 1+\theta_1^2+\theta_2^2
\end{pmatrix}.
$$
4. MA(2)+drift: yt = µ + εt + θ1 εt−1 + θ2 εt−2

Mean:

yt = µ + θ(L)εt ,

where θ(L) = 1 + θ1 L + θ2 L².

Therefore,

E(yt ) = µ + θ(L)E(εt ) = µ
Example: MA(q) Model: yt = εt + θ1 εt−1 + θ2 εt−2 + · · · + θq εt−q

1. Mean of MA(q) Process:

E(yt ) = E(εt + θ1 εt−1 + θ2 εt−2 + · · · + θq εt−q ) = 0

2. Autocovariance Function of MA(q) Process:

$$
\gamma(\tau) =
\begin{cases}
\sigma^2(\theta_0\theta_\tau + \theta_1\theta_{\tau+1} + \cdots + \theta_{q-\tau}\theta_q) = \sigma^2\displaystyle\sum_{i=0}^{q-\tau}\theta_i\theta_{\tau+i}, & \tau = 0, 1, 2, \cdots, q,\\[8pt]
0, & \tau = q+1, q+2, \cdots,
\end{cases}
$$

where θ0 = 1.
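The same formula as a small function (a sketch; the θ values in the example are illustrative and θ0 = 1 is prepended as in the text):

```python
# Sketch: the MA(q) autocovariance formula above as a small function
# (theta values are illustrative; theta0 = 1 is prepended as in the text).
import numpy as np

def ma_autocov(theta, sigma2, tau):
    """gamma(tau) of an MA(q) process with coefficients theta = (theta1, ..., thetaq)."""
    t = np.concatenate(([1.0], np.asarray(theta, dtype=float)))   # (theta0, ..., thetaq)
    q = len(t) - 1
    if tau > q:
        return 0.0
    return sigma2 * np.sum(t[:q - tau + 1] * t[tau:])             # sum_i theta_i * theta_{tau+i}

theta, sigma2 = [0.4, 0.25], 1.0       # an MA(2) example: gamma(0), gamma(1), gamma(2), gamma(3)
print([ma_autocov(theta, sigma2, tau) for tau in range(4)])
```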
3. MA(q) process is stationary.

4. MA(q)+drift: yt = µ + εt + θ1 εt−1 + θ2 εt−2 + · · · + θq εt−q

Mean:

yt = µ + θ(L)εt ,

where θ(L) = 1 + θ1 L + θ2 L² + · · · + θq L^q.

Therefore, we have:

E(yt ) = µ + θ(L)E(εt ) = µ.
