
Autoregressive models

Another useful model is the autoregressive model. Frequently, the values of a series of financial data at particular points in time are highly correlated with the values that precede and succeed them.

Autoregressive models
Models with lagged variables

An autoregressive model creates a new predictor variable by using the Y variable lagged one or more periods:

y_t = f(y_{t-1}, y_{t-2}, \ldots, y_{t-p}, e_t)

The dependent variable is a function of its own values at previous moments of time.

The most often seen form of the equation is linear:

y_t = b_0 + \sum_{i=1}^{p} b_i y_{t-i} + e_t

where:
y_t – the value of the dependent variable at moment t,
y_{t-i} (i = 1, 2, ..., p) – the values of the dependent variable at moment t-i,
b_0, b_i (i = 1, ..., p) – the regression coefficients,
p – the order (rank) of the autoregression,
e_t – the disturbance term.

In matrix notation, the vector of coefficients b, the vector of responses y, and the design matrix X are:

b = \begin{bmatrix} b_0 \\ b_1 \\ \vdots \\ b_p \end{bmatrix}, \qquad
y = \begin{bmatrix} y_{p+1} \\ y_{p+2} \\ \vdots \\ y_n \end{bmatrix}, \qquad
X = \begin{bmatrix}
1 & y_p & y_{p-1} & \cdots & y_1 \\
1 & y_{p+1} & y_p & \cdots & y_2 \\
\vdots & \vdots & \vdots & & \vdots \\
1 & y_{n-1} & y_{n-2} & \cdots & y_{n-p}
\end{bmatrix}
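The normal equations above translate directly into a few lines of code. Below is a minimal sketch in Python/NumPy (the slides themselves work in Excel); the helper name fit_ar_ols is introduced here purely for illustration:

```python
import numpy as np

def fit_ar_ols(y, p):
    """OLS estimate of y_t = b0 + b1*y_{t-1} + ... + bp*y_{t-p} + e_t."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    Y = y[p:]                                   # responses y_{p+1}, ..., y_n
    # Design matrix: a column of ones, then the lags y_{t-1}, ..., y_{t-p}.
    X = np.column_stack([np.ones(n - p)] +
                        [y[p - i:n - i] for i in range(1, p + 1)])
    # Normal equations b = (X'X)^{-1} X'y, solved without forming the inverse.
    b = np.linalg.solve(X.T @ X, X.T @ Y)
    return b, X, Y
```

Applied with p = 2 to the data of Example 1 below, this reproduces the coefficient estimates shown on the later slides.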

A first-order autoregressive model is concerned with only the correlation between consecutive values in a series:

y_t = b_0 + b_1 y_{t-1} + e_t

A second-order autoregressive model considers the relationship between consecutive values in a series as well as the correlation between values two periods apart:

y_t = b_0 + b_1 y_{t-1} + b_2 y_{t-2} + e_t

The selection of an appropriate autoregressive model is not an easy task. Once a model is selected and the OLS method is used to obtain estimates of the parameters, the next step is to eliminate those parameters which do not contribute significantly.

H_0: \beta_p = 0
(the highest-order parameter does not contribute to the prediction of Y_t)

H_1: \beta_p \neq 0
(the highest-order parameter is statistically significant)

The test statistic is

Z = \frac{b_p}{S(b_p)}

Using an \alpha level of significance, the decision rule is to reject H_0 if

Z > Z_\alpha \quad \text{or} \quad Z < -Z_\alpha

and not to reject H_0 if

-Z_\alpha \le Z \le Z_\alpha

Some helpful critical values:

Z_{0.10} = 1.645
Z_{0.05} = 1.960
Z_{0.02} = 2.326
Z_{0.01} = 2.576
Z_{0.001} = 3.291
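As a sketch, the decision rule can be applied as below; the numbers plugged in are the second-order coefficient and its standard error from Example 1 later in the deck:

```python
# Two-sided Z test for the highest-order AR coefficient.
b_p, se_bp = 0.08338, 0.248577      # values taken from Example 1 (order-2 term)
z = b_p / se_bp                     # Z = b_p / S(b_p)  ->  about 0.335
z_crit = 1.960                      # critical value for alpha = 0.05

if abs(z) > z_crit:
    print(f"Z = {z:.3f} > {z_crit}: reject H0, keep the order-p term")
else:
    print(f"Z = {z:.3f} <= {z_crit}: do not reject H0, drop the order-p term")
```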

If the null hypothesis is NOT rejected, we may conclude that the selected model contains too many estimated parameters. The highest-order term is then deleted, and a new autoregressive model is obtained through least-squares regression. The test of the hypothesis that the new highest-order term is 0 is then repeated.

This testing and modeling procedure continues until we reject H_0. When this occurs, we know that our highest-order parameter is significant and we are ready to use this model.

Example 1

t     y_t
1     1.89
2     2.46
3     3.23
4     3.95
5     4.56
6     5.07
7     5.62
8     6.16
9     6.26
10    6.56
11    6.98
12    7.36
13    7.53
14    7.84
15    8.09

Second-order model:

y_t = b_0 + b_1 y_{t-1} + b_2 y_{t-2} + e_t

OLS estimator:

b = (X^T X)^{-1} X^T y

p = 2,  \bar{y} = 5.570667,  n = 13,  k = 2

t     y_t     y_{t-1}   y_{t-2}
1     1.89    –         –
2     2.46    1.89      –
3     3.23    2.46      1.89
4     3.95    3.23      2.46
5     4.56    3.95      3.23
6     5.07    4.56      3.95
7     5.62    5.07      4.56
8     6.16    5.62      5.07
9     6.26    6.16      5.62
10    6.56    6.26      6.16
11    6.98    6.56      6.26
12    7.36    6.98      6.56
13    7.53    7.36      6.98
14    7.84    7.53      7.36
15    8.09    7.84      7.53

Calculations: p = 2,  \bar{y} = 5.570667,  n = 13,  k = 2

The OLS ingredients for the second-order model are:

b = \begin{bmatrix} b_0 \\ b_1 \\ b_2 \end{bmatrix}, \qquad
y = \begin{bmatrix} 3.23 \\ 3.95 \\ 4.56 \\ 5.07 \\ 5.62 \\ 6.16 \\ 6.26 \\ 6.56 \\ 6.98 \\ 7.36 \\ 7.53 \\ 7.84 \\ 8.09 \end{bmatrix}, \qquad
X = \begin{bmatrix}
1 & 2.46 & 1.89 \\
1 & 3.23 & 2.46 \\
1 & 3.95 & 3.23 \\
1 & 4.56 & 3.95 \\
1 & 5.07 & 4.56 \\
1 & 5.62 & 5.07 \\
1 & 6.16 & 5.62 \\
1 & 6.26 & 6.16 \\
1 & 6.56 & 6.26 \\
1 & 6.98 & 6.56 \\
1 & 7.36 & 6.98 \\
1 & 7.53 & 7.36 \\
1 & 7.84 & 7.53
\end{bmatrix}

X^T X = \begin{bmatrix} 13 & 73.58 & 67.63 \\ 73.58 & 451.3932 & 420.8423 \\ 67.63 & 420.8423 & 393.5 \end{bmatrix}, \qquad
X^T y = \begin{bmatrix} 79.21 \\ 479.6185 \\ 446.1821 \end{bmatrix}

(X^T X)^{-1} = \begin{bmatrix} 5.523661 & -5.28007 & 4.697623 \\ -5.28007 & 5.811533 & -5.30788 \\ 4.697623 & -5.30788 & 4.87187 \end{bmatrix}

b = (X^T X)^{-1} X^T y = \begin{bmatrix} 1.103369 \\ 0.804936 \\ 0.08338 \end{bmatrix}

so the estimated model is

\hat{y}_t = 1.1 + 0.8\, y_{t-1} + 0.08\, y_{t-2}
           (0.26)   (0.27)        (0.25)

(standard errors of the coefficients in parentheses).
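These hand calculations can be checked numerically. A small sketch, reusing the fit_ar_ols helper defined in the earlier code snippet:

```python
y = [1.89, 2.46, 3.23, 3.95, 4.56, 5.07, 5.62, 6.16,
     6.26, 6.56, 6.98, 7.36, 7.53, 7.84, 8.09]

b, X, Y = fit_ar_ols(y, p=2)        # second-order autoregressive fit
print(b)                            # approx. [1.1034, 0.8049, 0.0834], as on the slide
```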

t     y_t    y_{t-1}  y_{t-2}  ŷ_t        y_t - ŷ_t   (y_t - ŷ_t)²   y_t - ȳ     (y_t - ȳ)²
1     1.89   –        –        –          –           –              –           –
2     2.46   1.89     –        –          –           –              –           –
3     3.23   2.46     1.89     3.2411     -0.0111     0.000123       -2.34067    5.47872
4     3.95   3.23     2.46     3.908428   0.041572    0.001728       -1.62067    2.62656
5     4.56   3.95     3.23     4.552185   0.007815    6.11E-05       -1.01067    1.021447
6     5.07   4.56     3.95     5.103229   -0.03323    0.001104       -0.50067    0.250667
7     5.62   5.07     4.56     5.564609   0.055391    0.003068       0.049333    0.002434
8     6.16   5.62     5.07     6.049848   0.110152    0.012134       0.589333    0.347314
9     6.26   6.16     5.62     6.530372   -0.27037    0.073101       0.689333    0.47518
10    6.56   6.26     6.16     6.655891   -0.09589    0.009195       0.989333    0.97878
11    6.98   6.56     6.26     6.90571    0.07429     0.005519       1.409333    1.98622
12    7.36   6.98     6.56     7.268797   0.091203    0.008318       1.789333    3.201714
13    7.53   7.36     6.98     7.609693   -0.07969    0.006351       1.959333    3.838987
14    7.84   7.53     7.36     7.778216   0.061784    0.003817       2.269333    5.149874
15    8.09   7.84     7.53     8.041921   0.048079    0.002312       2.519333    6.34704

Sums:  Σ(y_t - ŷ_t)² = 0.126831,   Σ(y_t - ȳ)² = 31.70494   (with ȳ = 5.570667)
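Each fitted value in the table is just the estimated equation evaluated at the two lags; for instance, for t = 3:

```python
b0, b1, b2 = 1.103369, 0.804936, 0.08338   # estimates from the previous slide
y_lag1, y_lag2 = 2.46, 1.89                # y_{t-1} and y_{t-2} for t = 3
y3_hat = b0 + b1 * y_lag1 + b2 * y_lag2
print(round(y3_hat, 4))                    # 3.2411, matching the table row for t = 3
```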

Goodness of fit

Variance: S² = 0.012683
Standard error of the estimate: S = 0.112619

Variance-covariance matrix of the coefficients:

D^2(b) = \begin{bmatrix} 0.070057 & -0.06697 & 0.059581 \\ -0.06697 & 0.073708 & -0.06732 \\ 0.059581 & -0.06732 & 0.061791 \end{bmatrix}

Standard errors of the coefficients:

D(b_0) = 0.264684
D(b_1) = 0.271493
D(b_2) = 0.248577

b = \begin{bmatrix} 1.103369 \\ 0.804936 \\ 0.08338 \end{bmatrix}

Indetermination coefficient = 0.004
Determination coefficient: R² = 0.996
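The goodness-of-fit quantities follow from the residuals: S² = Σe²/(n-k-1), D²(b) = S²(XᵀX)⁻¹, and R² = 1 - Σe²/Σ(y_t - ȳ)². A sketch, continuing from the earlier fit_ar_ols call (X, Y and b as computed there, y the full series):

```python
import numpy as np

# Goodness of fit for the AR(2) fit of Example 1.
e = Y - X @ b                              # residuals y_t - y_hat_t
n_used, k = X.shape[0], X.shape[1] - 1     # 13 usable rows, k = 2 lags

s2 = e @ e / (n_used - k - 1)              # variance S^2            ~ 0.01268
s = np.sqrt(s2)                            # std. error of estimate  ~ 0.1126
cov_b = s2 * np.linalg.inv(X.T @ X)        # variance-covariance matrix D^2(b)
se_b = np.sqrt(np.diag(cov_b))             # standard errors of b0, b1, b2

ybar = np.mean(y)                          # mean of the whole series, as on the slide
r2 = 1 - (e @ e) / np.sum((Y - ybar) ** 2) # determination coefficient ~ 0.996
```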

Calculations:

Z_{b_2} = 0.33543,   Z_{0.05} = 1.96

Since |Z_{b_2}| < Z_{0.05}, the second-order parameter does not contribute to the prediction of Y.

We therefore estimate the parameters of the first-order autoregressive model:

y_t = b_0 + b_1 y_{t-1} + e_t

and then check whether β_1 is statistically significant.

Using the same series with a single lagged predictor y_{t-1} (the y_t and y_{t-1} columns of the lag table above, t = 2, ..., 15), Excel's LINEST (REGLINP) returns:

b_1 = 0.914        b_0 = 0.904
S(b_1) = 0.0173    S(b_0) = 0.09850
R² = 99.573%       S = 0.120
F = 2800.6         df = 12
SS_reg = 40.241    SS_resid = 0.172

Z_{b_1} = 52.921,   Z_{0.05} = 1.96

Since |Z_{b_1}| > Z_{0.05}, the first-order parameter contributes to the prediction of Y.
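The same numbers can be reproduced without Excel; a sketch that refits the first-order model with the fit_ar_ols helper and the Example 1 series y from the earlier snippets, and forms the Z statistic for b_1:

```python
import numpy as np

b, X, Y = fit_ar_ols(y, p=1)               # first-order fit on the Example 1 data
e = Y - X @ b
s2 = e @ e / (len(Y) - 2)                  # n - k - 1 with k = 1
se_b = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

z_b1 = b[1] / se_b[1]                      # roughly 52.9, far above Z_0.05 = 1.96
```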

Example 2
Y - annual income taxes

Year   Y_t     Y_{t-1}   Y_{t-2}   Y_{t-3}
1      55.4    –         –         –
2      61.5    55.4      –         –
3      68.7    61.5      55.4      –
4      87.2    68.7      61.5      55.4
5      90.4    87.2      68.7      61.5
6      86.2    90.4      87.2      68.7
7      94.7    86.2      90.4      87.2
8      103.2   94.7      86.2      90.4
9      119     103.2     94.7      86.2
10     122.4   119       103.2     94.7
11     131.6   122.4     119       103.2
12     157.6   131.6     122.4     119
13     181     157.6     131.6     122.4
14     217.8   181       157.6     131.6
15     244.1   217.8     181       157.6

Y - annual income taxes (same data as in the table above)

Third-order autoregressive model (LINEST output):

                 b_3        b_2        b_1        b_0
coefficient      0.2903     -0.1987    1.1541     -11.0438
standard error   0.4485     0.5982     0.3569     10.7919
R² = 0.9753      S = 9.7932

Z_{b_3} = 0.647227,   Z_{0.05} = 1.96

Since |Z_{b_3}| < Z_{0.05}, the third-order parameter does not contribute to the prediction of Y.

Y - annual income taxes (same data as in the table above)

Second-order autoregressive model (LINEST output):

                 b_2        b_1        b_0
coefficient      0.0220     1.1616     -7.1550
standard error   0.4000     0.3254     8.3927
R² = 0.9767      S = 9.0609

Z_{b_2} = 0.054917,   Z_{0.05} = 1.96

Since |Z_{b_2}| < Z_{0.05}, the second-order parameter does not contribute to the prediction of Y.

Y - annual income taxes (same data as in the table above)

First-order autoregressive model (LINEST output):

                 b_1        b_0
coefficient      1.1729     -5.9924
standard error   0.0494     5.9894
R² = 0.9792      S = 8.3118

Z_{b_1} = 23.74814,   Z_{0.05} = 1.96

Since |Z_{b_1}| > Z_{0.05}, the first-order parameter does contribute to the prediction of Y, so the first-order model is selected.

Autoregressive Modeling

Used for forecasting.
Takes advantage of autocorrelation:
1st order - correlation between consecutive values
2nd order - correlation between values 2 periods apart

Autoregressive model for pth order:

Y_i = b_0 + b_1 Y_{i-1} + b_2 Y_{i-2} + \ldots + b_p Y_{i-p} + e_i

where e_i is the random error term.

Autoregressive Modeling Steps

1. Choose p, the order of the model.
2. Form a series of lagged predictor variables Y_{i-1}, Y_{i-2}, ..., Y_{i-p}.
3. Use Excel to run the regression model using all p lagged variables.
4. Test the significance of B_p:
   If the null hypothesis is rejected, this model is selected (see the code sketch below).
   If the null hypothesis is not rejected, decrease p by 1 and repeat your calculations.
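These steps amount to a small backward step-down loop. A sketch under the assumption that the fit_ar_ols helper from the earlier snippet is available; the data are the annual income taxes of Example 2, and select_ar_order is a name introduced here only for illustration:

```python
import numpy as np

def select_ar_order(y, p_start, z_crit=1.96):
    """Fit AR(p), test the highest-order coefficient, and decrease p
    until that coefficient is significant (steps 1-4 above)."""
    y = np.asarray(y, dtype=float)
    for p in range(p_start, 0, -1):
        b, X, Y = fit_ar_ols(y, p)
        e = Y - X @ b
        s2 = e @ e / (len(Y) - p - 1)                 # residual variance
        se_b = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
        z = b[p] / se_b[p]                            # Z for the highest-order term
        if abs(z) > z_crit:
            return p, b                               # null hypothesis rejected
    return 0, None                                    # no AR term is significant

# Example 2: annual income taxes, starting from a third-order model.
taxes = [55.4, 61.5, 68.7, 87.2, 90.4, 86.2, 94.7, 103.2,
         119, 122.4, 131.6, 157.6, 181, 217.8, 244.1]
p, b = select_ar_order(taxes, p_start=3)              # ends with p = 1, as on the slides
```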
