
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data: Stationary Variables
Walter R. Paczkowski
Rutgers University

Chapter Contents
9.1 Introduction
9.2 Finite Distributed Lags
9.3 Serial Correlation
9.4 Other Tests for Serially Correlated Errors
9.5 Estimation with Serially Correlated Errors
9.6 Autoregressive Distributed Lag Models
9.7 Forecasting
9.8 Multiplier Analysis
9.1 Introduction
When modeling relationships between variables,
the nature of the data that have been collected has
an important bearing on the appropriate choice of
an econometric model
Two features of time-series data to consider:
1. Time-series observations on a given
economic unit, observed over a number of
time periods, are likely to be correlated
2. Time-series data have a natural ordering
according to time

There is also the possible existence of dynamic
relationships between variables
A dynamic relationship is one in which the
change in a variable now has an impact on that
same variable, or other variables, in one or
more future time periods
These effects do not occur instantaneously but
are spread, or distributed, over future time
periods
FIGURE 9.1 The distributed lag effect
9.1.1 Dynamic Nature of Relationships
Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a function of current and past values of an explanatory variable x:

y_t = f(x_t, x_{t-1}, x_{t-2}, ...)    (Eq. 9.1)

Because of the existence of these lagged effects, Eq. 9.1 is called a distributed lag model.
Ways to model the dynamic relationship (continued):
2. Capture the dynamic characteristics of the time series by specifying a model with a lagged dependent variable as one of the explanatory variables:

y_t = f(y_{t-1}, x_t)    (Eq. 9.2)

Or have:

y_t = f(y_{t-1}, x_t, x_{t-1}, x_{t-2})    (Eq. 9.3)

Such models are called autoregressive distributed lag (ARDL) models, with "autoregressive" meaning a regression of y_t on its own lag or lags.
Ways to model the dynamic relationship (continued):
3. Model the continuing impact of change over several periods via the error term:

y_t = f(x_t) + e_t,  e_t = f(e_{t-1})    (Eq. 9.4)

In this case e_t is correlated with e_{t-1}.
We say the errors are serially correlated or autocorrelated.
9.1.2 Least Squares Assumptions
The primary assumption is Assumption MR4:

cov(y_i, y_j) = cov(e_i, e_j) = 0 for i ≠ j

For time series, this is written as:

cov(y_t, y_s) = cov(e_t, e_s) = 0 for t ≠ s

The dynamic models in Eqs. 9.2, 9.3 and 9.4 imply correlation between y_t and y_{t-1}, or e_t and e_{t-1}, or both, so they clearly violate Assumption MR4.



9.1.2a Stationarity
A stationary variable is one that is not explosive, does not trend, and does not wander aimlessly without returning to its mean.



FIGURE 9.2 (a) Time series of a stationary variable



FIGURE 9.2 (b) Time series of a nonstationary variable that is slow-turning or wandering



FIGURE 9.2 (c) Time series of a nonstationary variable that trends



9.1.3 Alternative Paths Through the Chapter
FIGURE 9.3 (a) Alternative paths through the chapter starting with finite distributed lags



FIGURE 9.3 (b) Alternative paths through the chapter starting with serial correlation
9.2 Finite Distributed Lags

Consider a linear model in which, after q time periods, changes in x no longer have an impact on y:

y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + ... + β_q x_{t-q} + e_t    (Eq. 9.5)

Note the notation change: β_s is used to denote the coefficient of x_{t-s}, and α is introduced to denote the intercept.

Model 9.5 has two uses:
Forecasting:

y_{T+1} = α + β_0 x_{T+1} + β_1 x_T + β_2 x_{T-1} + ... + β_q x_{T-q+1} + e_{T+1}    (Eq. 9.6)

Policy analysis: what is the effect of a change in x on y?

∂E(y_t)/∂x_{t-s} = ∂E(y_{t+s})/∂x_t = β_s    (Eq. 9.7)
Assume x_t is increased by one unit and then maintained at its new level in subsequent periods.
The immediate impact will be β_0; the total effect in period t + 1 will be β_0 + β_1, in period t + 2 it will be β_0 + β_1 + β_2, and so on.
These quantities are called interim multipliers.
The total multiplier is the final effect on y of the sustained increase after q or more periods have elapsed:

Σ_{s=0}^{q} β_s
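The delay and interim multipliers above can be checked with a few lines of code; a minimal Python sketch (the weight values below are hypothetical, not estimates from the chapter):

```python
def interim_multipliers(betas):
    """Running sums of the distributed-lag weights: element s is the total
    effect s periods after a sustained one-unit increase in x."""
    out, total = [], 0.0
    for b in betas:
        total += b
        out.append(total)
    return out

betas = [2.0, 3.0, 1.0]            # hypothetical delay multipliers b0, b1, b2
print(interim_multipliers(betas))  # the last element is the total multiplier
```

The final entry of the returned list equals the sum of all weights, which is the total multiplier.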
The effect of a one-unit change in x_t is distributed over the current and next q periods, from which we get the term distributed lag model.
It is called a finite distributed lag model of order q: it is assumed that after a finite number of periods q, changes in x no longer have an impact on y.
The coefficient β_s is called a distributed-lag weight or an s-period delay multiplier.
The coefficient β_0 (s = 0) is called the impact multiplier.
9.2.1 Assumptions
ASSUMPTIONS OF THE DISTRIBUTED LAG MODEL
TSMR1. y_t = α + β_0 x_t + β_1 x_{t-1} + β_2 x_{t-2} + ... + β_q x_{t-q} + e_t,  t = q + 1, ..., T
TSMR2. y and x are stationary random variables, and e_t is independent of current, past and future values of x.
TSMR3. E(e_t) = 0
TSMR4. var(e_t) = σ²
TSMR5. cov(e_t, e_s) = 0 for t ≠ s
TSMR6. e_t ~ N(0, σ²)
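Building the regressor rows for the model in TSMR1 is just a matter of lagging x, which also shows why the first q observations are lost; a small Python sketch (the function name and toy data are illustrative, not from the text):

```python
def lag_matrix(x, q):
    """Rows of regressors [x_t, x_{t-1}, ..., x_{t-q}] for the finite
    distributed lag model (Eq. 9.5); the first q observations are lost,
    so estimation uses t = q+1, ..., T."""
    return [[x[t - s] for s in range(q + 1)] for t in range(q, len(x))]

print(lag_matrix([10, 20, 30, 40], q=2))  # two usable rows from four observations
```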
9.2.2 An Example: Okun's Law
Consider Okun's Law. In this model the change in the unemployment rate from one period to the next depends on the rate of growth of output in the economy:

U_t - U_{t-1} = -γ(G_t - G_N)    (Eq. 9.8)

We can rewrite this as:

DU_t = α + β_0 G_t + e_t    (Eq. 9.9)

where DU_t = ΔU_t = U_t - U_{t-1}, β_0 = -γ, and α = γG_N.

We can expand this to include lags:

DU_t = α + β_0 G_t + β_1 G_{t-1} + β_2 G_{t-2} + ... + β_q G_{t-q} + e_t    (Eq. 9.10)

We can calculate the growth in output, G, as:

G_t = 100 × (GDP_t - GDP_{t-1}) / GDP_{t-1}    (Eq. 9.11)
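Eq. 9.11 translates directly into code; a minimal Python sketch (the GDP numbers are made up for illustration, not the U.S. series):

```python
def growth_rate(gdp):
    """Percentage growth G_t = 100 * (GDP_t - GDP_{t-1}) / GDP_{t-1} (Eq. 9.11).
    Returns one fewer value than the input, since the first period has no lag."""
    return [100.0 * (gdp[t] - gdp[t - 1]) / gdp[t - 1] for t in range(1, len(gdp))]

print(growth_rate([100.0, 102.0, 102.0]))  # [2.0, 0.0]
```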
FIGURE 9.4 (a) Time series for the change in the U.S. unemployment rate: 1985Q3 to 2009Q3
FIGURE 9.4 (b) Time series for U.S. GDP growth: 1985Q2 to 2009Q3
Table 9.1 Spreadsheet of Observations for Distributed Lag Model
Table 9.2 Estimates for Okun's Law Finite Distributed Lag Model
9.3 Serial Correlation


When is assumption TSMR5, cov(e_t, e_s) = 0 for t ≠ s, likely to be violated, and how do we assess its validity?
When a variable exhibits correlation over time, we say it is autocorrelated or serially correlated; these terms are used interchangeably.


9.3.1 Serial Correlation in Output Growth
FIGURE 9.5 Scatter diagram for G_t and G_{t-1}



9.3.1a Computing Autocorrelation
Recall that the population correlation between two variables x and y is given by:

ρ_xy = cov(x, y) / (√var(x) √var(y))
For the Okun's Law problem, we have:

ρ_1 = cov(G_t, G_{t-1}) / (√var(G_t) √var(G_{t-1})) = cov(G_t, G_{t-1}) / var(G_t)    (Eq. 9.12)

The notation ρ_1 is used to denote the population correlation between observations that are one period apart in time, known also as the population autocorrelation of order one.
The second equality in Eq. 9.12 holds because var(G_t) = var(G_{t-1}), a property of time series that are stationary.


The first-order sample autocorrelation for G is obtained from Eq. 9.12 using the estimates:

sample cov(G_t, G_{t-1}) = (1/(T-1)) Σ_{t=2}^{T} (G_t - Ḡ)(G_{t-1} - Ḡ)
sample var(G_t) = (1/(T-1)) Σ_{t=1}^{T} (G_t - Ḡ)²


Making the substitutions, we get:

r_1 = Σ_{t=2}^{T} (G_t - Ḡ)(G_{t-1} - Ḡ) / Σ_{t=1}^{T} (G_t - Ḡ)²    (Eq. 9.13)


More generally, the k-th order sample autocorrelation for a series y, which gives the correlation between observations that are k periods apart, is:

r_k = Σ_{t=k+1}^{T} (y_t - ȳ)(y_{t-k} - ȳ) / Σ_{t=1}^{T} (y_t - ȳ)²    (Eq. 9.14)
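Eq. 9.14 can be computed with a short function; a pure-Python sketch (the toy series is illustrative, not the G data):

```python
def sample_autocorr(y, k):
    """k-th order sample autocorrelation r_k (Eq. 9.14): the numerator sums
    over t = k+1, ..., T while the denominator sums over all T terms."""
    T = len(y)
    ybar = sum(y) / T
    num = sum((y[t] - ybar) * (y[t - k] - ybar) for t in range(k, T))
    den = sum((v - ybar) ** 2 for v in y)
    return num / den

print(sample_autocorr([1.0, 2.0, 3.0, 2.0, 1.0], 1))
```

For k = 0 the numerator equals the denominator, so the function returns exactly 1.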

Because (T - k) observations are used to compute the numerator and T observations are used to compute the denominator, an alternative that leads to larger estimates in finite samples is:

r′_k = [(1/(T-k)) Σ_{t=k+1}^{T} (y_t - ȳ)(y_{t-k} - ȳ)] / [(1/T) Σ_{t=1}^{T} (y_t - ȳ)²]    (Eq. 9.15)



Applying this to our problem, we get for the first four autocorrelations:

r_1 = 0.494,  r_2 = 0.414,  r_3 = 0.154,  r_4 = 0.200    (Eq. 9.16)


How do we test whether an autocorrelation is significantly different from zero?
The null hypothesis is H_0: ρ_k = 0.
A suitable test statistic is:

Z = (r_k - 0) / √(1/T) = √T r_k ≈ N(0, 1)    (Eq. 9.17)
For our problem, we have:

Z_1 = √98 × 0.494 = 4.89,  Z_2 = √98 × 0.414 = 4.10
Z_3 = √98 × 0.154 = 1.52,  Z_4 = √98 × 0.200 = 1.98

We reject the hypotheses H_0: ρ_1 = 0 and H_0: ρ_2 = 0.
We have insufficient evidence to reject H_0: ρ_3 = 0.
ρ_4 is on the borderline of being significant.
We conclude that G, the quarterly growth rate in U.S. GDP, exhibits significant serial correlation at lags one and two.
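The Z statistics above can be reproduced from Eq. 9.17; a small Python check using the reported autocorrelations and T = 98:

```python
import math

def autocorr_z(r_k, T):
    """Z = sqrt(T) * r_k, approximately N(0, 1) under H0: rho_k = 0 (Eq. 9.17)."""
    return math.sqrt(T) * r_k

# Reported autocorrelations of G with T = 98 usable observations:
for r in (0.494, 0.414, 0.154, 0.200):
    z = autocorr_z(r, 98)
    print(round(z, 2), "significant at 5%" if abs(z) > 1.96 else "not significant")
```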


9.3.1b The Correlogram
The correlogram, also called the sample autocorrelation function, is the sequence of autocorrelations r_1, r_2, r_3, ...
It shows the correlation between observations that are one period apart, two periods apart, three periods apart, and so on.



FIGURE 9.6 Correlogram for G



9.3.2 Serially Correlated Errors
The correlogram can also be used to check whether the multiple regression assumption cov(e_t, e_s) = 0 for t ≠ s is violated.

9.3.2a A Phillips Curve
Consider a model for a Phillips curve:

INF_t = INF_t^E - γ(U_t - U_{t-1})    (Eq. 9.18)

If we initially assume that inflationary expectations are constant over time (β_1 = INF_t^E), set β_2 = -γ, and add an error term:

INF_t = β_1 + β_2 DU_t + e_t    (Eq. 9.19)


FIGURE 9.7 (a) Time series for Australian price inflation


FIGURE 9.7 (b) Time series for the quarterly change in the Australian
unemployment rate



To determine if the errors are serially correlated, we compute the least squares residuals:

ê_t = INF_t - b_1 - b_2 DU_t    (Eq. 9.20)


FIGURE 9.8 Correlogram for residuals from least-squares estimated
Phillips curve
The k-th order autocorrelation for the residuals can be written as:

r_k = Σ_{t=k+1}^{T} ê_t ê_{t-k} / Σ_{t=1}^{T} ê_t²    (Eq. 9.21)

The least squares equation is:

INF̂ = 0.7776 - 0.5279 DU    (Eq. 9.22)
(se)   (0.0658)  (0.2294)



The values at the first five lags are:

r_1 = 0.549,  r_2 = 0.456,  r_3 = 0.433,  r_4 = 0.420,  r_5 = 0.339
9.4 Other Tests for Serially Correlated Errors




9.4.1 A Lagrange Multiplier Test
An advantage of this test is that it readily generalizes to a joint test of correlations at more than one lag.


If e_t and e_{t-1} are correlated, then one way to model the relationship between them is to write:

e_t = ρe_{t-1} + v_t    (Eq. 9.23)

We can substitute this into a simple regression equation:

y_t = β_1 + β_2 x_t + ρe_{t-1} + v_t    (Eq. 9.24)


We have one complication: ê_0 is unknown.
Two ways to handle this are:
1. Delete the first observation and use a total of T - 1 observations
2. Set ê_0 = 0 and use all T observations

For the Phillips curve:

(i)  t = 6.219,  F = 38.67,  p-value = 0.000
(ii) t = 6.202,  F = 38.47,  p-value = 0.000

The results are almost identical.
The null hypothesis H_0: ρ = 0 is rejected at all conventional significance levels.
We conclude that the errors are serially correlated.
To derive the relevant auxiliary regression for the autocorrelation LM test, we write the test equation as:

y_t = β_1 + β_2 x_t + ρê_{t-1} + v_t    (Eq. 9.25)

But since we know that y_t = b_1 + b_2 x_t + ê_t, we get:

b_1 + b_2 x_t + ê_t = β_1 + β_2 x_t + ρê_{t-1} + v_t

Rearranging, we get:

ê_t = (β_1 - b_1) + (β_2 - b_2) x_t + ρê_{t-1} + v_t
    = γ_1 + γ_2 x_t + ρê_{t-1} + v_t    (Eq. 9.26)

If H_0: ρ = 0 is true, then LM = T × R² has an approximate χ²(1) distribution, where T and R² are the sample size and goodness-of-fit statistic, respectively, from least squares estimation of Eq. 9.26.
Considering the two alternative ways to handle ê_0:

(iii) LM = (T - 1) × R² = 89 × 0.3102 = 27.61
(iv)  LM = T × R² = 90 × 0.3066 = 27.59

These values are much larger than 3.84, which is the 5% critical value from a χ²(1) distribution.
We reject the null hypothesis of no autocorrelation.
Alternatively, we can reject H_0 by examining the p-value for LM = 27.61, which is 0.000.
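The LM computation can be sketched in a few lines with numpy; this follows option (2), setting ê_0 = 0, and the function name and toy data are illustrative rather than the chapter's series:

```python
import numpy as np

def lm_autocorr(y, x):
    """Sketch of the LM test of Eqs. 9.24-9.26 with e_0 set to 0:
    regress y on (1, x), then regress the residuals on (1, x, lagged
    residuals), and return LM = T * R^2 from the auxiliary regression."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    T = len(y)
    X = np.column_stack([np.ones(T), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    e_lag = np.concatenate([[0.0], e[:-1]])   # option (2): e_0 = 0
    Z = np.column_stack([X, e_lag])
    g = np.linalg.lstsq(Z, e, rcond=None)[0]
    u = e - Z @ g
    r2 = 1.0 - (u @ u) / (e @ e)              # e has mean zero (intercept in X)
    return T * r2
```

Compare the returned value against the χ²(1) 5% critical value 3.84.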


9.4.1a Testing Correlation at Longer Lags
For a four-period lag, we obtain:

(iii) LM = (T - 4) × R² = 86 × 0.3882 = 33.4
(iv)  LM = T × R² = 90 × 0.4075 = 36.7

Because the 5% critical value from a χ²(4) distribution is 9.49, these LM values lead us to conclude that the errors are serially correlated.


9.4.2 The Durbin-Watson Test
This test is used less frequently today because its critical values are not available in all software packages, and one has to examine upper and lower critical bounds instead.
Also, unlike the LM and correlogram tests, its distribution no longer holds when the equation contains a lagged dependent variable.
9.5 Estimation with Serially Correlated Errors

Three estimation procedures are considered:
1. Least squares estimation
2. An estimation procedure that is relevant when the errors are assumed to follow what is known as a first-order autoregressive model:

e_t = ρe_{t-1} + v_t

3. A general estimation strategy for estimating models with serially correlated errors



We will encounter models with a lagged dependent variable, such as:

y_t = δ + θ_1 y_{t-1} + δ_0 x_t + δ_1 x_{t-1} + v_t
ASSUMPTION FOR MODELS WITH A LAGGED DEPENDENT VARIABLE
TSMR2A. In the multiple regression model

y_t = β_1 + β_2 x_{t2} + ... + β_K x_{tK} + v_t

where some of the x_{tk} may be lagged values of y, v_t is uncorrelated with all x_{tk} and their past values.
9.5.1 Least Squares Estimation
Suppose we proceed with least squares estimation without recognizing the existence of serially correlated errors. What are the consequences?
1. The least squares estimator is still a linear unbiased estimator, but it is no longer best.
2. The formulas for the standard errors usually computed for the least squares estimator are no longer correct.
Confidence intervals and hypothesis tests that use these standard errors may be misleading.


It is possible to compute correct standard errors for the least squares estimator: HAC (heteroskedasticity and autocorrelation consistent) standard errors, or Newey-West standard errors.
These are analogous to the heteroskedasticity-consistent standard errors.
Consider the model y_t = β_1 + β_2 x_t + e_t.
The variance of b_2 is:

var(b_2) = Σ_t w_t² var(e_t) + Σ_{t≠s} w_t w_s cov(e_t, e_s)
         = [Σ_t w_t² var(e_t)] [1 + Σ_{t≠s} w_t w_s cov(e_t, e_s) / Σ_t w_t² var(e_t)]    (Eq. 9.27)

where w_t = (x_t - x̄) / Σ_t (x_t - x̄)².
When the errors are not correlated, cov(e_t, e_s) = 0, and the term in square brackets is equal to one.
The resulting expression

var(b_2) = Σ_t w_t² var(e_t)

is the one used to find heteroskedasticity-consistent (HC) standard errors.
When the errors are correlated, the term in square brackets is estimated to obtain HAC standard errors.


If we call the quantity in square brackets g and its estimate ĝ, then the relationship between the two estimated variances is:

var_HAC(b_2) = var_HC(b_2) × ĝ    (Eq. 9.28)


Let's reconsider the Phillips curve model:

INF̂ = 0.7776 - 0.5279 DU    (Eq. 9.29)
       (0.0658)  (0.2294)   (incorrect se)
       (0.1030)  (0.3127)   (HAC se)



The t and p-values for testing H_0: β_2 = 0 are:

t = -0.5279/0.2294 = -2.301,  p = 0.0238  (from LS standard errors)
t = -0.5279/0.3127 = -1.688,  p = 0.0950  (from HAC standard errors)
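The two t-ratios follow from dividing the estimate by each standard error; a quick Python check using the values reported in Eq. 9.29:

```python
b2 = -0.5279                      # slope estimate from Eq. 9.29
se_ls, se_hac = 0.2294, 0.3127    # conventional (incorrect) LS se and HAC se

print(round(b2 / se_ls, 3))       # -2.301
print(round(b2 / se_hac, 3))      # -1.688
```

The larger HAC standard error moves the slope from significant to borderline at the 5% level, which is the point of the slide.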

9.5.2 Estimating an AR(1) Error Model
Return to the Lagrange multiplier test for serially correlated errors, where we used the equation:

e_t = ρe_{t-1} + v_t    (Eq. 9.30)

Assume the v_t are uncorrelated random errors with zero mean and constant variances:

E(v_t) = 0,  var(v_t) = σ_v²,  cov(v_t, v_s) = 0 for t ≠ s    (Eq. 9.31)
Eq. 9.30 describes a first-order autoregressive model, or first-order autoregressive process, for e_t.
The term AR(1) model is used as an abbreviation for first-order autoregressive model.
It is called an autoregressive model because it can be viewed as a regression model, and first-order because the right-hand-side variable is e_t lagged one period.
9.5.2a Properties of an AR(1) Error
We assume that:

-1 < ρ < 1    (Eq. 9.32)

The mean and variance of e_t are:

E(e_t) = 0,  var(e_t) = σ_e² = σ_v² / (1 - ρ²)    (Eq. 9.33)

The covariance term is:

cov(e_t, e_{t-k}) = σ_v² ρ^k / (1 - ρ²),  k > 0    (Eq. 9.34)
The correlation implied by the covariance is:

ρ_k = corr(e_t, e_{t-k})
    = cov(e_t, e_{t-k}) / (√var(e_t) √var(e_{t-k}))
    = cov(e_t, e_{t-k}) / var(e_t)
    = ρ^k    (Eq. 9.35)
Setting k = 1:

ρ_1 = corr(e_t, e_{t-1}) = ρ    (Eq. 9.36)

ρ represents the correlation between two errors that are one period apart.
It is the first-order autocorrelation for e, sometimes simply called the autocorrelation coefficient.
It is the population autocorrelation at lag one for a time series that can be described by an AR(1) model.
r_1 is an estimate for ρ when we assume a series is AR(1).
Each e_t depends on all past values of the errors v_t:

e_t = v_t + ρv_{t-1} + ρ²v_{t-2} + ρ³v_{t-3} + ...    (Eq. 9.37)

For the Phillips curve, we find for the first five lags:

r_1 = 0.549,  r_2 = 0.456,  r_3 = 0.433,  r_4 = 0.420,  r_5 = 0.339

For an AR(1) model, we have:

ρ̂ = r_1 = 0.549


For longer lags, we have:

ρ̂_2 = ρ̂² = 0.549² = 0.301
ρ̂_3 = ρ̂³ = 0.549³ = 0.165
ρ̂_4 = ρ̂⁴ = 0.549⁴ = 0.091
ρ̂_5 = ρ̂⁵ = 0.549⁵ = 0.050
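Because an AR(1) error has ρ_k = ρ^k (Eq. 9.35), the longer-lag values above are just powers of 0.549; a one-loop Python check:

```python
rho = 0.549   # estimate of the first-order autocorrelation, r_1
for k in range(2, 6):
    # implied autocorrelation at lag k under the AR(1) model
    print(k, round(rho ** k, 3))
```

Comparing these implied values with the sample values r_2, ..., r_5 (0.456, 0.433, 0.420, 0.339) gives a rough check of the AR(1) specification.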

9.5.2b Nonlinear Least Squares Estimation
Our model with an AR(1) error is:

y_t = β_1 + β_2 x_t + e_t,  with e_t = ρe_{t-1} + v_t    (Eq. 9.38)

and -1 < ρ < 1.
For the v_t, we have:

E(v_t) = 0,  var(v_t) = σ_v²,  cov(v_t, v_s) = 0 for t ≠ s    (Eq. 9.39)
With the appropriate substitutions, we get:

y_t = β_1 + β_2 x_t + ρe_{t-1} + v_t    (Eq. 9.40)

For the previous period, the error is:

e_{t-1} = y_{t-1} - β_1 - β_2 x_{t-1}    (Eq. 9.41)

Multiplying by ρ:

ρe_{t-1} = ρy_{t-1} - ρβ_1 - ρβ_2 x_{t-1}    (Eq. 9.42)



Substituting, we get:

y_t = β_1(1 - ρ) + β_2 x_t + ρy_{t-1} - ρβ_2 x_{t-1} + v_t    (Eq. 9.43)

The coefficient of x_{t-1} equals -ρβ_2.
Although Eq. 9.43 is a linear function of the variables x_t, y_{t-1} and x_{t-1}, it is not a linear function of the parameters (β_1, β_2, ρ).
The usual linear least squares formulas cannot be obtained by using calculus to find the values of (β_1, β_2, ρ) that minimize S_v.
These are nonlinear least squares estimates.
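One way to see the structure of the problem: for a fixed ρ, Eq. 9.43 is linear in β_1 and β_2, so S_v can be minimized by searching over ρ and solving a linear least squares problem at each grid point (the Hildreth-Lu variant of this idea). This numpy sketch and its data are illustrative, not the textbook's algorithm:

```python
import numpy as np

def ar1_nls(y, x, grid=None):
    """Minimize S_v for Eq. 9.43 by grid search over rho: for fixed rho,
    regress y_t - rho*y_{t-1} on (1 - rho) and x_t - rho*x_{t-1}."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    if grid is None:
        grid = np.linspace(-0.99, 0.99, 199)   # step 0.01 over (-1, 1)
    best = None
    for rho in grid:
        ystar = y[1:] - rho * y[:-1]
        X = np.column_stack([np.full(len(ystar), 1.0 - rho),
                             x[1:] - rho * x[:-1]])
        b = np.linalg.lstsq(X, ystar, rcond=None)[0]
        sse = float(np.sum((ystar - X @ b) ** 2))
        if best is None or sse < best[0]:
            best = (sse, float(rho), b[0], b[1])
    return best   # (S_v, rho, beta1, beta2)
```

A finer grid, or a proper nonlinear optimizer, refines the estimate of ρ; the linear step at each candidate is what makes the search cheap.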
Our Phillips curve model assuming AR(1) errors is:

INF_t = β_1(1 - ρ) + β_2 DU_t + ρINF_{t-1} - ρβ_2 DU_{t-1} + v_t    (Eq. 9.44)

Applying nonlinear least squares and presenting the estimates in terms of the original untransformed model, we have:

INF̂_t = 0.7609 - 0.6944 DU_t,  ê_t = 0.557 ê_{t-1} + v̂_t    (Eq. 9.45)
(se)     (0.1245)  (0.2479)          (0.090)



9.5.2c Generalized Least Squares Estimation
Nonlinear least squares estimation of Eq. 9.43 is equivalent to using an iterative generalized least squares estimator called the Cochrane-Orcutt procedure.
9.5.3 Estimating a More General Model
We have the model:

y_t = β_1(1 - ρ) + β_2 x_t + ρy_{t-1} - ρβ_2 x_{t-1} + v_t    (Eq. 9.46)

Suppose now that we consider the model:

y_t = δ + θ_1 y_{t-1} + δ_0 x_t + δ_1 x_{t-1} + v_t    (Eq. 9.47)

This new notation will be convenient when we discuss a general class of autoregressive distributed lag (ARDL) models; Eq. 9.47 is a member of this class.
Principles of Econometrics, 4t
h
Edition Page 86
Chapter 9: Regression with Time Series Data:
Stationary Variables


Note that Eq. 9.47 is the same as Eq. 9.47 since:


Eq. 9.46 is a restricted version of Eq. 9.47 with
the restriction
1
= -
1

0
imposed


9.5
Estimation with
Serially Correlated
Errors
9.5.3
Estimating a More
General Model
( )
1 0 2 1 2 1
1 = = = = Eq. 9.48
Applying the least squares estimator to Eq. 9.47 using the data for the Phillips curve example yields:

$\widehat{INF}_t = 0.3336 + 0.5593\,INF_{t-1} - 0.6882\,DU_t + 0.3200\,DU_{t-1}$   (Eq. 9.49)
(se)      (0.0899)  (0.0908)          (0.2575)       (0.2499)

The equivalent AR(1) estimates are:

$\hat\delta = \hat\beta_1(1-\hat\rho) = 0.7609\,(1-0.5574) = 0.3368$
$\hat\theta_1 = \hat\rho = 0.5574$
$\hat\delta_0 = \hat\beta_2 = -0.6944$
$\hat\delta_1 = -\hat\rho\hat\beta_2 = -(0.5574)(-0.6944) = 0.3871$

These are similar to our other estimates.

The original economic model for the Phillips curve was:

$INF_t = INF_t^E - \gamma(U_t - U_{t-1})$   (Eq. 9.50)

Re-estimation of the model after omitting $DU_{t-1}$ yields:

$\widehat{INF}_t = 0.3548 + 0.5282\,INF_{t-1} - 0.4909\,DU_t$   (Eq. 9.51)
(se)      (0.0876)  (0.0851)          (0.1921)
In this model inflationary expectations are given by:

$\widehat{INF}_t^E = 0.3548 + 0.5282\,INF_{t-1}$

A 1% rise in the unemployment rate leads to an approximate 0.5% fall in the inflation rate.

9.5.4 Summary of Section 9.5 and Looking Ahead

We have described three ways of overcoming the effect of serially correlated errors:
1. Estimate the model using least squares with HAC standard errors
2. Use nonlinear least squares to estimate the model with a lagged x, a lagged y, and the restriction implied by an AR(1) error specification
3. Use least squares to estimate the model with a lagged x and a lagged y, but without the restriction implied by an AR(1) error specification
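The third option is ordinary least squares applied to Eq. 9.47 with the lagged regressors included. A minimal numpy sketch on simulated data (the parameter values, series, and seed are illustrative, not the book's Phillips curve series):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate y_t = delta + theta1*y_{t-1} + d0*x_t + d1*x_{t-1} + v_t (illustrative values)
T, delta, theta1, d0, d1 = 300, 0.3, 0.5, -0.7, 0.35
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = delta + theta1 * y[t - 1] + d0 * x[t] + d1 * x[t - 1] + rng.normal(scale=0.3)

# Unrestricted OLS with a lagged y and a lagged x; one observation is lost to the lags
X = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
delta_hat, theta1_hat, d0_hat, d1_hat = coef
```

Nothing here imposes the AR(1) restriction $\delta_1 = -\theta_1\delta_0$; that is exactly what distinguishes this estimator from the nonlinear least squares one.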
9.6 Autoregressive Distributed Lag Models

An autoregressive distributed lag (ARDL) model is one that contains both lagged $x_t$'s and lagged $y_t$'s:

$y_t = \delta + \delta_0 x_t + \delta_1 x_{t-1} + \cdots + \delta_q x_{t-q} + \theta_1 y_{t-1} + \cdots + \theta_p y_{t-p} + v_t$   (Eq. 9.52)

Two examples:

ARDL(1,1): $\widehat{INF}_t = 0.3336 + 0.5593\,INF_{t-1} - 0.6882\,DU_t + 0.3200\,DU_{t-1}$
ARDL(1,0): $\widehat{INF}_t = 0.3548 + 0.5282\,INF_{t-1} - 0.4909\,DU_t$

An ARDL model can be transformed into one with only lagged x's which go back into the infinite past:

$y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \beta_2 x_{t-2} + \beta_3 x_{t-3} + \cdots + e_t = \alpha + \sum_{s=0}^{\infty}\beta_s x_{t-s} + e_t$   (Eq. 9.53)

This model is called an infinite distributed lag model.

Four possible criteria for choosing p and q:
1. Has serial correlation in the errors been eliminated?
2. Are the signs and magnitudes of the estimates consistent with our expectations from economic theory?
3. Are the estimates significantly different from zero, particularly those at the longest lags?
4. What values for p and q minimize information criteria such as the AIC and SC?
The Akaike information criterion (AIC) is:

$\mathrm{AIC} = \ln\!\left(\dfrac{SSE}{T}\right) + \dfrac{2K}{T}$   (Eq. 9.54)

where K = p + q + 2.

The Schwarz criterion (SC), also known as the Bayes information criterion (BIC), is:

$\mathrm{SC} = \ln\!\left(\dfrac{SSE}{T}\right) + \dfrac{K\ln(T)}{T}$   (Eq. 9.55)

Because $K\ln(T)/T > 2K/T$ for $T \ge 8$, the SC penalizes additional lags more heavily than does the AIC.
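Both criteria are computed directly from a fitted model's sum of squared errors; a short sketch (the SSE value below is made up for illustration):

```python
import math

def aic(sse, T, K):
    """Akaike information criterion, Eq. 9.54: ln(SSE/T) + 2K/T."""
    return math.log(sse / T) + 2 * K / T

def sc(sse, T, K):
    """Schwarz (Bayes) criterion, Eq. 9.55: ln(SSE/T) + K*ln(T)/T."""
    return math.log(sse / T) + K * math.log(T) / T

# For T >= 8 the SC penalty K*ln(T)/T exceeds the AIC penalty 2K/T,
# so SC favours more parsimonious (shorter-lag) models.
T, K, sse = 90, 3, 28.0
penalty_gap = sc(sse, T, K) - aic(sse, T, K)   # = K*(ln(T) - 2)/T, positive for T >= 8
```

Choosing (p, q) then amounts to looping these functions over candidate lag lengths and keeping the minimizer, as Tables 9.4-9.6 do.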
9.6.1 The Phillips Curve

Consider the previously estimated ARDL(1,0) model:

$\widehat{INF}_t = 0.3548 + 0.5282\,INF_{t-1} - 0.4909\,DU_t$,   obs = 90   (Eq. 9.56)
(se)      (0.0876)  (0.0851)          (0.1921)

FIGURE 9.9 Correlogram for residuals from Phillips curve ARDL(1,0) model

Table 9.3 p-values for LM Test for Autocorrelation

For an ARDL(4,0) version of the model:

$\widehat{INF}_t = 0.1001 + 0.2354\,INF_{t-1} + 0.1213\,INF_{t-2} + 0.1677\,INF_{t-3} + 0.2819\,INF_{t-4} - 0.7902\,DU_t$,   obs = 87   (Eq. 9.57)
(se)      (0.0983)  (0.1016)          (0.1038)          (0.1050)          (0.1014)          (0.1885)

Inflationary expectations are given by:

$\widehat{INF}_t^E = 0.1001 + 0.2354\,INF_{t-1} + 0.1213\,INF_{t-2} + 0.1677\,INF_{t-3} + 0.2819\,INF_{t-4}$

Table 9.4 AIC and SC Values for Phillips Curve ARDL Models
9.6.2 Okun's Law

Recall the model for Okun's Law:

$\widehat{DU}_t = 0.5836 - 0.2020\,G_t - 0.1653\,G_{t-1} - 0.0700\,G_{t-2}$,   obs = 96   (Eq. 9.58)
(se)      (0.0472)  (0.0324)  (0.0335)          (0.0331)

FIGURE 9.10 Correlogram for residuals from Okun's law ARDL(0,2) model

Table 9.5 AIC and SC Values for Okun's Law ARDL Models

Now consider this version:

$\widehat{DU}_t = 0.3780 + 0.3501\,DU_{t-1} - 0.1841\,G_t - 0.0992\,G_{t-1}$,   obs = 96   (Eq. 9.59)
(se)      (0.0578)  (0.0846)           (0.0307)  (0.0368)
9.6.3 Autoregressive Models

An autoregressive model of order p, denoted AR(p), is given by:

$y_t = \delta + \theta_1 y_{t-1} + \theta_2 y_{t-2} + \cdots + \theta_p y_{t-p} + v_t$   (Eq. 9.60)

Consider a model for growth in real GDP:

$\hat{G}_t = 0.4657 + 0.3770\,G_{t-1} + 0.2462\,G_{t-2}$,   obs = 96   (Eq. 9.61)
(se)     (0.1433)  (0.1000)          (0.1029)

FIGURE 9.11 Correlogram for residuals from AR(2) model for GDP growth

Table 9.6 AIC and SC Values for AR Model of Growth in U.S. GDP
9.7 Forecasting

We consider forecasting using three different models:
1. AR model
2. ARDL model
3. Exponential smoothing model

9.7.1 Forecasting with an AR Model

Consider an AR(2) model for real GDP growth:

$G_t = \delta + \theta_1 G_{t-1} + \theta_2 G_{t-2} + v_t$   (Eq. 9.62)

The model to forecast $G_{T+1}$ is:

$G_{T+1} = \delta + \theta_1 G_T + \theta_2 G_{T-1} + v_{T+1}$

The growth values for the two most recent quarters are:
$G_T = G_{2009Q3} = 0.8$
$G_{T-1} = G_{2009Q2} = -0.2$

The forecast for $G_{2009Q4}$ is:

$\hat{G}_{T+1} = \hat\delta + \hat\theta_1 G_T + \hat\theta_2 G_{T-1} = 0.46573 + 0.37700(0.8) + 0.24624(-0.2) = 0.7181$   (Eq. 9.63)
For two quarters ahead, the forecast for $G_{2010Q1}$ is:

$\hat{G}_{T+2} = \hat\delta + \hat\theta_1\hat{G}_{T+1} + \hat\theta_2 G_T = 0.46573 + 0.37700(0.71808) + 0.24624(0.8) = 0.9334$   (Eq. 9.64)

For three periods out, it is:

$\hat{G}_{T+3} = \hat\delta + \hat\theta_1\hat{G}_{T+2} + \hat\theta_2\hat{G}_{T+1} = 0.46573 + 0.37700(0.93343) + 0.24624(0.71808) = 0.9945$   (Eq. 9.65)
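The recursion in Eqs. 9.63-9.65 simply feeds each forecast back in once actual observations run out; a minimal sketch using the estimates above:

```python
def ar2_forecasts(delta, theta1, theta2, g_T, g_Tm1, horizon):
    """Iterate G_{T+j} = delta + theta1*G_{T+j-1} + theta2*G_{T+j-2},
    replacing unknown future values with their own forecasts (Eqs. 9.63-9.65)."""
    history = [g_Tm1, g_T]
    forecasts = []
    for _ in range(horizon):
        g_next = delta + theta1 * history[-1] + theta2 * history[-2]
        forecasts.append(g_next)
        history.append(g_next)
    return forecasts

# Estimates from Eq. 9.61; last two observed quarters G_T = 0.8, G_{T-1} = -0.2
fc = ar2_forecasts(0.46573, 0.37700, 0.24624, 0.8, -0.2, 3)
# fc is approximately [0.7181, 0.9334, 0.9945]
```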
Summarizing our forecasts: real GDP growth rates for 2009Q4, 2010Q1, and 2010Q2 are approximately 0.72%, 0.93%, and 0.99%, respectively.

A 95% interval forecast for j periods into the future is given by:

$\hat{G}_{T+j} \pm t_{(0.975,\,df)}\,\hat\sigma_j$

where $\hat\sigma_j$ is the standard error of the forecast error and df is the number of degrees of freedom in the estimation of the AR model.

The first forecast error, occurring at time T+1, is:

$u_{T+1} = G_{T+1} - \hat{G}_{T+1} = (\delta - \hat\delta) + (\theta_1 - \hat\theta_1)G_T + (\theta_2 - \hat\theta_2)G_{T-1} + v_{T+1}$

Ignoring the error from estimating the coefficients, we get:

$u_{T+1} = v_{T+1}$   (Eq. 9.66)

The forecast error for two periods ahead is:

$u_{T+2} = \theta_1(G_{T+1} - \hat{G}_{T+1}) + v_{T+2} = \theta_1 u_{T+1} + v_{T+2} = \theta_1 v_{T+1} + v_{T+2}$   (Eq. 9.67)

The forecast error for three periods ahead is:

$u_{T+3} = \theta_1 u_{T+2} + \theta_2 u_{T+1} + v_{T+3} = (\theta_1^2 + \theta_2)v_{T+1} + \theta_1 v_{T+2} + v_{T+3}$   (Eq. 9.68)
Because the $v_t$'s are uncorrelated with constant variance $\sigma_v^2$, we can show that:

$\sigma_1^2 = \mathrm{var}(u_{T+1}) = \sigma_v^2$
$\sigma_2^2 = \mathrm{var}(u_{T+2}) = \sigma_v^2(1 + \theta_1^2)$
$\sigma_3^2 = \mathrm{var}(u_{T+3}) = \sigma_v^2\bigl(1 + \theta_1^2 + (\theta_1^2 + \theta_2)^2\bigr)$
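The forecast standard errors accumulate the squared weights on past $v$'s from Eqs. 9.66-9.68. A sketch for horizons 1-3; the values of $\sigma_v$ and the t critical value below are assumptions for illustration, not numbers taken from the text:

```python
import math

def forecast_se(sigma_v, theta1, theta2, horizon):
    """Standard errors of 1-, 2-, and 3-step AR(2) forecast errors.
    psi[j] is the weight on v_{T+h-j} in u_{T+h} (Eqs. 9.66-9.68)."""
    psi = [1.0, theta1, theta1 ** 2 + theta2][:horizon]
    ses, cum = [], 0.0
    for w in psi:
        cum += w ** 2
        ses.append(sigma_v * math.sqrt(cum))
    return ses

# theta estimates from Eq. 9.61; sigma_v and t_crit are illustrative assumptions
sigma_v, t_crit = 0.55, 1.986
ses = forecast_se(sigma_v, 0.37700, 0.24624, 3)
intervals = [(f - t_crit * s, f + t_crit * s)
             for f, s in zip([0.7181, 0.9334, 0.9945], ses)]
```

As expected, the standard errors widen with the horizon, so the interval forecasts in Table 9.7 fan out.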
Table 9.7 Forecasts and Forecast Intervals for GDP Growth

9.7.2 Forecasting with an ARDL Model

Consider forecasting future unemployment using the Okun's Law ARDL(1,1):

$DU_t = \delta + \theta_1 DU_{t-1} + \delta_0 G_t + \delta_1 G_{t-1} + v_t$   (Eq. 9.69)

The value of DU in the first post-sample quarter is:

$DU_{T+1} = \delta + \theta_1 DU_T + \delta_0 G_{T+1} + \delta_1 G_T + v_{T+1}$   (Eq. 9.70)

But we need a value for $G_{T+1}$.

Now consider the change in unemployment. Rewrite Eq. 9.70 as:

$U_{T+1} - U_T = \delta + \theta_1(U_T - U_{T-1}) + \delta_0 G_{T+1} + \delta_1 G_T + v_{T+1}$   (Eq. 9.71)

Rearranging:

$U_{T+1} = \delta + (1 + \theta_1)U_T - \theta_1 U_{T-1} + \delta_0 G_{T+1} + \delta_1 G_T + v_{T+1}$
$U_{T+1} = \delta + \theta_1^* U_T + \theta_2^* U_{T-1} + \delta_0 G_{T+1} + \delta_1 G_T + v_{T+1}$

where $\theta_1^* = 1 + \theta_1$ and $\theta_2^* = -\theta_1$.

For the purpose of computing point and interval forecasts, the ARDL(1,1) model for a change in unemployment can be written as an ARDL(2,1) model for the level of unemployment.
This result holds not only for ARDL models where a dependent variable is measured in terms of a change or difference, but also for pure AR models involving such variables.
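The change-to-level conversion can be checked numerically. The coefficient and unemployment values below are illustrative, not estimates or data from the text:

```python
def du_model(delta, theta1, d0, d1, du_prev, g_next, g_now):
    """ARDL(1,1) for the change DU_{T+1}, Eq. 9.70 (error term omitted)."""
    return delta + theta1 * du_prev + d0 * g_next + d1 * g_now

def u_model(delta, theta1, d0, d1, u_now, u_prev, g_next, g_now):
    """Equivalent ARDL(2,1) for the level U_{T+1}, with
    theta1* = 1 + theta1 and theta2* = -theta1 (Eq. 9.71 rearranged)."""
    return delta + (1 + theta1) * u_now - theta1 * u_prev + d0 * g_next + d1 * g_now

# Illustrative values
delta, theta1, d0, d1 = 0.38, 0.35, -0.18, -0.10
u_prev, u_now, g_now, g_next = 9.6, 9.8, 0.8, 1.0

du_fc = du_model(delta, theta1, d0, d1, u_now - u_prev, g_next, g_now)
u_fc = u_model(delta, theta1, d0, d1, u_now, u_prev, g_next, g_now)
```

The level forecast equals the current level plus the forecast change, confirming the two forms are the same model.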
9.7.3 Exponential Smoothing

Another popular model used for predicting the future value of a variable on the basis of its history is the exponential smoothing method.
Like forecasting with an AR model, forecasting using exponential smoothing does not utilize information from any other variable.

One possible forecasting method is to use the average of past information, such as:

$\hat{y}_{T+1} = \dfrac{y_T + y_{T-1} + y_{T-2}}{3}$

This forecasting rule is an example of a simple (equally-weighted) moving average model with k = 3.

Now consider a form in which the weights decline exponentially as the observations get older:

$\hat{y}_{T+1} = \alpha y_T + \alpha(1-\alpha)y_{T-1} + \alpha(1-\alpha)^2 y_{T-2} + \cdots$   (Eq. 9.72)

We assume that $0 < \alpha < 1$. Also, it can be shown that:

$\sum_{s=0}^{\infty} \alpha(1-\alpha)^s = 1$
For forecasting, recognize that:

$\hat{y}_T = \alpha y_{T-1} + \alpha(1-\alpha)y_{T-2} + \alpha(1-\alpha)^2 y_{T-3} + \cdots$   (Eq. 9.73)

We can simplify to:

$\hat{y}_{T+1} = \alpha y_T + (1-\alpha)\hat{y}_T$   (Eq. 9.74)

The value of $\alpha$ can reflect one's judgment about the relative weight of current information.
It can be estimated from historical information by obtaining within-sample forecasts:

$\hat{y}_t = \alpha y_{t-1} + (1-\alpha)\hat{y}_{t-1}, \quad t = 2, 3, \ldots, T$   (Eq. 9.75)

choosing the $\alpha$ that minimizes the sum of squares of the one-step forecast errors:

$\hat{v}_t = y_t - \hat{y}_t = y_t - \bigl(\alpha y_{t-1} + (1-\alpha)\hat{y}_{t-1}\bigr)$   (Eq. 9.76)
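A minimal implementation of the recursion in Eq. 9.75, with the smoothing weight chosen by grid search over the within-sample squared errors of Eq. 9.76. The series is simulated, and seeding the recursion with $\hat{y}_1 = y_1$ is one common convention — an assumption here, not the text's:

```python
import numpy as np

def smooth(y, alpha):
    """One-step forecasts yhat_t = alpha*y_{t-1} + (1-alpha)*yhat_{t-1},
    Eq. 9.75, seeded with yhat_1 = y_1."""
    yhat = np.empty_like(y, dtype=float)
    yhat[0] = y[0]
    for t in range(1, len(y)):
        yhat[t] = alpha * y[t - 1] + (1 - alpha) * yhat[t - 1]
    return yhat

def sse(y, alpha):
    """Sum of squared one-step forecast errors, Eq. 9.76."""
    v = y[1:] - smooth(y, alpha)[1:]
    return float(v @ v)

rng = np.random.default_rng(2)
y = np.cumsum(rng.normal(size=200)) * 0.1 + rng.normal(size=200)  # illustrative series

alphas = np.linspace(0.01, 0.99, 99)
best_alpha = alphas[int(np.argmin([sse(y, a) for a in alphas]))]
forecast = best_alpha * y[-1] + (1 - best_alpha) * smooth(y, best_alpha)[-1]  # Eq. 9.74
```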
FIGURE 9.12 (a) Exponentially smoothed forecasts for GDP growth with $\alpha = 0.38$

FIGURE 9.12 (b) Exponentially smoothed forecasts for GDP growth with $\alpha = 0.8$

The forecasts for 2009Q4 from each value of $\alpha$ are:

$\alpha = 0.38$: $\hat{G}_{T+1} = \alpha G_T + (1-\alpha)\hat{G}_T = 0.38(0.8) + (1-0.38)(-0.403921) = 0.0536$
$\alpha = 0.8$: $\hat{G}_{T+1} = \alpha G_T + (1-\alpha)\hat{G}_T = 0.8(0.8) + (1-0.8)(-0.393578) = 0.5613$
9.8 Multiplier Analysis

Multiplier analysis refers to the effect, and the timing of the effect, of a change in one variable on the outcome of another variable.

Let's find multipliers for an ARDL model of the form:

$y_t = \delta + \theta_1 y_{t-1} + \cdots + \theta_p y_{t-p} + \delta_0 x_t + \delta_1 x_{t-1} + \cdots + \delta_q x_{t-q} + v_t$   (Eq. 9.77)

We can transform this into an infinite distributed lag model:

$y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \beta_2 x_{t-2} + \beta_3 x_{t-3} + \cdots + e_t$   (Eq. 9.78)

The multipliers are defined as:

$\dfrac{\partial y_t}{\partial x_{t-s}} = \beta_s$ : s-period delay multiplier
$\sum_{j=0}^{s}\beta_j$ : s-period interim multiplier
$\sum_{j=0}^{\infty}\beta_j$ : total multiplier
The lag operator is defined as: $Ly_t = y_{t-1}$
Lagging twice, we have: $L(Ly_t) = Ly_{t-1} = y_{t-2}$
Or: $L^2 y_t = y_{t-2}$
More generally, we have: $L^s y_t = y_{t-s}$

Now rewrite our model as:

$y_t = \delta + \theta_1 Ly_t + \theta_2 L^2 y_t + \cdots + \theta_p L^p y_t + \delta_0 x_t + \delta_1 Lx_t + \delta_2 L^2 x_t + \cdots + \delta_q L^q x_t + v_t$   (Eq. 9.79)

Rearranging terms:

$(1 - \theta_1 L - \theta_2 L^2 - \cdots - \theta_p L^p)\,y_t = \delta + (\delta_0 + \delta_1 L + \delta_2 L^2 + \cdots + \delta_q L^q)\,x_t + v_t$   (Eq. 9.80)
Let's apply this to our Okun's Law model. The model:

$DU_t = \delta + \theta_1 DU_{t-1} + \delta_0 G_t + \delta_1 G_{t-1} + v_t$   (Eq. 9.81)

can be rewritten as:

$(1 - \theta_1 L)\,DU_t = \delta + (\delta_0 + \delta_1 L)\,G_t + v_t$   (Eq. 9.82)

Define the inverse of $(1 - \theta_1 L)$ as $(1 - \theta_1 L)^{-1}$ such that:

$(1 - \theta_1 L)^{-1}(1 - \theta_1 L) = 1$

Multiply both sides of Eq. 9.82 by $(1 - \theta_1 L)^{-1}$:

$DU_t = (1 - \theta_1 L)^{-1}\delta + (1 - \theta_1 L)^{-1}(\delta_0 + \delta_1 L)\,G_t + (1 - \theta_1 L)^{-1}v_t$   (Eq. 9.83)

Equating this with the infinite distributed lag representation:

$DU_t = \alpha + \beta_0 G_t + \beta_1 G_{t-1} + \beta_2 G_{t-2} + \beta_3 G_{t-3} + \cdots + e_t = \alpha + (\beta_0 + \beta_1 L + \beta_2 L^2 + \beta_3 L^3 + \cdots)\,G_t + e_t$   (Eq. 9.84)
For Eqs. 9.83 and 9.84 to be identical, it must be true that:

$\alpha = (1 - \theta_1 L)^{-1}\delta$   (Eq. 9.85)
$\beta_0 + \beta_1 L + \beta_2 L^2 + \beta_3 L^3 + \cdots = (1 - \theta_1 L)^{-1}(\delta_0 + \delta_1 L)$   (Eq. 9.86)
$e_t = (1 - \theta_1 L)^{-1}v_t$   (Eq. 9.87)

Multiply both sides of Eq. 9.85 by $(1 - \theta_1 L)$ to obtain $(1 - \theta_1 L)\alpha = \delta$.
Note that the lag of a constant does not change, so $L\alpha = \alpha$.
Now we have:

$\alpha(1 - \theta_1) = \delta \quad\text{and}\quad \alpha = \dfrac{\delta}{1 - \theta_1}$

Multiply both sides of Eq. 9.86 by $(1 - \theta_1 L)$:

$\delta_0 + \delta_1 L = (1 - \theta_1 L)(\beta_0 + \beta_1 L + \beta_2 L^2 + \beta_3 L^3 + \cdots)$
$= \beta_0 + \beta_1 L + \beta_2 L^2 + \beta_3 L^3 + \cdots - \theta_1\beta_0 L - \theta_1\beta_1 L^2 - \theta_1\beta_2 L^3 - \cdots$
$= \beta_0 + (\beta_1 - \theta_1\beta_0)L + (\beta_2 - \theta_1\beta_1)L^2 + (\beta_3 - \theta_1\beta_2)L^3 + \cdots$   (Eq. 9.88)
Rewrite Eq. 9.88 as:

$\delta_0 + \delta_1 L + 0L^2 + 0L^3 + \cdots = \beta_0 + (\beta_1 - \theta_1\beta_0)L + (\beta_2 - \theta_1\beta_1)L^2 + (\beta_3 - \theta_1\beta_2)L^3 + \cdots$   (Eq. 9.89)

Equating coefficients of like powers in L yields:

$\delta_0 = \beta_0$
$\delta_1 = \beta_1 - \theta_1\beta_0$
$0 = \beta_2 - \theta_1\beta_1$
$0 = \beta_3 - \theta_1\beta_2$

and so on.

We can now find the $\beta$'s using the recursive equations:

$\beta_0 = \delta_0$
$\beta_1 = \delta_1 + \theta_1\beta_0$
$\beta_j = \theta_1\beta_{j-1} \quad\text{for } j \ge 2$   (Eq. 9.90)
You can start from the equivalent of Eq. 9.88 which, in its general form, is:

$\delta_0 + \delta_1 L + \delta_2 L^2 + \cdots + \delta_q L^q = (1 - \theta_1 L - \theta_2 L^2 - \cdots - \theta_p L^p)(\beta_0 + \beta_1 L + \beta_2 L^2 + \beta_3 L^3 + \cdots)$   (Eq. 9.91)

Given the values p and q for your ARDL model, you need to multiply out the above expression, and then equate coefficients of like powers in the lag operator.
For the Okun's Law model:

$\widehat{DU}_t = 0.3780 + 0.3501\,DU_{t-1} - 0.1841\,G_t - 0.0992\,G_{t-1}$

The impact and delay multipliers for the first four quarters are:

$\hat\beta_0 = \hat\delta_0 = -0.1841$
$\hat\beta_1 = \hat\delta_1 + \hat\theta_1\hat\beta_0 = -0.099155 + 0.350116 \times (-0.184084) = -0.1636$
$\hat\beta_2 = \hat\theta_1\hat\beta_1 = 0.350116 \times (-0.163606) = -0.0573$
$\hat\beta_3 = \hat\theta_1\hat\beta_2 = 0.350116 \times (-0.057281) = -0.0201$
$\hat\beta_4 = \hat\theta_1\hat\beta_3 = 0.350116 \times (-0.020055) = -0.0070$
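The Eq. 9.90 recursion is a few lines of code; with the unrounded ARDL(1,1) estimates above it reproduces these delay multipliers:

```python
def delay_multipliers(delta0, delta1, theta1, n):
    """Delay multipliers beta_s for an ARDL(1,1), via Eq. 9.90:
    beta_0 = delta_0, beta_1 = delta_1 + theta1*beta_0, beta_j = theta1*beta_{j-1}."""
    betas = [delta0, delta1 + theta1 * delta0]
    while len(betas) < n:
        betas.append(theta1 * betas[-1])
    return betas[:n]

# Okun's law ARDL(1,1) estimates (unrounded values as used in the text)
betas = delay_multipliers(-0.184084, -0.099155, 0.350116, 5)
# betas is approximately [-0.1841, -0.1636, -0.0573, -0.0201, -0.0070]
```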
FIGURE 9.13 Delay multipliers from Okun's law ARDL(1,1) model

We can estimate the total multiplier given by:

$\sum_{j=0}^{\infty}\beta_j$

and the normal growth rate that is needed to maintain a constant rate of unemployment. Setting $DU_t = 0$ in Eq. 9.78 gives $0 = \alpha + \bigl(\sum_{j=0}^{\infty}\beta_j\bigr)G_N$, so:

$G_N = \dfrac{-\alpha}{\sum_{j=0}^{\infty}\beta_j}$

We can show that:

$\sum_{j=0}^{\infty}\hat\beta_j = \hat\beta_0 + \dfrac{\hat\beta_1}{1 - \hat\theta_1} = -0.184084 + \dfrac{-0.163606}{1 - 0.350116} = -0.4358$

An estimate for $\alpha$ is given by:

$\hat\alpha = \dfrac{\hat\delta}{1 - \hat\theta_1} = \dfrac{0.37801}{0.649884} = 0.5817$

Therefore, the normal growth rate is:

$\hat{G}_N = \dfrac{-\hat\alpha}{\sum_{j=0}^{\infty}\hat\beta_j} = \dfrac{0.5817}{0.4358} = 1.3\%\ \text{per quarter}$
h
Edition Page 153
Chapter 9: Regression with Time Series Data:
Stationary Variables
Key Words
Principles of Econometrics, 4t
h
Edition Page 154
Chapter 9: Regression with Time Series Data:
Stationary Variables
AIC criterion
AR(1) error
AR(p) model
ARDL(p,q) model
autocorrelation
Autoregressive
distributed lags
autoregressive
error
autoregressive
model
BIC criterion
correlogram
delay multiplier
distributed lag
weight
Keywords
dynamic models
exponential
smoothing
finite distributed lag
forecast error
forecast intervals
forecasting
HAC standard errors
impact multiplier
infinite distributed
lag
interim multiplier
lag length
lag operator
lagged dependent
variable
LM test
multiplier analysis
nonlinear least
squares
out-of-sample
forecasts
sample
autocorrelations
serial correlation
standard error of
forecast error
SC criterion
total multiplier
T x R
2
form of LM
test
within-sample
forecasts

Principles of Econometrics, 4t
h
Edition Page 155
Chapter 9: Regression with Time Series Data:
Stationary Variables

Appendices

Principles of Econometrics, 4t
h
Edition Page 156
Chapter 9: Regression with Time Series Data:
Stationary Variables


9A The Durbin-Watson Test

For the Durbin-Watson test, the hypotheses are:

$H_0: \rho = 0 \qquad H_1: \rho > 0$

The test statistic is:

$d = \dfrac{\sum_{t=2}^{T}(\hat{e}_t - \hat{e}_{t-1})^2}{\sum_{t=1}^{T}\hat{e}_t^2}$   (Eq. 9A.1)

We can expand the test statistic as:

$d = \dfrac{\sum_{t=2}^{T}\hat{e}_t^2 + \sum_{t=2}^{T}\hat{e}_{t-1}^2 - 2\sum_{t=2}^{T}\hat{e}_t\hat{e}_{t-1}}{\sum_{t=1}^{T}\hat{e}_t^2} \approx 1 + 1 - 2r_1$   (Eq. 9A.2)

We can now write:

$d \approx 2(1 - r_1)$   (Eq. 9A.3)

If the estimated value of $\rho$ is $r_1 = 0$, then the Durbin-Watson statistic $d \approx 2$.
This is taken as an indication that the model errors are not autocorrelated.
If the estimate of $\rho$ happened to be $r_1 = 1$, then $d \approx 0$.
A low value for the Durbin-Watson statistic implies that the model errors are correlated, and $\rho > 0$.
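Both $d$ and $r_1$ are one-liners over a residual series, and the Eq. 9A.3 approximation can be checked directly. The AR(1) residuals below are simulated for illustration:

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson statistic, Eq. 9A.1."""
    return float(np.sum(np.diff(e) ** 2) / np.sum(e ** 2))

def r1(e):
    """First-order sample autocorrelation used in Eq. 9A.2."""
    return float(np.sum(e[1:] * e[:-1]) / np.sum(e ** 2))

# Simulated positively autocorrelated residuals (illustrative only)
rng = np.random.default_rng(3)
e = np.empty(500)
e[0] = rng.normal()
for t in range(1, 500):
    e[t] = 0.6 * e[t - 1] + rng.normal()

d = durbin_watson(e)
# With rho = 0.6 the statistic sits well below 2, near 2*(1 - r1)
```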
FIGURE 9A.1 Testing for positive autocorrelation

9A.1 The Durbin-Watson Bounds Test

FIGURE 9A.2 Upper and lower critical value bounds for the Durbin-Watson test

Decision rules, known collectively as the Durbin-Watson bounds test:
If $d < d_{Lc}$: reject $H_0: \rho = 0$ and accept $H_1: \rho > 0$
If $d > d_{Uc}$: do not reject $H_0: \rho = 0$
If $d_{Lc} < d < d_{Uc}$: the test is inconclusive
9B Properties of the AR(1) Error

Note that:

$e_t = \rho e_{t-1} + v_t = \rho(\rho e_{t-2} + v_{t-1}) + v_t = \rho^2 e_{t-2} + \rho v_{t-1} + v_t$   (Eq. 9B.1)

Further substitution shows that:

$e_t = \rho^3 e_{t-3} + \rho^2 v_{t-2} + \rho v_{t-1} + v_t$   (Eq. 9B.2)

Repeating the substitution k times and rearranging:

$e_t = \rho^{k+1} e_{t-(k+1)} + v_t + \rho v_{t-1} + \rho^2 v_{t-2} + \cdots + \rho^k v_{t-k}$   (Eq. 9B.3)

If we let $k \to \infty$, then we have:

$e_t = v_t + \rho v_{t-1} + \rho^2 v_{t-2} + \rho^3 v_{t-3} + \cdots$   (Eq. 9B.4)
We can now find the properties of $e_t$:

$E(e_t) = E(v_t) + \rho E(v_{t-1}) + \rho^2 E(v_{t-2}) + \rho^3 E(v_{t-3}) + \cdots = 0 + 0 + 0 + \cdots = 0$

$\mathrm{var}(e_t) = \mathrm{var}(v_t) + \rho^2\mathrm{var}(v_{t-1}) + \rho^4\mathrm{var}(v_{t-2}) + \rho^6\mathrm{var}(v_{t-3}) + \cdots$
$= \sigma_v^2(1 + \rho^2 + \rho^4 + \rho^6 + \cdots) = \dfrac{\sigma_v^2}{1 - \rho^2}$

The covariance for one period apart is:

$\mathrm{cov}(e_t, e_{t-1}) = E(e_t e_{t-1}) = E\bigl[(v_t + \rho v_{t-1} + \rho^2 v_{t-2} + \cdots)(v_{t-1} + \rho v_{t-2} + \rho^2 v_{t-3} + \cdots)\bigr]$
$= \rho E(v_{t-1}^2) + \rho^3 E(v_{t-2}^2) + \rho^5 E(v_{t-3}^2) + \cdots = \rho\sigma_v^2(1 + \rho^2 + \rho^4 + \cdots) = \dfrac{\rho\sigma_v^2}{1 - \rho^2}$

Similarly, the covariance for k periods apart is:

$\mathrm{cov}(e_t, e_{t-k}) = \dfrac{\rho^k\sigma_v^2}{1 - \rho^2}, \quad k > 0$
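The geometric-series algebra behind these moments can be checked numerically by truncating the MA representation in Eq. 9B.4; the values of $\rho$ and $\sigma_v^2$ below are illustrative:

```python
rho, sigma_v2 = 0.7, 1.0
n = 2000  # truncation length for the MA(infinity) weights rho^s

weights = [rho ** s for s in range(n)]

def cov(k):
    """cov(e_t, e_{t-k}) from the truncated MA weights: sigma_v^2 * sum_s rho^(k+s)*rho^s."""
    return sigma_v2 * sum(weights[k + s] * weights[s] for s in range(n - k))

var_e = sigma_v2 * sum(w ** 2 for w in weights)   # truncated sum for var(e_t)
var_closed = sigma_v2 / (1 - rho ** 2)            # closed form sigma_v^2/(1 - rho^2)
```

The truncation error is of order $\rho^{2n}$, which is negligible here, so the numeric sums match the closed forms.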
9C Generalized Least Squares Estimation

We are considering the simple regression model with AR(1) errors:

$y_t = \beta_1 + \beta_2 x_t + e_t, \qquad e_t = \rho e_{t-1} + v_t$

To specify the transformed model we begin with:

$y_t = \beta_1 + \beta_2 x_t + \rho y_{t-1} - \rho\beta_1 - \rho\beta_2 x_{t-1} + v_t$   (Eq. 9C.1)

Rearranging terms:

$y_t - \rho y_{t-1} = \beta_1(1 - \rho) + \beta_2(x_t - \rho x_{t-1}) + v_t$   (Eq. 9C.2)

Defining the following transformed variables:

$y_t^* = y_t - \rho y_{t-1}, \qquad x_{2t}^* = x_t - \rho x_{t-1}, \qquad x_{1t}^* = 1 - \rho$   (Eq. 9C.3)

Substituting the transformed variables, we get:

$y_t^* = \beta_1 x_{1t}^* + \beta_2 x_{2t}^* + v_t$

There are two problems:
1. Because lagged values of $y_t$ and $x_t$ had to be formed, only (T - 1) new observations were created by the transformation
2. The value of the autoregressive parameter $\rho$ is not known
For the second problem, we can write Eq. 9C.1 as:

$y_t - \beta_1 - \beta_2 x_t = \rho(y_{t-1} - \beta_1 - \beta_2 x_{t-1}) + v_t$   (Eq. 9C.4)

For the first problem, note that:

$y_1 = \beta_1 + \beta_2 x_1 + e_1$

and that

$\sqrt{1-\rho^2}\,y_1 = \sqrt{1-\rho^2}\,\beta_1 + \sqrt{1-\rho^2}\,\beta_2 x_1 + \sqrt{1-\rho^2}\,e_1$

Or:

$y_1^* = \beta_1 x_{11}^* + \beta_2 x_{12}^* + e_1^*$   (Eq. 9C.5)

where

$y_1^* = \sqrt{1-\rho^2}\,y_1, \quad x_{11}^* = \sqrt{1-\rho^2}, \quad x_{12}^* = \sqrt{1-\rho^2}\,x_1, \quad e_1^* = \sqrt{1-\rho^2}\,e_1$   (Eq. 9C.6)

To confirm that the variance of $e_1^*$ is the same as that of the errors $(v_2, v_3, \ldots, v_T)$, note that:

$\mathrm{var}(e_1^*) = (1-\rho^2)\,\mathrm{var}(e_1) = (1-\rho^2)\,\dfrac{\sigma_v^2}{1-\rho^2} = \sigma_v^2$
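A sketch of the full transformation — Eq. 9C.3 for t = 2,...,T plus the Eq. 9C.6 treatment of the first observation (often called the Prais-Winsten transformation) — followed by OLS. Treating $\rho$ as known is an assumption here; in practice it is estimated, typically iteratively, and the data are simulated for illustration:

```python
import numpy as np

def prais_winsten_transform(y, x, rho):
    """Transform (y, x) per Eq. 9C.3 for t = 2..T and Eq. 9C.6 for t = 1,
    so that all T transformed errors have variance sigma_v^2."""
    T = len(y)
    ys = np.empty(T)
    X = np.empty((T, 2))          # columns: x1* (intercept part), x2*
    w = np.sqrt(1 - rho ** 2)
    ys[0], X[0] = w * y[0], (w, w * x[0])
    ys[1:] = y[1:] - rho * y[:-1]
    X[1:, 0] = 1 - rho
    X[1:, 1] = x[1:] - rho * x[:-1]
    return ys, X

# Simulated regression with AR(1) errors and known rho (illustrative values)
rng = np.random.default_rng(4)
T, b1, b2, rho = 400, 1.0, 2.0, 0.7
x = rng.normal(size=T)
e = np.zeros(T)
e[0] = rng.normal(scale=np.sqrt(1 / (1 - rho ** 2)))  # stationary start, var sigma_v^2/(1-rho^2)
for t in range(1, T):
    e[t] = rho * e[t - 1] + rng.normal()
y = b1 + b2 * x + e

ys, X = prais_winsten_transform(y, x, rho)
b1_hat, b2_hat = np.linalg.lstsq(X, ys, rcond=None)[0]
```

OLS on the transformed data is the generalized least squares estimator the appendix describes, and it keeps all T observations rather than discarding the first.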