Time Series ARIMA Models
Ani Katchova
White noise
White noise describes the assumption that each element in a series is a random draw from a
population with zero mean and constant variance.
[Figure: simulated white noise series (white_noise) plotted over time _t, 50 periods]
Autoregressive (AR) and moving average (MA) models correct for violations of this white
noise assumption.
[Figure: simulated AR(1) series ar_1a (coefficient 0.8) and ar_1b (coefficient -0.8), plotted over time _t]
[Figure: simulated MA(1) series ma_1a (coefficient 0.7) and ma_1b (coefficient -0.7), plotted over time _t]
[Figure: simulated ARMA(1,1) series arma_11a (coefficients 0.8 and 0.7) and arma_11b (coefficients -0.8 and -0.7), plotted over time _t]
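Series like these can be generated with a short Python sketch using numpy and statsmodels; the coefficient signs for the b series are an assumption read off the figure titles:

```python
# Simulate the white noise, AR(1), MA(1), and ARMA(1,1) series shown above.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

n = 50
white_noise = np.random.normal(0, 1, n)  # zero mean, constant variance

# ArmaProcess takes lag polynomials: ar=[1, -phi], ma=[1, theta].
ar_1a    = ArmaProcess(ar=[1, -0.8], ma=[1]).generate_sample(n)        # AR(1), phi = 0.8
ar_1b    = ArmaProcess(ar=[1,  0.8], ma=[1]).generate_sample(n)        # AR(1), phi = -0.8
ma_1a    = ArmaProcess(ar=[1], ma=[1, 0.7]).generate_sample(n)         # MA(1), theta = 0.7
ma_1b    = ArmaProcess(ar=[1], ma=[1, -0.7]).generate_sample(n)        # MA(1), theta = -0.7
arma_11a = ArmaProcess(ar=[1, -0.8], ma=[1, 0.7]).generate_sample(n)   # ARMA(1,1)
arma_11b = ArmaProcess(ar=[1,  0.8], ma=[1, -0.7]).generate_sample(n)  # ARMA(1,1)
```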
Stationarity
Modeling an ARMA(p,q) process requires stationarity.
A stationary process has a mean and variance that do not change over time and the process
does not have trends.
An AR(1) disturbance process $u_t = \rho u_{t-1} + \varepsilon_t$ is stationary if $|\rho| < 1$ and $\varepsilon_t$ is white noise.
[Figure: a non-stationary series y plotted over yearqtr, 1980q1 to 2000q1]
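The stationarity condition can also be checked numerically; a minimal sketch using statsmodels' ArmaProcess, whose isstationary property inspects the roots of the AR lag polynomial:

```python
# Check the AR(1) stationarity condition |rho| < 1 via the lag-polynomial roots.
from statsmodels.tsa.arima_process import ArmaProcess

print(ArmaProcess(ar=[1, -0.8], ma=[1]).isstationary)  # True:  rho = 0.8
print(ArmaProcess(ar=[1, -1.2], ma=[1]).isstationary)  # False: rho = 1.2
```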
Detrending
A variable can be detrended by regressing the variable on a time trend, $y_t = \alpha + \beta t + u_t$, and obtaining the residuals, $\hat u_t = y_t - \hat\alpha - \hat\beta t$.
[Figure: left, the variable y with its fitted linear prediction over yearqtr; right, the detrended variable (the residuals) over yearqtr]
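A minimal detrending sketch in Python with statsmodels OLS; the quarterly series y here is simulated stand-in data, not the series from the figure:

```python
# Detrend by regressing y on a linear time trend and keeping the residuals.
import numpy as np
import pandas as pd
import statsmodels.api as sm

idx = pd.period_range("1980Q1", "2000Q4", freq="Q")
y = pd.Series(100 + 2 * np.arange(len(idx)) + np.random.normal(0, 10, len(idx)), index=idx)

X = sm.add_constant(np.arange(len(y)))  # intercept and time trend t
fit = sm.OLS(y.values, X).fit()
y_detrended = y - fit.fittedvalues      # residuals = the detrended variable
```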
Differencing
When a variable is not stationary, a common solution is to use the differenced variable: $\Delta y_t = y_t - y_{t-1}$ for first-order differences.
The variable is integrated of order one, denoted I(1), if taking a first difference produces a stationary process.
ARIMA(p,d,q) denotes an ARMA model with p autoregressive lags, q moving average lags, and differencing of order d.
[Figure: differenced variable D.y plotted over yearqtr, 1980q1 to 2000q1]
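In Python, pandas' diff plays the role of Stata's D. operator; a sketch with a hypothetical I(1) series:

```python
# First difference: dy_t = y_t - y_(t-1); the initial missing value is dropped.
import numpy as np
import pandas as pd

idx = pd.period_range("1980Q1", "2000Q4", freq="Q")
y = pd.Series(np.random.normal(1, 5, len(idx)).cumsum(), index=idx)  # random walk with drift

dy = y.diff().dropna()  # analogue of Stata's D.y; stationary if y is I(1)
```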
Seasonality
Seasonality is a particular type of autocorrelation pattern that repeats every season, for example monthly or quarterly.
For example, quarterly data may have the same pattern in the same quarter from one year to the
next.
Seasonality must also be corrected before a time series model can be fitted.
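One common correction is a seasonal difference at the seasonal lag (4 for quarterly data); seasonal dummy variables are another option. A sketch with a hypothetical quarterly series:

```python
# Seasonal (lag-4) difference for quarterly data: y_t - y_(t-4).
import numpy as np
import pandas as pd

idx = pd.period_range("1980Q1", "2000Q4", freq="Q")
pattern = np.tile([10.0, -5.0, 0.0, 5.0], len(idx) // 4)  # repeating quarterly pattern
y = pd.Series(pattern + np.random.normal(0, 1, len(idx)), index=idx)

y_sa = y.diff(4).dropna()  # removes the quarterly pattern
```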
Dickey-Fuller test
The Dickey-Fuller test for a unit root estimates the regression $\Delta y_t = \alpha + \gamma y_{t-1} + \varepsilon_t$.
We can estimate the above model and test for the significance of the coefficient $\gamma$, as in the sketch below.
o If the null hypothesis is not rejected, $\gamma = 0$, then $y_t$ has a unit root and is not stationary. Difference the variable and repeat the Dickey-Fuller test to see if the differenced variable is stationary.
o If the null hypothesis is rejected, $\gamma < 0$, then $y_t$ is stationary. Use the variable in levels.
o Note that non-significance here means non-stationarity.
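A sketch of the test with statsmodels' adfuller; with maxlag=0 this is the simple Dickey-Fuller regression, and larger maxlag gives the augmented version:

```python
# Dickey-Fuller unit-root test; retest the first difference if the null stands.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

y = pd.Series(np.random.normal(1, 5, 84).cumsum())  # hypothetical I(1) series

stat, pvalue, *_ = adfuller(y, maxlag=0)
if pvalue < 0.05:
    print("reject gamma = 0: y appears stationary")
else:
    _, p_diff, *_ = adfuller(y.diff().dropna(), maxlag=0)
    print(f"unit root not rejected in levels; p-value after differencing: {p_diff:.3f}")
```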
Autocorrelation function (ACF)
The autocorrelation function ACF(k) gives the gross correlation between $y_t$ and $y_{t-k}$: $\mathrm{ACF}(k) = \mathrm{Corr}(y_t, y_{t-k})$.
For an AR(1) model, the ACF is $\rho^k$, which declines geometrically with k. We say that this function tails off.
Partial autocorrelation function (PACF)
The PACF is the simple correlation between $y_t$ and $y_{t-k}$ after controlling for the intervening lags:
$\mathrm{PACF}(k) = \mathrm{Corr}(y_t, y_{t-k} \mid y_{t-1}, \ldots, y_{t-k+1})$.
Identifying the lag order using the ACF and PACF:

        AR(p)                 MA(q)                 ARMA(p,q)
ACF     Tails off             Cuts off after lag q  Tails off
PACF    Cuts off after lag p  Tails off             Tails off
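Both functions are easy to inspect with statsmodels' plotting helpers; a sketch on a simulated AR(1) series, where the ACF should tail off and the PACF should cut off after lag 1:

```python
# Plot the ACF and PACF of a simulated AR(1) series.
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.tsa.arima_process import ArmaProcess

x = ArmaProcess(ar=[1, -0.8], ma=[1]).generate_sample(200)

fig, (ax1, ax2) = plt.subplots(2, 1)
plot_acf(x, lags=20, ax=ax1)    # tails off
plot_pacf(x, lags=20, ax=ax2)   # cuts off after lag 1
plt.show()
```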
[Figures: sample autocorrelations (correlograms) of the simulated series ar_1a, ar_1b, ma_1a, ma_1b, arma_11a, and arma_11b, shown to lag 20 with 95% confidence bands]
[Figure: ACF of a non-stationary series xt. The ACF shows a slowly decaying, positive pattern.]
[Figure: ACF of a series xt with a seasonal lag of 4. The ACF shows spikes every 4 lags. Shaded bands are Bartlett's formula for MA(q) 95% confidence bands.]
Portmanteau (Q) test for white noise
The Ljung-Box Q statistic is
$Q = n(n+2) \sum_{k=1}^{p} \hat\rho_k^2 / (n-k)$,
where $\hat\rho_k$ is the sample autocorrelation at lag k and n is the number of observations.
Under the null hypothesis that the series is white noise (data are independently distributed), Q has a limiting $\chi^2$ distribution with p degrees of freedom.
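statsmodels implements this test as acorr_ljungbox; a sketch on simulated white noise, where the null should not be rejected:

```python
# Ljung-Box Q test: joint test that autocorrelations up to lag p are zero.
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

x = np.random.normal(0, 1, 200)      # white noise
print(acorr_ljungbox(x, lags=[10]))  # lb_stat and lb_pvalue at lag p = 10
```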
Goodness of fit
The Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) are two measures of goodness of fit. They measure the trade-off between model fit and model complexity.
$\mathrm{AIC} = -2 \ln L + 2k$
$\mathrm{BIC} = -2 \ln L + k \ln N$
where $L$ is the value of the likelihood function evaluated at the parameter estimates, $N$ is the number of observations, and $k$ is the number of estimated parameters.
A lower AIC or BIC value indicates a better trade-off between fit and parsimony.
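Fitted ARIMA results in statsmodels expose both criteria directly; a sketch comparing candidate orders on a hypothetical I(1) series:

```python
# Compare candidate ARIMA(p,d,q) specifications by AIC and BIC.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

y = pd.Series(np.random.normal(1, 5, 200).cumsum())

for order in [(1, 1, 0), (0, 1, 1), (1, 1, 1)]:
    res = ARIMA(y, order=order).fit()
    print(order, f"AIC={res.aic:.1f}", f"BIC={res.bic:.1f}")
```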
Estimation step
Estimate ARMA models and examine the various coefficients.
The goal is to select a stationary and parsimonious model that has significant coefficients and a
good fit.
Diagnostic checking step
If the model fits well, then the residuals from the model should resemble a white noise process. The checks below are sketched in code after this list.
o Check for normality by looking at a histogram of the residuals or by using a quantile-quantile (Q-Q) plot.
o Check for independence by examining the ACF and PACF of the residuals, which should look like white noise.
o The Ljung-Box-Pierce statistic tests the magnitude of the autocorrelations of the residuals as a group.
o Examine goodness of fit using the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). Use the most parsimonious model with the lowest AIC and/or BIC.
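A sketch of these residual checks with statsmodels; plot_diagnostics bundles the standardized residuals, histogram, Q-Q plot, and correlogram:

```python
# Diagnostic checks: residuals of a well-fitting model should be white noise.
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.arima.model import ARIMA

y = pd.Series(np.random.normal(1, 5, 200).cumsum())
res = ARIMA(y, order=(1, 1, 1)).fit()

print(acorr_ljungbox(res.resid, lags=[10]))  # joint test on residual autocorrelations
res.plot_diagnostics()                       # histogram, Q-Q plot, residual correlogram
plt.show()
```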