
# Matlab Session 6:

## Time Series Models

Daria Kalyaeva
Swiss Finance Institute

1 / 23

## Today's Plan

- Date and time;
- Graphics for time series;
- Estimating time series models;
- Testing;
- Forecasting.

November 13, 2015

## External Toolboxes

Kevin Sheppard toolbox

We will use the Kevin Sheppard Toolbox for time series models.
It is available from http://www.kevinsheppard.com/MFE_Toolbox.
See the notes from Matlab Session 4 for how to add the toolbox
folder to the Matlab path.
Functions from the toolbox will be marked with (KS) next to them.

## Date and Time

How to treat dates in Matlab?

To plot time series, you need to know the date and time of observations.
Matlab can treat date and time in two formats: string and number.

datestr(numDate,formatOut)
- Converts dates in number form in vector numDate to an array of dates
  in string form.
- formatOut: optional; specifies the output format for date strings.
  Default is 'dd-mmm-yyyy HH:MM:SS'.

datenum(strDate,formatIn)
- Converts dates in string form to number form.
- formatIn: optional (but I strongly recommend you use it every
  time!); specifies what format the strings are in.

Example
datestr(736285,'mm/dd/yyyy')           gives 11/17/2015.
datenum('17-Nov-2015','dd-mmm-yyyy')   gives 736285.

## Date and Time

Including dates on the axes

You need to be able to manipulate dates and times in Matlab to plot time
series.

datetick(axis,dateFormat)
- Converts dates in numerical form on the specified axis ('x' or 'y')
  to dates in the specified format (format 1 is dd-mmm-yyyy; see Matlab
  Help for the full list of formats).

Example
Get date ticks on the x-axis in the format mmm-yy (e.g. Jan-15 for January
2015):
plot(dateAsNum,Y)
datetick('x','mmm-yy')
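Putting datenum, plot, and datetick together, a minimal sketch (the monthly series here is made up purely for illustration):

```matlab
% Monthly numeric dates for 2015 and a made-up series to plot.
dates = datenum(2015, (1:12)', 1);   % 12 numeric dates, Jan-Dec 2015
Y     = cumsum(randn(12, 1));        % illustrative data

plot(dates, Y);
datetick('x', 'mmm-yy');             % x-ticks become Jan-15, Feb-15, ...
```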


## Graphics for time series

What to keep in mind?

ACF/PACF plots are the starting point for ARMA model selection:
we can see whether the series are autocorrelated and if so, what the
autocorrelation pattern is.


## Graphics for time series

Graphics functions

[AC,ACstErr] = sacf(X,k,robust)   (KS)
- Computes the sample autocorrelation for k lags for data in vector X.
- robust: optional; indicator argument. Set to 1 (default) for
  heteroskedasticity-robust standard errors, set to 0 for non-robust
  errors. I do not use the standard errors from this function in the
  examples, so I set this to 0 to avoid plotting graphs automatically.
- AC: autocorrelation vector for each lag;
- ACstErr: standard errors for each autocorrelation value. Replace
  with ~ to suppress output.

[PAC,PACstErr] = spacf(X,k,robust)   (KS)
- Computes the sample partial autocorrelation in the same way.
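As a sketch, the two functions can be combined into a quick diagnostic plot. This assumes the MFE toolbox is on the path and X holds your data vector; the confidence bands use the ±2/√T rule discussed later in this session:

```matlab
k = 20;                          % number of lags to inspect
[AC, ~]  = sacf(X, k, 0);        % sample ACF, non-robust errors
[PAC, ~] = spacf(X, k, 0);       % sample PACF

T = numel(X);
subplot(2,1,1); stem(1:k, AC); hold on;
plot([1 k],  [2 2]/sqrt(T), 'r--');   % upper confidence band
plot([1 k], -[2 2]/sqrt(T), 'r--');   % lower confidence band
title('Sample ACF');
subplot(2,1,2); stem(1:k, PAC); title('Sample PACF');
```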


## Graphics for time series

How to read ACF and PACF?

ACF and PACF plots allow us to determine what ARMA model to use.
The exponential decline pattern in the ACF/PACF can be present in
slightly different forms, but overall it is the same thing:
- either in real or in absolute values;
- after the initial decline, the ACF/PACF can trail off or fluctuate about 0.

Unofficial, but my base rule is to KEEP IT SIMPLE!
ACF/PACF plots for real data are never as clean as they are
in simulations!


## Graphics for time series

How to read ACF and PACF?

- ACF
  - identifies lags for the MA part (q);
  - estimates correlation with lag k, including all indirect effects through
    the lags in between.
- PACF
  - identifies lags for the AR part (p);
  - estimates correlation with lag k, excluding the impact of the k − 1 lags in
    between (i.e. after eliminating the effect of the AR(k − 1) model).

Typical patterns:
- Pure AR(p), ARMA(p,0): PACF highly significant at lags up to p,
  then cut-off; ACF declines exponentially in absolute value.
- Pure MA(q), ARMA(0,q): ACF significant up to lag q, then cut-off;
  PACF declines exponentially in absolute value.
- Mixed ARMA(p,q): both ACF and PACF exhibit exponential decline.
- Nonstationary series have significant ACF for many lags.
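These patterns are easy to check on simulated data. A minimal sketch using Matlab's filter function (the coefficient 0.7 is arbitrary, chosen only for illustration):

```matlab
T   = 1000;
eps = randn(T, 1);                 % white-noise shocks

% AR(1): y_t = 0.7*y_{t-1} + eps_t  ->  PACF cuts off after lag 1,
%                                       ACF declines exponentially.
yAR = filter(1, [1 -0.7], eps);

% MA(1): y_t = eps_t + 0.7*eps_{t-1}  ->  ACF cuts off after lag 1,
%                                         PACF declines exponentially.
yMA = filter([1 0.7], 1, eps);
```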


## Graphics for time series

Number of lags and confidence bands

Confidence bands: ±2/√T.
The choice of the number of lags: we want enough lags to see a pattern
but not so many that the AC estimates become unreliable.


## Graphics for time series

ACF and PACF for simulated data - it is not always obvious!

[Figure: sample ACF and PACF plots (20 lags) for simulated ARMA(1,0),
ARMA(0,1), and ARMA(1,1) series.]

## Graphics for time series

ACF and PACF for simulated data - it is not always obvious!

[Figure: a second set of sample ACF and PACF plots (20 lags) for simulated
ARMA(1,0), ARMA(0,1), and ARMA(1,1) series, with much smaller
autocorrelation values.]

## Time series model estimation

What to keep in mind?

The goal of ARMA models is to find the entire predictable element in a
time series.
You want to extract all information about the time series evolution
from its past values (these will be the predicted values, ŷ).
- If the series is autocorrelated, there is information about future values
  in the past values.

Then the only thing left should be the unpredictable error terms!
- This is why we test any ARMA model by testing for autocorrelation in the
  error terms: if there is still autocorrelation, then we have not extracted
  all the information about the future from past values.
- Also, for AR(p) models, if you estimate them using OLS,
  autocorrelated residuals imply that the estimates are not consistent.


## Time series model estimation

ARMA flow chart

Plot series → Mean stationary?
- no: remove trend and seasonality, then plot again.
- yes: select a model using the ACF/PACF plots and estimate the
  parameters. Then ask: uncorrelated residuals?
  - no: modify the model and re-estimate.
  - yes: significant coefficients?
    - no: remove the insignificant parameters and re-estimate.
    - yes: forecast.


## Time series model estimation

Time series model functions (Kevin Sheppard Toolbox)

[B, LL, e, stErr, diagnostics, VCVr, VCV, lik] =
    armaxfilter(y,constant,pLags,qLags)   (KS)

- Estimates an ARMA(p,q) model for data in vector y.
- constant: if set to 1, a constant is included; if set to 0, it is not.
- pLags, qLags: vectors of lags to be included.

Example
AR(p) model with constant:   armaxfilter(y,1,(1:p))
MA(q) model with constant:   armaxfilter(y,1,[],(1:q))
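A sketch of a complete AR(1) estimation call (this assumes the MFE toolbox is on the path and y is a return series):

```matlab
% AR(1) with a constant: outputs described on the next slide.
[B, LL, e, stErr] = armaxfilter(y, 1, 1);

% B(1) is the constant, B(2) the AR(1) coefficient.
tStats = B ./ stErr;    % quick significance check
```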


## Time series model estimation

Time series model functions (Kevin Sheppard Toolbox)

[B, LL, e, stErr, diagnostics, VCVr, VCV, lik] =
    armaxfilter(y,constant,pLags,qLags)   (KS), continued...

- B: coefficient estimates;
- LL: log-likelihood;
- e: residuals;
- stErr: non-robust standard errors;
- diagnostics: structure with diagnostics;
  WARNING: the AIC and BIC formulas differ from those in
  class, so compute your own AIC and BIC!
- VCVr: heteroskedasticity-robust variance-covariance matrix;
- VCV: non-robust VCV matrix;
- lik: likelihood of each observation.


## Time series model estimation

Numerical optimization

## Remember the following when you run estimations in Matlab:

The optimization is done numerically: Matlab starts with a guess
for the estimates, marginally adjusts these values depending on the
derivatives, and then iterates this step until convergence, defined by
the tolerance level.
To prevent infinite loops, there is a limit for the number of times
the iteration happens.
So in estimations that do not converge well, you may get a warning
that the loop has been iterated the maximum number of times and has
stopped, even though the error in the results is still above the
tolerance level.
This means that the results are not very reliable, and you should note
this in your research and assignments.


## Time series model estimation

Interpreting the results

In general, the same interpretation points as given in Session 3 apply (e.g.
economic intuition? significant coefficients? etc.).
Some differences:
Use AIC and BIC for model comparison:

    AIC = −2·log(L)/T + 2·p/T,
    BIC = −2·log(L)/T + p·log(T)/T.

- p is the number of parameters estimated: the number of coefficients in
  the model (including the intercept) AND the variance of the error
  terms!
- BIC imposes a stricter penalty for extra parameters than AIC.
- Choose the model with the smallest AIC/BIC.
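Since the toolbox's own AIC/BIC differ from the class formulas, compute them yourself from the armaxfilter output. A minimal sketch (B and LL as returned by armaxfilter, T the sample size):

```matlab
% p counts the estimated coefficients plus one for the error variance.
p   = numel(B) + 1;
AIC = -2*LL/T + 2*p/T;
BIC = -2*LL/T + p*log(T)/T;
```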


## Time series model estimation

Interpreting the results

Parameter interpretation:
- AR parameters:
  - PAC;
  - for returns/growth: the coefficient estimate is the percentage change in
    the current value due to a 1% change in the past value.
- MA parameters:
  - persistence of past shocks;
  - for returns/growth: the coefficient estimate is the percentage change in
    the current value due to a 1% change in the past shock.


## Testing

What to keep in mind?

Along with all the usual hypothesis testing (coefficient significance,
etc.), we are specifically interested in whether the time series are
autocorrelated (before running the ARMA) and whether the model
errors are autocorrelated (after running the ARMA).
ACF and PACF plots provide a visual aid, but we can also perform a
statistical test for the presence of autocorrelation.
- The Portmanteau tests are a collection of tests (there are different
  variations) of the null hypothesis of no autocorrelation in the
  time series.
- The Ljung-Box test is one such test, which we will use in class:
  - H0: ρ_k = 0 for all k ∈ {1, ..., K};
  - H1: ρ_k ≠ 0 for some k ∈ {1, ..., K};
  - test statistic: Q = T(T + 2) Σ_{k=1}^{K} ρ̂_k² / (T − k);
  - under H0, Q ~ χ²_K. Reject H0 if Q > χ²_{K,1−α}.

NOTE: Do not use the Ljung-Box test function from the Kevin
Sheppard toolbox; it performs a different test.


## Testing

Ljung-Box test

LBresult = LjungBoxTest(X,K,alpha)   (D)

- Performs the Ljung-Box test of order K for data in column vector X at
  significance level alpha.
- The output is a dataset indicating the Q-statistic, the critical value
  χ²_{K,1−α}, and the p-value.
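To see what such a function computes, the Q-statistic from the previous slide can be sketched by hand (X, K, and alpha as above; chi2inv requires the Statistics Toolbox):

```matlab
T   = numel(X);
Xc  = X - mean(X);                 % demeaned series
rho = zeros(K, 1);                 % sample autocorrelations
for k = 1:K
    rho(k) = sum(Xc(1+k:end) .* Xc(1:end-k)) / sum(Xc.^2);
end
Q      = T*(T + 2) * sum(rho.^2 ./ (T - (1:K)'));
crit   = chi2inv(1 - alpha, K);    % critical value chi2_{K,1-alpha}
reject = Q > crit;                 % reject H0 of no autocorrelation?
```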


## Testing

Joint coefficient hypothesis testing

Use the LRtest(.) function for the likelihood ratio test (presented in
the Session 4 slides).


## Forecasting

What to keep in mind?

We can forecast once the residuals contain no autocorrelation.
The fact that you have filtered out autocorrelation from the series
(linear dependence) generally does not mean that there is no more
information contained in past values about the future ones
(non-linear dependence); that is next term's course.
Do not forget what it is you are forecasting!

Example
We have T observations of a stock price, p.
We estimated an AR(1) model for stock returns, r_t = log(p_{t+1}) − log(p_t):
    r_t = α + β₁·r_{t−1} + ε_t.
Forecast for the price in period T + 1:
1. r̂_T = α̂ + β̂₁·r_{T−1};
2. p̂_{T+1} = e^{r̂_T + log(p_T)} = p_T·e^{r̂_T}.
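The two-step forecast in the example can be sketched as follows (alphaHat and betaHat stand for the estimated AR(1) parameters; r is the return series and p the price series; all names are illustrative):

```matlab
rHat      = alphaHat + betaHat*r(end);   % step 1: forecast the return
pForecast = p(end) * exp(rHat);          % step 2: convert back to a price
```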
