Anindya Chakraborty
Instructor: Prof. Mandira Sharma
CITD, SIS, JNU
The series that we have generated are MA(2), AR(1), AR(2), and ARMA(1,1). We know that an MA process is always stationary. For the AR processes to be stationary, we have chosen coefficients that are less than unity in absolute value, and the coefficients of the MA process are likewise taken to be less than unity. For the ARMA process to be both stationary and invertible, both of its coefficients are taken to be less than unity. The series generated in Stata are given in the datasheet at the end, and the commands used to generate them are given in the list of commands. The coefficients used in the MA(2) process are 0.5 and 0.25, and the same values are used for the AR(2) process. In the AR(1) process the single coefficient is 0.5, and in the ARMA(1,1) process both coefficients are 0.5.
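Written out with these coefficient values (and with et denoting the standard-normal white noise described below), the simulated processes are:

MA(2):      X(t) = e(t) + 0.5*e(t-1) + 0.25*e(t-2)
AR(1):      X(t) = 0.5*X(t-1) + e(t)
AR(2):      X(t) = 0.5*X(t-1) + 0.25*X(t-2) + e(t)
ARMA(1,1):  X(t) = 0.5*X(t-1) + 0.5*e(t-1) + e(t)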
We have also generated the correlogram for each series, which shows the correlation between the variable and its lagged values. For the MA(2) process we have derived the autocorrelation function (ACF) for the correlogram diagram, and for the AR(1), AR(2) and ARMA(1,1) processes we have derived the partial autocorrelation function (PACF). The graphs are attached below.
The behaviours displayed by the correlograms are consistent with the underlying theoretical processes. The correlogram diagrams are derived for 15 lags.
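To see what this consistency means concretely, the theoretical ACF of the MA(2) process can be worked out from the chosen coefficients (theta1 = 0.5, theta2 = 0.25); this is a standard result, and the numbers below are the implied benchmark values, not estimates from our sample:

rho(1) = (theta1 + theta1*theta2) / (1 + theta1^2 + theta2^2) = 0.625 / 1.3125 ≈ 0.48
rho(2) = theta2 / (1 + theta1^2 + theta2^2) = 0.25 / 1.3125 ≈ 0.19
rho(k) = 0 for k > 2

Analogously, the PACF of the AR(1) process should be non-zero only at lag 1 (value 0.5), that of the AR(2) process only at lags 1 and 2, while the PACF of the ARMA(1,1) process should tail off gradually rather than cut off.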
First, we show the white noise term et and the correlogram diagrams as follows:
[Figure: white noise series, et ~ N(0,1)]
[Figures: correlogram diagrams for the MA(2), AR(1), AR(2) and ARMA(1,1) processes]
These are the correlogram diagrams for the given processes. The AR processes are stationary, and the ARMA process is stationary and invertible. The correlograms show that the correlations up to the order of the process are significant, i.e., they lie outside the shaded region. The later correlations, however, fall within the shaded region and are close to 0, implying that they are negligible. The shaded region marks the 95% confidence band around 0 under the null hypothesis that the true correlation is 0: correlations inside the band are not significantly different from 0, while correlations outside it lead to rejection of that null at the 5% level of significance.
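For reference, under the null of no correlation the approximate 95% band has half-width 1.96/sqrt(T); with T = 1000 observations this is 1.96/sqrt(1000) ≈ 0.06. (To our reading of Stata's defaults, the band in the ACF plot is widened somewhat at longer lags using Bartlett's formula.) So sample correlations of the generated series need to exceed roughly 0.06 in absolute value before they show up as significant in these diagrams.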
Hence the series generated by us, and the correlograms derived from them, show behaviour that is consistent with the theory.
Next, we show the data series generated in Stata for MA(2), AR(1), AR(2) and
ARMA(1,1) and also the white noise series et.
et | Xt (MA(2)) | Xt (AR(1)) | Xt (AR(2)) | Xt (ARMA(1,1))
These are the series generated in Stata for white noise, MA(2) process, AR(1)
process, AR(2) process and ARMA(1,1) process for 1000 observations.
Next we give the set of commands used in Stata to generate the series.
. gen t=_n
. tsset t
time variable: t, 1 to 1000
delta: 1 unit
. gen et=rnormal(0,1)
. line et t
. local theta1=0.5
. local theta2=0.25
. gen Xt=.
(1000 missing values generated)
. replace Xt=et in 1
(1 real change made)
. replace Xt=`theta1'*et[2-1]+et in 2
(1 real change made)
. forvalues i=3(1)1000 {
2. quietly replace Xt=`theta1'*et[`i'-1]+`theta2'*et[`i'-2]+et in `i'
3. }
. ac Xt, lags(20)
.
. gen t=_n
. tsset t
time variable: t, 1 to 1000
delta: 1 unit
. gen et=rnormal(0,1)
.
. gen Xt=.
(1000 missing values generated)
. local phi=0.5
. replace Xt=et in 1
(1 real change made)
. forvalues i=2(1)1000 {
2. quietly replace Xt=`phi'*Xt[`i'-1]+et in `i'
3. }
. pac Xt, lags(20)
. gen t=_n
.
. tsset t
time variable: t, 1 to 1000
delta: 1 unit
.
. gen et=rnormal(0,1)
.
. gen Xt=.
(1000 missing values generated)
. local phi1=0.5
. local phi2=0.25
. replace Xt=et in 1
(1 real change made)
. replace Xt=`phi1'*Xt[2-1]+et in 2
(1 real change made)
. forvalues i=3(1)1000 {
2. quietly replace Xt=`phi1'*Xt[`i'-1]+`phi2'*Xt[`i'-2]+et in `i'
3. }
. pac Xt, lags(20)
. gen t=_n
.
. tsset t
time variable: t, 1 to 1000
delta: 1 unit
.
. gen et=rnormal(0,1)
.
. gen Xt=.
(1000 missing values generated)
.
. local phi1=0.5
.
. local theta1=0.5
.
. replace Xt=et in 1
(1 real change made)
.
. forvalues i=2(1)1000 {
2. quietly replace Xt=`phi1'*Xt[`i'-1]+`theta1'*et[`i'-1]+et in `i'
3. }
.
. pac Xt, lags(20)
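A remark on the commands above: because the data are tsset, the MA(2) series could equivalently be built in a single line with Stata's time-series lag operators instead of the observation-by-observation loop (the AR and ARMA series are recursive, so they do need the loop). A minimal sketch, using the same coefficient values; the variable name Xt_ma2 is only illustrative, and the first two observations are left missing here rather than initialized as in the loop:

* One-line construction of the MA(2) series using lag operators.
* L.et and L2.et are the first and second lags of et (requires tsset data).
local theta1=0.5
local theta2=0.25
gen Xt_ma2 = et + `theta1'*L.et + `theta2'*L2.et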