
# 2.2 Time Series Analysis

## The Estimator

Let $(x_1, \ldots, x_T)$ represent a realization of an ergodic weakly stationary process $\{X_t\}$ at $T$ consecutive times, $(x'_1, \ldots, x'_T)$ the time series of deviations from the sample mean, and $\{X'_t\}$ the corresponding process. The auto-covariance and auto-correlation functions are estimated using

$$c(\tau) = \frac{1}{T} \sum_{t=|\tau|+1}^{T} x'_{t-|\tau|}\, x'_t, \qquad r(\tau) = c(\tau)/c(0)$$
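As an illustration, the estimator above can be sketched in a few lines of Python (the function name `acf_biased` and the use of NumPy are my own choices, not part of the original notes):

```python
import numpy as np

def acf_biased(x, max_lag):
    """Estimate c(tau) and r(tau) = c(tau)/c(0) with the 1/T normalization above."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    xp = x - x.mean()                       # deviations from the sample mean
    c = np.array([np.dot(xp[k:], xp[:T - k]) / T for k in range(max_lag + 1)])
    return c, c / c[0]

rng = np.random.default_rng(0)
c, r = acf_biased(rng.standard_normal(200), 5)
```

Note that the divisor is T, not T − |τ|: this keeps the estimated auto-covariance sequence positive semidefinite and guarantees |r(τ)| ≤ 1.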

## The Bias of the Estimator of the Auto-correlation Function (Kendall)

- If $X_t$ is a white noise: $B(r(\tau)) \approx -\dfrac{1}{T}$
- If $X_t$ is an AR(1) process with parameter $\alpha_1$:

$$B(r(1)) \approx -\frac{1}{T}\left(1 + 4\alpha_1\right)$$

$$B(r(\tau)) \approx -\frac{1}{T}\left( \frac{1+\alpha_1}{1-\alpha_1}\left(1-\alpha_1^{|\tau|}\right) + 3|\tau|\,\alpha_1^{|\tau|} \right), \quad |\tau| > 1$$

Negative bias!

## Example: Bias

One thousand samples of length T = 15, 30, 60 and 120 were generated from AR(1) processes with parameter α₁ = 0.3, 0.6 and 0.9. The samples are used to estimate the auto-correlation function at lag 1 and lag 10. One finds

- that the bias is negative
- that the bias decreases slowly with increasing sample size
- that the bias is large when τ is large relative to T

The lag-1 correlation:

| α₁ | ρ₁ | T = 15 | T = 30 | T = 60 | T = 120 |
|-----|-----|--------|--------|--------|---------|
| 0.3 | 0.3 | 0.16 | 0.23 | 0.27 | 0.28 |
| 0.6 | 0.6 | 0.36 | 0.47 | 0.54 | 0.57 |
| 0.9 | 0.9 | 0.54 | 0.72 | 0.81 | 0.86 |

The lag-10 correlation:

| α₁ | ρ₁₀ | T = 15 | T = 30 | T = 60 | T = 120 |
|-----|------|--------|--------|--------|---------|
| 0.3 | 0.00 | −0.06 | −0.04 | −0.03 | −0.01 |
| 0.6 | 0.01 | −0.07 | −0.08 | −0.05 | −0.02 |
| 0.9 | 0.35 | −0.15 | −0.14 | 0.02 | 0.18 |
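The experiment can be repeated in a short Monte Carlo sketch (the seed, the helper names and the single case α₁ = 0.6, T = 30 are my own illustrative choices):

```python
import numpy as np

def ar1_series(alpha, T, rng, burn=100):
    """One realization of the AR(1) process X_t = alpha * X_{t-1} + Z_t."""
    z = rng.standard_normal(T + burn)
    x = np.zeros(T + burn)
    for t in range(1, T + burn):
        x[t] = alpha * x[t - 1] + z[t]
    return x[burn:]          # drop the burn-in so the series is near-stationary

def lag1_corr(x):
    """r(1) with the 1/T normalization."""
    xp = x - x.mean()
    return np.dot(xp[1:], xp[:-1]) / np.dot(xp, xp)

rng = np.random.default_rng(1)
# 1000 samples of length T = 30 from an AR(1) with alpha_1 = 0.6 (true rho_1 = 0.6)
mean_r1 = np.mean([lag1_corr(ar1_series(0.6, 30, rng)) for _ in range(1000)])
```

`mean_r1` comes out well below the true value 0.6, in line with the ≈0.47 entry in the table.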

## It can be shown (Bartlett) that for a stationary normal process

$$\mathrm{Var}(r(\tau)) \approx \frac{1}{T} \sum_{l=-\infty}^{\infty} \left( \rho^2(l) + \rho(l+\tau)\rho(l-\tau) - 4\rho(\tau)\rho(l)\rho(l-\tau) + 2\rho^2(l)\rho^2(\tau) \right)$$

Thus, if there exists a p such that ρ(τ) = 0 for τ greater than p, then

$$\mathrm{Var}(r(\tau)) \approx \frac{1}{T}\left( 1 + 2\sum_{l=1}^{p} \rho^2(l) \right), \quad \tau > p$$

Implication:
The above result can be used to assess the likelihood that ρ(τ) = 0:
1. Assume ρ(l) = 0 for l ≥ τ
2. Substitute r(l), 1 ≤ l < τ, into the above equation to obtain an estimate $\hat\sigma^2_{r(\tau)}$ of Var(r(τ))
3. The variable $Z = r(\tau)/\hat\sigma_{r(\tau)}$ is then approximately standard normally distributed

The probability that |r(τ)| is smaller than $1.96\,\hat\sigma_{r(\tau)}$ is then 95%. Such values of r(τ) will be frequently observed, even when the true correlation ρ(τ) is zero.
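A minimal sketch of this three-step recipe (function names are mine; the variance formula is the truncated Bartlett expression above):

```python
import numpy as np

def sample_acf(x, max_lag):
    """r(tau) with the 1/T normalization."""
    xp = np.asarray(x, dtype=float) - np.mean(x)
    T = len(xp)
    c = np.array([np.dot(xp[k:], xp[:T - k]) / T for k in range(max_lag + 1)])
    return c / c[0]

def acf_z_statistic(x, tau):
    """Z = r(tau)/sigma_hat, assuming rho(l) = 0 for l >= tau (steps 1-3 above)."""
    T = len(x)
    r = sample_acf(x, tau)
    var = (1.0 + 2.0 * np.sum(r[1:tau] ** 2)) / T   # Bartlett variance estimate
    return r[tau] / np.sqrt(var)

rng = np.random.default_rng(2)
z = acf_z_statistic(rng.standard_normal(500), 3)    # white noise: true rho(3) = 0
```

Under the null hypothesis, |z| stays below 1.96 about 95% of the time.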

## The Cross-covariance of the Estimator of the Auto-correlation Function

It can be shown (Bartlett; Box and Jenkins) that

$$\mathrm{Cov}(r(\tau), r(\tau+\delta)) \approx \frac{1}{T} \sum_{l=-\infty}^{\infty} \rho(l)\,\rho(l+\delta)$$

and for an AR(1) process with parameter α₁ > 0, at large lags τ,

$$\mathrm{Cov}(r(\tau), r(\tau+\delta)) \approx \alpha_1^{\delta}$$

The correlations between the auto-correlation function estimates are roughly similar to those of the process itself. Consequently, when the process is persistent, the estimated auto-correlation function will vary slowly around zero, even when the real auto-correlation function has decayed to zero. Be careful with the interpretation!

## Estimated auto-correlation functions computed from time series of length 240 generated by an AR(1) process with α₁ = 0.9

The horizontal dashed lines at $\pm 1.96\,\hat\sigma_{r(\tau)}$ indicate approximate critical values for testing, at the 5% significance level, the null hypothesis that ρ(τ) = 0 for all τ.

## Non-parametric Estimation of the Spectrum

Non-parametric estimates of the spectrum are based on the periodogram.
Let {x₁,…,x_T} be a time series. Then x_t with t = 1,…,T can be expanded in terms of sine and cosine functions as

$$x_t = a_0 + \sum_{j=1}^{q} \left( a_j \cos(2\pi\omega_j t) + b_j \sin(2\pi\omega_j t) \right), \quad \text{with}$$

$$\omega_j = j/T, \quad j = 1, \ldots, q, \quad q = \left\lfloor \tfrac{T}{2} \right\rfloor \;\text{(the largest integer contained in } T/2\text{), and}$$

$$a_0 = \frac{1}{T}\sum_{t=1}^{T} x_t, \quad a_j = \frac{2}{T}\sum_{t=1}^{T} x_t \cos(2\pi\omega_j t), \quad b_j = \frac{2}{T}\sum_{t=1}^{T} x_t \sin(2\pi\omega_j t), \quad j = 1, \ldots, q.$$

Note that, for even T,

$$a_q = \frac{1}{T}\sum_{t=1}^{T} (-1)^t x_t, \qquad b_q = 0$$

## The periodogram is defined in terms of the coefficients a_j and b_j as

$$I_{Tj} = \frac{T}{4}\left( a_j^2 + b_j^2 \right), \quad \text{with}$$

$$a_j = a_{-j}, \quad b_j = b_{-j}, \quad \text{and} \quad j = -\left\lfloor \frac{T-1}{2} \right\rfloor, \ldots, \left\lfloor \frac{T}{2} \right\rfloor$$

## Non-Statistical Properties of the Periodogram

- As with other estimators, the periodogram is a function of the sample
- Unlike most estimators, the periodogram exhibits an extremely strong dependence on the sample

This is because the Fourier analysis, which leads to the calculation of the a_j and b_j and hence I_Tj, provides a perfect representation of the sample, in the form of a coordinate transformation in which the number of coordinates, a_j and b_j, is exactly T.

- $I_{T0} = (T/4)\,a_0^2$ equals T/4 times the square of the sample mean
- The periodogram distributes the sample variance:

$$\widehat{\mathrm{Var}}(X_t) = \frac{2}{T}\sum_{j=1}^{q} I_{Tj}, \quad \text{when } T \text{ is odd}$$

$$\widehat{\mathrm{Var}}(X_t) = \frac{2}{T}\sum_{j=1}^{q-1} I_{Tj} + \frac{1}{T} I_{Tq}, \quad \text{when } T \text{ is even}$$

- The periodogram carries the same information as the sample auto-covariance function:

$$I_{Tj} = \sum_{\tau=-(T-1)}^{T-1} c(\tau)\, e^{-2\pi i \omega_j \tau}$$
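The variance decomposition can be checked numerically. The direct-sum implementation below follows the definitions of a_j, b_j and I_Tj literally (an FFT would be faster, but this form mirrors the formulas; all names are mine):

```python
import numpy as np

def periodogram(x):
    """I_Tj = (T/4)(a_j^2 + b_j^2), j = 1..q, from the Fourier coefficients above."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    t = np.arange(1, T + 1)
    I = []
    for j in range(1, T // 2 + 1):
        w = j / T
        a = 2.0 / T * np.sum(x * np.cos(2 * np.pi * w * t))
        b = 2.0 / T * np.sum(x * np.sin(2 * np.pi * w * t))
        I.append(T / 4.0 * (a * a + b * b))
    return np.array(I)

rng = np.random.default_rng(3)
x = rng.standard_normal(101)                 # odd T
I = periodogram(x)
var_from_I = 2.0 / len(x) * I.sum()          # should equal the sample variance
```

For odd T the identity is exact (up to floating-point rounding): `var_from_I` matches `np.var(x)`.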

## Statistical Properties (i.e. Bias and Variance) of the Periodogram

These properties are related to the covariance of the Fourier coefficients $z_{Tj}$ with

$$z_{Tj} = a_j - i b_j = \frac{2}{T}\sum_{t=1}^{T} x_t\, e^{-2\pi i \omega_j t}, \qquad I_{Tj} = \frac{T}{4}\,|z_{Tj}|^2$$

Assume that x₁,…,x_T come from a zero-mean ergodic weakly stationary process. The covariance of the $Z_{Tj}$ can be written as

$$
\begin{aligned}
E\left( Z_{Tj} Z_{Tk}^* \right)
&= \frac{4}{T^2}\, E\left( \sum_t X_t\, e^{-2\pi i \omega_j t} \times \sum_s X_s\, e^{2\pi i \omega_k s} \right) \\
&= \frac{4}{T^2} \sum_t \sum_s E(X_t X_s)\, e^{-2\pi i (\omega_j t - \omega_k s)} \\
&= \frac{4}{T^2} \sum_t \sum_s \gamma(t-s)\, e^{-2\pi i (\omega_j t - \omega_k s)} \\
&= \frac{4}{T^2} \sum_t \sum_s \int_{-1/2}^{1/2} \Gamma(\omega)\, e^{2\pi i \omega (t-s)}\, e^{-2\pi i (\omega_j t - \omega_k s)}\, d\omega \\
&= \frac{4}{T^2} \int_{-1/2}^{1/2} \Gamma(\omega)\, H_T(\omega-\omega_j)\, e^{(T+1)\pi i (\omega-\omega_j)} \times H_T(\omega-\omega_k)\, e^{-(T+1)\pi i (\omega-\omega_k)}\, d\omega \\
&\approx \frac{4}{T^2}\, \Gamma\!\left( \frac{\omega_j+\omega_k}{2} \right) e^{(T+1)\pi i (\omega_k-\omega_j)} \int_{-1/2}^{1/2} H_T(\omega-\omega_j)\, H_T(\omega-\omega_k)\, d\omega \quad \text{for large } T \\
&= \begin{cases} 0 & \text{for } j \neq k \\[4pt] \dfrac{4}{T}\,\Gamma(\omega_j) & \text{for } j = k \end{cases}
\end{aligned}
$$

[Figure: $H_T(\omega)$ for T = 10 and T = 50.] As T increases, $H_T$ develops into a function with a narrow central spike of height T and width 1/T, and with side lobes separated by zeros.

## Statistical Properties of the Periodogram: Bias of the Periodogram

The covariance of the Fourier coefficients $Z_{Tj}$ suggests

$$E\left( |Z_{Tj}|^2 \right) \approx \frac{4}{T^2}\, \Gamma(\omega_j) \int_{-1/2}^{1/2} H_T(\omega-\omega_j)^2\, d\omega = \frac{4}{T}\,\Gamma(\omega_j)$$

Thus, the j-th periodogram ordinate $I_{Tj}$ is an asymptotically unbiased estimator of the spectral density at frequency $\omega_j$.

The periodogram therefore appears to be a reasonable estimator of the spectrum, since it is asymptotically unbiased (an advantage over the estimator of the auto-covariance function). But it has a large variance…

## Statistical Properties of the Periodogram: Distribution of the Periodogram

When {X_t} is an ergodic and weakly stationary process, the periodogram ordinates are asymptotically proportional to independent χ²(2) random variables. It can be shown (Brockwell and Davis) that

$$I_{Tj} \sim \begin{cases} \Gamma(0)\,\chi^2(1) & j = 0 \\[4pt] \dfrac{\Gamma(\omega_j)}{2}\,\chi^2(2) & 1 \le j \le \lfloor (T-1)/2 \rfloor \\[4pt] \Gamma(1/2)\,\chi^2(1) & j = \dfrac{T}{2} \text{ if } T \text{ is even} \end{cases}$$

The sign ~ indicates that the left side is a random variable with the distribution given on the right-hand side.

The variance of the periodogram is

$$\mathrm{Var}(I_{Tj}) = \begin{cases} 2\,\Gamma(0)^2 & j = 0 \\[4pt] \Gamma(\omega_j)^2 & 1 \le j \le \lfloor (T-1)/2 \rfloor \\[4pt] 2\,\Gamma(1/2)^2 & j = \dfrac{T}{2} \text{ if } T \text{ is even} \end{cases}$$

Thus, the periodogram lacks both efficiency and consistency: its variance does not decrease as the sample length T grows.

## Examples demonstrating the variance of the periodogram

Example I: The periodogram of a white noise time series of length T = 120 and 240.

Example II: The periodogram of a time series of length T = 240 generated by the AR(2) process with (α₁, α₂) = (0.9, −0.8), plotted on the decibel scale, i.e. ω_j versus 10 log₁₀(I_Tj).

[Figures: periodograms with the true spectrum overlaid.]

## While the periodogram is an asymptotically unbiased estimate of the spectral density, it can be biased for finite samples if the spectrum is not very smooth or contains a line spectrum

Example: the periodogram of a time series of length 240 generated from the process Y_t = X_t + 10 sin(2π·0.3162·t + 0.5763), where X_t is an AR(2) process with (α₁, α₂) = (0.3, 0.3). The true spectrum (dashed line) has a spectral line at ω = 0.3162.

This is because the periodogram is an unbiased estimator of the convolution of the true spectrum Γ(ω) with the square of the Fourier transform H_T of the data window:

$$E\left( |Z_{Tj}|^2 \right) \approx \frac{4}{T^2} \int_{-1/2}^{1/2} H_T(\omega-\omega_j)^2\, \Gamma(\omega)\, d\omega$$

The side-lobe structure of H_T results from the data window and can cause substantial variance leakage through the side lobes.

In the time domain, the observed time series can be thought of as the product of the infinite series and a data window

$$h_t = \begin{cases} 1 & \text{if } 1 \le t \le T \\ 0 & \text{otherwise} \end{cases} \qquad \text{whose Fourier transform is } H_T(\omega)$$

To reduce the variance leakage, this window is often replaced by one of the following tapers.

## Frequently used data windows (tapers)

- The Hanning or cosine bell
- The split cosine bell, where the number of non-unit weights is 2m

Characteristics of some popular data tapers:
Left: The box car (solid), Hanning (dots) and split cosine bell (dashed) data windows for a time series of length T = 50. The split cosine bell uses m = T/4, so that 25% of the data are tapered at each end of the time series.
Right: The corresponding window functions.
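A sketch of the split cosine bell (this parameterization, with m tapered points at each end, is one common convention; the implementation details are my own):

```python
import numpy as np

def split_cosine_bell(T, m):
    """Data window: cosine ramps over the first and last m points, ones in between."""
    h = np.ones(T)
    ramp = 0.5 * (1.0 - np.cos(np.pi * (np.arange(m) + 0.5) / m))
    h[:m] = ramp
    h[T - m:] = ramp[::-1]
    return h

h = split_cosine_bell(50, 50 // 4)   # T = 50, m = T/4: 25% tapered at each end
```

The tapered series h_t·x_t replaces x_t before the periodogram is computed; a rescaling by 1/mean(h²) is commonly applied afterwards to restore the variance.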

## The periodograms of a time series of length 240 generated from the process Y_t = X_t + 10 sin(2π·0.3162·t + 0.5763), where X_t is an AR(2) process with (α₁, α₂) = (0.3, 0.3), computed after tapering with a split cosine bell with m = T/4 (top) and m = T/2 (bottom)

Costs to be paid:
- While the contamination of the periodogram from remote frequencies is reduced, information from adjacent frequencies tends to be smeared together, since smooth tapers have an H_T with a wider central peak
- While the asymptotic properties of the periodogram still hold, larger samples are needed to achieve distributional approximations of the same quality when the data are tapered

## Another Problem of Bias: The Estimator of Γ(0)

Γ(0) equals the total variance for a white noise process, or the low-frequency variance near the origin for short-memory processes.

$$I_{T,0} = \begin{cases} 0 & \text{if the sample mean is subtracted} \\[4pt] \dfrac{T}{4}\,\bar{X}^2 & \text{otherwise} \end{cases}$$

## Frequently Used Spectral Estimators (Periodogram-Derived Spectral Estimators)

The statistical properties of the periodogram (asymptotically unbiased but inconsistent and not efficient) suggest that one should use the following smoothed, rather than the raw, periodogram as spectral estimators:

- the 'chunk' spectral estimator
- the Daniell spectral estimator
- the Bartlett spectral estimator
- the Parzen spectral estimator

The smoothing acts to reduce the variance of the periodogram.

## The 'Chunk' Spectral Estimator I

The idea:
- divide the time series into m chunks of equal length M
- compute the periodogram of each chunk
- estimate the spectrum using the averaged periodogram

Note: The estimate at each frequency is representative of a spectral bandwidth of approximately 1/M.

The estimator can be made consistent and asymptotically unbiased when both m and M increase with increasing sample length:

$$\hat\Gamma(\omega_j) = \frac{1}{m}\sum_{l=1}^{m} I_{Mj}^{(l)} \sim \begin{cases} \dfrac{\Gamma(0)}{m}\,\chi^2(m) & j = 0 \\[4pt] \dfrac{\Gamma(\omega_j)}{2m}\,\chi^2(2m) & 1 \le j \le \lfloor (M-1)/2 \rfloor \\[4pt] \dfrac{\Gamma(1/2)}{m}\,\chi^2(m) & j = \dfrac{M}{2} \text{ if } M \text{ is even} \end{cases}$$

where $I_{Mj}^{(l)}$ is the periodogram of the l-th chunk. The variance goes to zero as 1/m goes to zero.
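A sketch of the chunk estimator (the FFT-based periodogram `|FFT_j|^2 / M` is equivalent to `(M/4)(a_j^2 + b_j^2)` in the convention used here; the names and the white-noise demo are my own):

```python
import numpy as np

def chunk_spectrum(x, m):
    """Average the periodograms of m non-overlapping chunks of length M = len(x)//m."""
    x = np.asarray(x, dtype=float)
    M = len(x) // m
    chunks = x[: m * M].reshape(m, M)
    # periodogram of each chunk for j = 1..floor(M/2)
    I = np.abs(np.fft.rfft(chunks, axis=1)[:, 1:]) ** 2 / M
    freqs = np.arange(1, I.shape[1] + 1) / M
    return freqs, I.mean(axis=0)

rng = np.random.default_rng(4)
x = rng.standard_normal(2400)            # unit-variance white noise: Gamma(w) = 1
freqs, G = chunk_spectrum(x, m=20)       # m = 20 chunks of length M = 120
```

For this white-noise input the estimate scatters tightly around the flat true spectrum Γ(ω) = 1.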

## The 'Chunk' Spectral Estimator II

The construction of an asymptotic $\tilde{p}\times 100\%$ confidence interval for Γ(ω_j):

Since asymptotically

$$\frac{2m\,\hat\Gamma(\omega_j)}{\Gamma(\omega_j)} \sim \chi^2(2m)$$

one has

$$\tilde{p} = P\left( a \le \frac{2m\,\hat\Gamma(\omega_j)}{\Gamma(\omega_j)} \le b \right) = P\left( \frac{2m\,\hat\Gamma(\omega_j)}{b} \le \Gamma(\omega_j) \le \frac{2m\,\hat\Gamma(\omega_j)}{a} \right)$$

where a and b are the $(1-\tilde{p})/2$ and $(1+\tilde{p})/2$ critical values of the χ²(2m) distribution, or, using the log representation,

$$\log\left(\frac{2m}{b}\right) + \log\left(\hat\Gamma(\omega_j)\right) \le \log\left(\Gamma(\omega_j)\right) \le \log\left(\frac{2m}{a}\right) + \log\left(\hat\Gamma(\omega_j)\right)$$

Remember the meaning of a confidence interval: for every 100 independent interval estimates made, the interval is expected to cover the true parameter $\tilde{p}\times 100$ times on average. The smaller the interval, the more accurate the estimate.
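The interval can be computed with nothing more than a normal quantile, using the Wilson-Hilferty approximation to the χ² critical values (this approximation and all names are my own choices; a statistics package's exact χ² quantiles would serve equally well):

```python
from statistics import NormalDist

def chi2_ppf(p, df):
    """Wilson-Hilferty approximation to the chi-square quantile (good for df >~ 5)."""
    z = NormalDist().inv_cdf(p)
    return df * (1.0 - 2.0 / (9.0 * df) + z * (2.0 / (9.0 * df)) ** 0.5) ** 3

def chunk_confidence_interval(G_hat, m, p=0.90):
    """p*100% interval for Gamma(w_j), using 2m * G_hat / Gamma ~ chi2(2m)."""
    a = chi2_ppf((1.0 - p) / 2.0, 2 * m)   # lower critical value
    b = chi2_ppf((1.0 + p) / 2.0, 2 * m)   # upper critical value
    return 2 * m * G_hat / b, 2 * m * G_hat / a

lo, hi = chunk_confidence_interval(1.0, m=20)   # 90% interval around an estimate of 1
```

Note that the interval is asymmetric about the estimate, which is why the log representation above is convenient for decibel plots.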

## The Daniell Spectral Estimator (the Simplest Smoothed Spectral Estimator)

- Idea: reduce the variance of the periodogram by taking a moving average of the periodogram ordinates
- Definition: for an odd integer n with 1 ≤ n ≤ q, the Daniell estimator is given by

$$\hat\Gamma(\omega_j) = \frac{1}{n} \sum_{k=j-(n-1)/2}^{j+(n-1)/2} I_{Tk}$$

When n is small relative to T, and when the spectral density function is smooth enough that it is roughly constant in every frequency interval of length n/T (the bandwidth), the Daniell estimator has the following properties:

1. $E\left(\hat\Gamma(\omega_j)\right) \approx \Gamma(\omega_j)$

2. $\mathrm{Var}\left(\hat\Gamma(\omega_j)\right) \approx \dfrac{1}{n}\,\Gamma(\omega_j)^2$

3. $\mathrm{Cov}\left(\hat\Gamma(\omega_j), \hat\Gamma(\omega_k)\right) \approx \begin{cases} \dfrac{n-|j-k|}{n^2}\,\Gamma(\omega_j)\,\Gamma(\omega_k) & |j-k| \le n \\ 0 & \text{otherwise} \end{cases}$

4. $\hat\Gamma(\omega_j) \sim \dfrac{\Gamma(\omega_j)}{2n}\,\chi^2(2n)$

The last property allows the construction of asymptotic confidence intervals. The critical values are obtained as in the case of the chunk estimator, replacing m by n.
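Property 2 (the 1/n variance reduction) is easy to see numerically. The sketch below applies the Daniell moving average to a raw FFT periodogram (scaling convention and names as in my earlier sketches):

```python
import numpy as np

def daniell(I, n):
    """Moving average of n periodogram ordinates (n odd); returned only where the
    full window fits, so the result is shorter by n - 1 ordinates."""
    kernel = np.ones(n) / n
    return np.convolve(np.asarray(I, dtype=float), kernel, mode="valid")

rng = np.random.default_rng(5)
x = rng.standard_normal(1200)
I = np.abs(np.fft.rfft(x)[1:]) ** 2 / len(x)   # raw periodogram; E(I_j) = 1 here
G = daniell(I, n=31)
```

The smoothed estimate keeps the same mean level but its scatter is reduced by roughly a factor of n.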

## Daniell estimates of the spectrum of the AR(2) process with (α₁, α₂) = (0.9, −0.8), plotted on the decibel scale

The cross in the upper right corner indicates the width of the 90% confidence interval (vertical bar) and the bandwidth (horizontal bar). The dashed curve displays the true spectrum; one panel shows the raw periodogram for comparison.

## Tradeoffs between bias and variance

A larger value of n leads to a stronger reduction of variance, but also to a wider frequency band over which the spectral density is averaged. The associated smoothing can introduce substantial bias.

## An Alternative Representation of the Daniell Spectral Estimator

The Daniell estimator can be re-expressed as the convolution between the periodogram and a spectral window

$$\hat\Gamma(\omega_j) = \sum_{k=-q}^{q} W_D(\omega_k - \omega_j; n, T)\, I_{Tk}$$

where the spectral window is given by

$$W_D(\omega_k - \omega_j; n, T) = \begin{cases} 1/n & \text{if } |\omega_k - \omega_j| \le n/2T \\ 0 & \text{otherwise} \end{cases}$$

or as the Fourier transform of the product of the estimated auto-covariance function c(τ) and a lag window,

$$\hat\Gamma(\omega_j) = \sum_{\tau=-(T-1)}^{T-1} w_D(\tau; n, T)\, c(\tau)\, e^{-2\pi i \omega_j \tau}, \qquad w_D(\tau; n, T) \approx \frac{\sin(\pi n \tau / T)}{\pi n \tau / T}$$

Other smoothed periodogram spectral estimators (e.g. the Bartlett and Parzen estimators) can be represented similarly, but with different spectral windows or lag windows.

Proof:

$$
\begin{aligned}
\hat\Gamma(\omega_j) &= \frac{1}{n} \sum_{k=j-(n-1)/2}^{j+(n-1)/2} I_{Tk} \\
&= \frac{1}{n} \sum_{k=j-(n-1)/2}^{j+(n-1)/2} \; \sum_{\tau=-(T-1)}^{T-1} c(\tau)\, e^{-2\pi i \omega_k \tau} \\
&= \sum_{\tau=-(T-1)}^{T-1} c(\tau)\, e^{-2\pi i \omega_j \tau}\; \frac{1}{n} \sum_{k=j-(n-1)/2}^{j+(n-1)/2} e^{-2\pi i (\omega_k - \omega_j)\tau} \\
&= \sum_{\tau=-(T-1)}^{T-1} w_D(\tau; n, T)\, c(\tau)\, e^{-2\pi i \omega_j \tau}
\end{aligned}
$$

with

$$w_D(\tau; n, T) = \frac{1}{n} \sum_{k=j-(n-1)/2}^{j+(n-1)/2} e^{-2\pi i (\omega_k - \omega_j)\tau} \approx \frac{T}{n} \int_{-n/2T}^{n/2T} e^{-2\pi i \omega \tau}\, d\omega = \frac{\sin(\pi n \tau / T)}{\pi n \tau / T}$$

## Summary of Various Spectral Estimators

The estimator can be expressed either in terms of a lag window w_D or a spectral window W_D, determined by a selectable number m, n or M:

$$\hat\Gamma(\omega_j) = \sum_{k=-q}^{q} W_D(\omega_k - \omega_j; n, T)\, I_{Tk} = \sum_{\tau=-(T-1)}^{T-1} w_D(\tau; n, T)\, c(\tau)\, e^{-2\pi i \omega_j \tau}$$

| | Chunk (m or M) | Daniell (n) | Bartlett (M) | Parzen (M) |
|---|---|---|---|---|
| Lag window | $\begin{cases}1 & \lvert\tau\rvert \le M-1^* \\ 0 & \text{otherwise}\end{cases}$ | $\dfrac{\sin(\pi n \tau / T)}{\pi n \tau / T}$ | $\begin{cases}1-\dfrac{\lvert\tau\rvert}{M} & \lvert\tau\rvert < M \\ 0 & \text{otherwise}\end{cases}$ | $\begin{cases}1-6(\lvert\tau\rvert/M)^2+6(\lvert\tau\rvert/M)^3 & \lvert\tau\rvert < M/2 \\ 2(1-\lvert\tau\rvert/M)^3 & M/2 \le \lvert\tau\rvert \le M \\ 0 & \text{otherwise}\end{cases}$ |
| Spectral window | $\begin{cases}1/m & \omega=\omega_j^* \\ 0 & \text{otherwise}\end{cases}$ | $\begin{cases}1/n & \lvert\omega\rvert \le n/2T \\ 0 & \text{otherwise}\end{cases}$ | $\dfrac{M}{T}\left(\dfrac{\sin(\pi\omega M)}{\pi\omega M}\right)^2$ | $\dfrac{3M}{4T}\left(\dfrac{\sin(\pi\omega M/2)}{\pi\omega M/2}\right)^4$ |

where m is the number of chunks used by the chunk estimator, M is either the chunk length or the cutoff point of the Bartlett or Parzen lag windows, T is the length of the time series, and n is the number of periodogram ordinates that are averaged to produce the Daniell estimator. * indicates that the window is applied to the average of the estimators of the auto-covariance function or spectrum computed from the individual chunks.
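The Bartlett and Parzen lag windows from the table, written out (the vectorized forms are my own):

```python
import numpy as np

def bartlett_lag_window(tau, M):
    """1 - |tau|/M for |tau| < M, else 0."""
    a = np.abs(np.asarray(tau, dtype=float)) / M
    return np.where(a < 1.0, 1.0 - a, 0.0)

def parzen_lag_window(tau, M):
    """Cubic taper: two polynomial pieces meeting at |tau| = M/2, zero beyond M."""
    a = np.abs(np.asarray(tau, dtype=float)) / M
    w = np.where(a < 0.5, 1.0 - 6.0 * a**2 + 6.0 * a**3, 2.0 * (1.0 - a) ** 3)
    return np.where(a <= 1.0, w, 0.0)

tau = np.arange(-10, 11)
wb = bartlett_lag_window(tau, 10)
wp = parzen_lag_window(tau, 10)
```

Both windows equal 1 at τ = 0 and decay to 0 at |τ| = M; the two Parzen pieces join continuously at |τ| = M/2 with the common value 1/4.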

## Graphic Summary of Various Spectral Estimators

The estimator can be expressed either in terms of a lag window w_D or a spectral window W_D:

$$\hat\Gamma(\omega_j) = \sum_{k=-q}^{q} W_D(\omega_k - \omega_j; n, T)\, I_{Tk} = \sum_{\tau=-(T-1)}^{T-1} w_D(\tau; n, T)\, c(\tau)\, e^{-2\pi i \omega_j \tau}$$

[Figure: the lag windows and spectral windows of the chunk, Bartlett, Parzen and Daniell estimators.]

## Equivalent Degrees of Freedom and Bandwidth of Smoothed Periodogram Spectral Estimators

The problem:
Unlike the Daniell estimator, which places equal weight on a fixed number of periodogram ordinates, the Bartlett and Parzen estimators do not weight the periodogram ordinates equally. Consequently, the distributional result for the Daniell estimator, i.e. being asymptotically χ²(2n) distributed, cannot be directly applied to the Bartlett and Parzen estimators to determine their bandwidth and degrees of freedom.

The solution:
- The equivalent degrees of freedom r is found by matching the asymptotic mean and variance of the spectral estimator with the mean and variance of a χ² random variable
- The equivalent bandwidth is r/2T
- Once the equivalent degrees of freedom is determined, confidence intervals can be computed as described earlier

## Various spectral estimates of the AR(2) process with (α₁, α₂) = (0.9, −0.8), plotted on the decibel scale

The cross in the upper right corner indicates the width of the 90% confidence interval (vertical bar) and the bandwidth (horizontal bar). The dashed curve displays the true spectrum.

Recommendation: a smoothed periodogram estimator should be used, with a slight preference for the Daniell and Parzen estimators over the Bartlett estimator.

## Bias, Variance and Variance Leakage

Be aware of the tradeoffs between bias and variance implied by the choice of estimator:

- Variance leakage: the contamination of the spectral estimate by contributions from periodogram ordinates at frequencies far removed from the frequency of interest. This problem is particularly severe for the Bartlett estimator, and is corrected in the Parzen estimator, whose spectral window has no side lobes
- The Parzen estimator has a somewhat lower variance than the Bartlett estimator for the same lag cutoff, since its spectral window has a wider central peak and thus a larger bandwidth
- For the same reason, the Parzen estimator has somewhat more bias


## Non-parametric Estimation of the Cross-correlation Function

The Estimator:
Let (x₁,…,x_T) and (y₁,…,y_T) represent realizations of two ergodic weakly stationary processes {X_t} and {Y_t} at T consecutive times, and (x'₁,…,x'_T) and (y'₁,…,y'_T) the time series of deviations from the sample means. The estimators of the cross-covariance and cross-correlation functions are

$$c_{xy}(\tau) = \begin{cases} \dfrac{1}{T} \displaystyle\sum_{t=1}^{T-\tau} x'_t\, y'_{t+\tau} & \tau \ge 0 \\[8pt] \dfrac{1}{T} \displaystyle\sum_{t=1+|\tau|}^{T} x'_t\, y'_{t+\tau} & \tau < 0 \\[8pt] 0 & |\tau| \ge T \end{cases}$$

$$r_{xy}(\tau) = \frac{c_{xy}(\tau)}{\left( c_{xx}(0)\, c_{yy}(0) \right)^{1/2}}$$

The types of problems that occur when estimating the auto-correlation function also occur when estimating the cross-correlation function.
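The case split in the definition translates directly into code (names and the lagged demo are mine):

```python
import numpy as np

def cross_cov(x, y, tau):
    """c_xy(tau) with the 1/T normalization of the definition above."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    T = len(x)
    if abs(tau) >= T:
        return 0.0
    if tau >= 0:
        return np.dot(x[: T - tau], y[tau:]) / T
    return np.dot(x[-tau:], y[: T + tau]) / T

def cross_corr(x, y, tau):
    return cross_cov(x, y, tau) / (cross_cov(x, x, 0) * cross_cov(y, y, 0)) ** 0.5

rng = np.random.default_rng(6)
x = rng.standard_normal(500)
y = np.roll(x, 3) + 0.1 * rng.standard_normal(500)   # y lags x by three steps
r3 = cross_corr(x, y, 3)   # near 1: x leads y by 3
r0 = cross_corr(x, y, 0)   # near 0
```

Unlike the auto-correlation function, r_xy(τ) is not symmetric in τ; the sign of the lag tells which series leads.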

## Estimating the Cross-spectrum

The estimation of the cross-spectrum between {X_t} and {Y_t} is based on the cross-periodogram

$$I_{xy,Tj} = \frac{T}{4}\, Z_{x,Tj}\, Z_{y,Tj}^*$$

- The cross-periodogram has the same properties as the periodogram, i.e. its variability cannot be reduced by taking larger and larger samples
- An additional difficulty is that the cross-periodogram produces degenerate coherency estimates. It can be shown that the coherency estimate equals

$$\hat\kappa_{xy}(\omega_j) = \frac{|I_{xy,Tj}|^2}{I_{xx,Tj}\, I_{yy,Tj}} = 1$$

One MUST therefore construct cross-spectral estimators that average across a number of nearly independent realizations of the cross-periodogram. The chunk and smoothed periodogram estimators do exactly this!

## Auto-Regressive Spectral Estimation

The idea:
- assume that the process is ergodic and weakly stationary
- fit an AR model of some order p, which is chosen either objectively by means of a criterion or subjectively
- estimate the spectrum with that of the fitted AR process, as described by the estimated process parameters and the estimated noise variance

The theoretical justification:
Any ergodic weakly stationary process can be approximated arbitrarily closely by an AR process.

- An AR model can provide a dynamical interpretation
- The spectral estimates are generally smoother than those made by smoothed periodogram estimators
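A sketch of this procedure via the Yule-Walker equations (the spectral convention matches the two-sided density used throughout these notes; all names are my own):

```python
import numpy as np

def yule_walker(x, p):
    """Fit an AR(p) model by solving the Yule-Walker equations with biased
    auto-covariance estimates; returns (alpha_1..alpha_p, noise variance)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    T = len(x)
    c = np.array([np.dot(x[k:], x[:T - k]) / T for k in range(p + 1)])
    R = np.array([[c[abs(i - j)] for j in range(p)] for i in range(p)])
    alpha = np.linalg.solve(R, c[1 : p + 1])
    sigma2 = c[0] - alpha @ c[1 : p + 1]
    return alpha, sigma2

def ar_spectrum(alpha, sigma2, freqs):
    """Gamma_hat(w) = sigma2 / |1 - sum_k alpha_k e^{-2 pi i k w}|^2."""
    k = np.arange(1, len(alpha) + 1)
    denom = np.abs(1.0 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ alpha) ** 2
    return sigma2 / denom

rng = np.random.default_rng(7)
z = rng.standard_normal(3000)
x = np.zeros(3000)
for t in range(1, 3000):
    x[t] = 0.9 * x[t - 1] + z[t]            # AR(1) with alpha_1 = 0.9
alpha, s2 = yule_walker(x, 1)
G = ar_spectrum(alpha, s2, np.array([0.0, 0.5]))   # red spectrum: peak at w = 0
```

For this persistent AR(1) process the fitted parameters land close to (0.9, 1), and the estimated spectrum is strongly red: much larger near ω = 0 than at the Nyquist frequency.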

## Maximum Entropy Spectral Estimation (a particular form of AR spectral estimation)

Suppose one has available estimated auto-covariances c(0),…,c(M). Then the maximum entropy spectral estimator $\hat\Gamma(\omega)$ is the non-negative function that maximizes the entropy

$$\int_{-1/2}^{1/2} \ln\left( \hat\Gamma(\omega) \right) d\omega$$

subject to the constraints

$$\int_{-1/2}^{1/2} \hat\Gamma(\omega)\, e^{2\pi i \omega \tau}\, d\omega = c(\tau)$$

for τ = 0,…,M.

It can be shown that the above two equations have a unique solution, which is given by an AR spectral estimator in which an AR model of order p = M is fitted.