
Parametric Method of Power Spectrum Estimation

Presented by: Duha Hassan
Power Spectrum of a Signal

 The power spectrum of a signal gives the distribution of the signal's power among the various frequencies. The power spectrum is the Fourier transform of the correlation function; it describes the characteristics of a time series in the frequency domain.
 The power spectrum therefore represents variance, or power, as a function of frequency and tells us how the energy of the process is distributed across frequencies.
Power Spectrum of a Signal (cont.)

 The task is to estimate the power spectrum from a given set of data.
 If the signal is random, then only an estimate of its spectrum can be obtained.
Why We Use Power Spectrum Estimation

 To estimate the spectral characteristics of signals characterized as random processes.
 To estimate spectra in the frequency domain when the signals are random in nature.
 The aim of a power spectral estimation method is to obtain an approximate estimate of the power spectral density of a given real random process.
Introduction to Estimation

 Estimation theory is concerned with the determination of the best estimate of an unknown parameter vector from an observation signal, or the recovery of a clean signal degraded by noise and distortion.
Parametric Modeling

Parametric modeling techniques find the parameters of a mathematical model describing a signal, system, or process. These techniques use known information about the system to determine the model.
Model-Based Power Spectrum Estimation

 Parametric, or model-based, techniques are based on the use of models for the data.
 Let us assume that the data are the output of a linear time-invariant system, with a given frequency response, driven by a white-noise input sequence.
 The assumption that the input has a flat spectrum implies that the power spectrum of the model output is shaped entirely by the frequency response of the model.
The input–output relation of a generalised discrete linear time-invariant model is given by

x(m) = \sum_{k=1}^{P} a_k x(m-k) + \sum_{k=0}^{Q} b_k e(m-k)

where x(m) is the model output, e(m) is the input, and the a_k and b_k are the parameters of the model. This is known as an autoregressive moving-average (ARMA) model.

 The system function H(z) of the discrete linear time-invariant model of this equation is given by

H(z) = \frac{B(z)}{A(z)} = \frac{\sum_{k=0}^{Q} b_k z^{-k}}{1 - \sum_{k=1}^{P} a_k z^{-k}}

where 1/A(z) and B(z) are the autoregressive and moving-average parts of H(z), respectively.
The power spectrum of the signal x(m) is given as the product of the power spectrum of the input signal and the squared magnitude frequency response of the model:

P_{XX}(f) = P_{EE}(f)\,\left| H(f) \right|^2

where H(f) is the frequency response of the model and P_EE(f) is the input power spectrum.
Model-Based Approach

 In the model-based approach, the estimation procedure consists of two steps:
Step 1: estimate the parameters {a_k} and {b_k} of the model.
Step 2: from these estimates, compute the power spectrum estimate (see the sketch below).
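The two steps can be illustrated with a minimal Python sketch (not part of the original slides) using an AR model: Step 1 solves the Yule-Walker equations for the {a_k}, and Step 2 evaluates the parametric spectrum. The function name ar_psd_estimate and the biased autocorrelation estimator are illustrative choices, assuming NumPy/SciPy.

```python
import numpy as np
from scipy.linalg import toeplitz

def ar_psd_estimate(x, order, nfft=512):
    """Two-step model-based PSD estimate using an AR model.
    Step 1: estimate the AR coefficients {a_k} from the data
            via the Yule-Walker (autocorrelation) equations.
    Step 2: evaluate the parametric spectrum from the estimates."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    N = len(x)
    # Biased autocorrelation estimates r(0) ... r(order)
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(order + 1)])
    # Step 1: solve the Yule-Walker normal equations R a = r
    R = toeplitz(r[:order])
    a = np.linalg.solve(R, r[1:order + 1])
    sigma2 = r[0] - np.dot(a, r[1:order + 1])   # prediction-error variance
    # Step 2: P(f) = sigma2 / |1 - sum_k a_k e^{-j 2 pi f k}|^2
    f = np.arange(nfft) / (2.0 * nfft)          # normalised frequencies [0, 0.5)
    A = 1.0 - np.exp(-2j * np.pi * np.outer(f, np.arange(1, order + 1))) @ a
    return f, sigma2 / np.abs(A) ** 2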
Parametric Method

 The first step is to select an appropriate model for the process.
 This selection is based upon:
 A-priori knowledge about how the process is generated.
 Experimental results indicating that a particular model "works well".
 The models used are:
 Autoregressive (AR) Model
 Moving Average (MA) Model
 Autoregressive Moving Average (ARMA) Model
Parametric Methods

 Once the model is selected, the next step is to estimate the model parameters from the given data.
 The final step is to estimate the power spectrum by incorporating the estimated parameters into the parametric form for the spectrum.
 Example: for an ARMA(p, q) model with a_p(k) and b_q(k) estimated, the spectrum estimate would be

\hat{P}_{XX}(f) = \hat{\sigma}_e^2 \, \frac{\left| \hat{B}_q(f) \right|^2}{\left| \hat{A}_p(f) \right|^2}

where \hat{B}_q(f) = \sum_{k=0}^{q} \hat{b}_q(k) e^{-j2\pi f k} and \hat{A}_p(f) = 1 - \sum_{k=1}^{p} \hat{a}_p(k) e^{-j2\pi f k}.
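A minimal Python sketch of this final evaluation step (not from the original slides), assuming NumPy/SciPy and that the coefficient estimates are already available; scipy.signal.freqz evaluates B(f)/A(f), with the denominator passed in SciPy's convention (a leading 1 followed by the negated a_p(k) values of the form above).

```python
import numpy as np
from scipy.signal import freqz

def arma_spectrum(b, a, sigma2_e, nfft=1024):
    """Evaluate the parametric ARMA(p, q) spectrum estimate
    P(f) = sigma2_e * |B(f)|^2 / |A(f)|^2 from estimated coefficients.
    b: numerator coefficients [b(0), ..., b(q)]
    a: denominator coefficients [1, a(1), ..., a(p)] in SciPy convention."""
    w, h = freqz(b, a, worN=nfft)      # frequency response B(f) / A(f)
    f = w / (2 * np.pi)                # normalised frequency, cycles/sample
    return f, sigma2_e * np.abs(h) ** 2

# Illustrative coefficient values, not taken from the slides
f, P_hat = arma_spectrum(b=[1.0, 0.5], a=[1.0, -0.9], sigma2_e=1.0)
```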
Power Spectrum Estimation (cont.)

 If the model is correct:
• high-quality spectrum estimates,
• significantly less data required.
 If the model is wrong:
• parametric methods can give wrong or misleading estimates.
Maximum-Entropy Spectral Estimation

Maximum-entropy spectral estimation is a method of spectral density estimation. The goal is to improve the spectral quality based on the principle of maximum entropy.

The maximum-entropy estimate is based on the principle that the estimate of the autocorrelation sequence must correspond to the most random signal whose correlation values in the range |m| ≤ P coincide with the measured values. The maximum-entropy principle is appealing because it assumes no more structure in the correlation sequence than that indicated by the measured data. In maximum-entropy modeling, probability distributions are created on the basis of what is known, leading to a type of statistical inference about the missing information which is called the maximum-entropy estimate.
The power spectrum of a stationary signal is defined as the Fourier transform of the autocorrelation sequence:

P_{XX}(f) = \sum_{m=-\infty}^{\infty} r_{xx}(m)\, e^{-j2\pi f m}

The maximum-entropy power spectrum may be expressed as

\hat{P}_{XX}(f) = \frac{\sigma_e^2}{\left| 1 - \sum_{k=1}^{P} a_k e^{-j2\pi f k} \right|^2}

This equation shows that the maximum-entropy power spectrum estimate is the power spectrum of an autoregressive (AR) model. It is obtained by maximizing the entropy of the power spectrum with respect to the unknown autocorrelation values. The known values of the autocorrelation function can then be used to obtain the coefficients of this AR model.
What is an AR Model?

 A model which depends only on the previous outputs of the system is called an autoregressive (AR) model.
 Note that:
The AR model is based on frequency-domain analysis, and the data should be windowed (we use the Hamming window).
The AR model has only poles, while the MA model has only zeros.
The model parameters are found by solving a set of linear equations obtained by minimizing the mean squared error.

The characteristic of this error is that it decreases as the order of the AR model is increased.

 Linear prediction models are often referred to as autoregressive (AR) processes.
 How well a signal can be predicted from its past samples depends on the autocorrelation function, or equivalently the bandwidth and the power spectrum, of the signal.
 A linear predictor model forecasts the amplitude of a signal at time m, x(m), using a linearly weighted combination of P past samples [x(m−1), x(m−2), ..., x(m−P)] as

\hat{x}(m) = \sum_{k=1}^{P} a_k\, x(m-k)

where
m: the discrete time index
x̂(m): the prediction of x(m)
a_k: the predictor coefficients
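As a tiny illustration (not from the slides), assuming NumPy and that the coefficients a_k are already known:

```python
import numpy as np

def predict_next(x_past, a):
    """One-step linear prediction:
    x_hat(m) = sum_{k=1..P} a[k-1] * x(m-k),
    where x_past = [x(m-1), x(m-2), ..., x(m-P)]."""
    return float(np.dot(a, x_past))

# Example with P = 2 and illustrative coefficient values
x_hat = predict_next(x_past=np.array([0.9, 0.4]), a=np.array([1.5, -0.7]))
```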
[Figure: a block-diagram implementation of the linear predictor equation]
 The error e(m), defined as the difference between the actual sample value x(m) and its predicted value x̂(m), is given by

e(m) = x(m) - \sum_{k=1}^{P} a_k\, x(m-k) = G\, u(m)

where
u(m): a zero-mean, unit-variance random signal
G: a gain term, equal to the square root of the variance of e(m):

G = \sqrt{E\left[ e^2(m) \right]}
 One of the most important considerations is the choice of the number of terms in the AR model; this is known as its order p.
 If the model order is too low, we obtain a highly smoothed spectrum.
 If the model order is too high, there is a risk of introducing spurious low-level peaks in the spectrum.

The power spectrum of an autoregressive process is given by

P_{XX}(f) = \frac{\sigma_e^2}{\left| 1 - \sum_{k=1}^{P} a_k e^{-j2\pi f k} \right|^2}

where e(m) is a random signal of variance σ_e².

The relation between the autocorrelation values and the AR model parameters is obtained by multiplying both sides of the AR model equation by x(m−j) and taking the expectation:

r_{xx}(j) = \sum_{k=1}^{P} a_k\, r_{xx}(j-k) + \sigma_e^2\, \delta(j), \qquad j \ge 0
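Writing this relation out for j = 1, ..., P (a standard step, not shown explicitly on the slide) gives the Yule-Walker normal equations in matrix form; this is the linear system solved in the earlier code sketch:

\begin{bmatrix}
r_{xx}(0) & r_{xx}(1) & \cdots & r_{xx}(P-1) \\
r_{xx}(1) & r_{xx}(0) & \cdots & r_{xx}(P-2) \\
\vdots & \vdots & \ddots & \vdots \\
r_{xx}(P-1) & r_{xx}(P-2) & \cdots & r_{xx}(0)
\end{bmatrix}
\begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_P \end{bmatrix}
=
\begin{bmatrix} r_{xx}(1) \\ r_{xx}(2) \\ \vdots \\ r_{xx}(P) \end{bmatrix},
\qquad
\sigma_e^2 = r_{xx}(0) - \sum_{k=1}^{P} a_k\, r_{xx}(k)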
Moving-Average Power Spectrum Estimation

A moving-average model is also known as an all-zero or finite impulse response (FIR) filter. A signal x(m), modeled as a moving-average process, is described as

x(m) = \sum_{k=0}^{Q} b_k\, e(m-k)

where e(m) is a zero-mean random input and Q is the model order.

The cross-correlation of the input and output of a moving-average process is given by

r_{xe}(j) = E\left[ x(m)\, e(m-j) \right] = \sigma_e^2\, b_j
The autocorrelation function of a moving-average process is

r_{xx}(j) = \sigma_e^2 \sum_{k=0}^{Q-|j|} b_k\, b_{k+|j|} \quad \text{for } |j| \le Q, \qquad r_{xx}(j) = 0 \quad \text{for } |j| > Q

and the power spectrum of a moving-average process may be obtained directly from the Fourier transform of the autocorrelation function as

P_{XX}(f) = \sigma_e^2 \left| \sum_{k=0}^{Q} b_k\, e^{-j2\pi f k} \right|^2
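A short Python sketch of this relation (illustrative, not from the slides), assuming NumPy: the autocorrelation of the MA coefficients is formed with np.correlate and its Fourier transform gives the power spectrum.

```python
import numpy as np

def ma_spectrum(b, sigma2_e, nfft=1024):
    """Power spectrum of an MA(Q) process x(m) = sum_k b[k] e(m-k),
    obtained as the Fourier transform of its finite autocorrelation
    sequence r(j) = sigma2_e * sum_k b[k] b[k+|j|], |j| <= Q."""
    b = np.asarray(b, dtype=float)
    r = sigma2_e * np.correlate(b, b, mode="full")   # lags -Q ... Q
    # Taking |FFT| removes the linear-phase factor from starting at lag -Q
    P = np.abs(np.fft.fft(r, nfft))
    f = np.arange(nfft) / nfft
    return f[: nfft // 2], P[: nfft // 2]

# Illustrative MA(2) coefficients (1, -2, 1), as in the example that follows
f, P = ma_spectrum(b=[1.0, -2.0, 1.0], sigma2_e=1.0)
```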
Example: Determine the mean and the autocorrelation of the sequence x(n) generated by the MA(2) process described by the difference equation

x(n) = w(n) - 2w(n-1) + w(n-2)

where w(n) is a white-noise process with variance σ_w².
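A worked solution (added here; it follows directly from the MA autocorrelation expression above, with coefficients (b_0, b_1, b_2) = (1, -2, 1)):

E[x(n)] = E[w(n)] - 2E[w(n-1)] + E[w(n-2)] = 0

\gamma_{xx}(m) = \sigma_w^2 \sum_{k} b_k\, b_{k+|m|}:
\gamma_{xx}(0) = \sigma_w^2 (1 + 4 + 1) = 6\sigma_w^2
\gamma_{xx}(\pm 1) = \sigma_w^2 \big( (1)(-2) + (-2)(1) \big) = -4\sigma_w^2
\gamma_{xx}(\pm 2) = \sigma_w^2 (1)(1) = \sigma_w^2
\gamma_{xx}(m) = 0 \quad \text{for } |m| > 2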
Autoregressive Moving Average (ARMA)

The ARMA, or pole–zero, model consists of two parts: an autoregressive (AR) part and a moving-average (MA) part. The relationship between the ARMA parameters and the autocorrelation sequence can be obtained by multiplying both sides of the ARMA equation by x(m−j) and taking the expectation. For lags j greater than the MA order Q, the moving-average part no longer contributes, giving

r_{xx}(j) = \sum_{k=1}^{P} a_k\, r_{xx}(j-k), \qquad j > Q

which can be used to obtain the coefficients a_k.
This process can be generated by filtering unit-variance white noise with a filter having both poles and zeros. The spectral estimate, i.e. the power spectrum of an autoregressive moving-average (ARMA) process, therefore has the form

P_{XX}(f) = \sigma_e^2 \, \frac{\left| B(f) \right|^2}{\left| A(f) \right|^2} = \sigma_e^2 \, \frac{\left| \sum_{k=0}^{Q} b_k e^{-j2\pi f k} \right|^2}{\left| 1 - \sum_{k=1}^{P} a_k e^{-j2\pi f k} \right|^2}

where σ_e² is the variance of the input of the ARMA model.
ARMA Model

 It is a tool for understanding and predicting future values of a time series.
 It is usually referred to as the ARMA(p, q) model, where p is the order of the autoregressive part and q is the order of the moving-average part.
ARMA Model (cont.)

 It requires fewer model parameters for spectrum estimation.
 This model is appropriate when the signal has been corrupted by noise.
Calculation of Model Parameters

 Consider a data sequence x(n) generated by an AR model.
 Let the output be corrupted by additive white noise.
 The Z-transform of the autocorrelation of the noisy signal can then be written in terms of the AR system function and the noise variance, which gives the relationship between the autocorrelation and the model parameters of an ARMA(p, q) process.

Matrix Representation

 For lags m > p + q, these relations can be collected into an overdetermined matrix set of linear equations in the AR parameters.
 Minimizing the squared error of this overdetermined system yields the least-squares estimate of the AR parameters.
 From these AR parameters, A(z) can be estimated; filtering the data with the estimated A(z) then yields a residual sequence that is approximately moving-average, from which the MA part of the spectrum is estimated (a sketch follows below).
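A hedged Python sketch of the least-squares (high-order / modified Yule-Walker) step described above, assuming NumPy; the autocorrelation estimator, the number of lag equations used, and the name estimate_ar_part are illustrative choices rather than the slide's exact procedure.

```python
import numpy as np

def estimate_ar_part(x, p, q, num_eqs=None):
    """Least-squares estimate of the AR parameters of an ARMA(p, q)
    process from autocorrelation lags above the MA order.
    Returns the coefficients a_1..a_p of A(z) = 1 - sum_k a_k z^-k."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    N = len(x)
    M = num_eqs if num_eqs is not None else 3 * p       # number of lag equations (illustrative)
    max_lag = q + M
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(max_lag + 1)])
    # Overdetermined system: r(j) = sum_k a_k r(j - k), for j = q+1 .. q+M
    rows = [[r[abs(j - k)] for k in range(1, p + 1)] for j in range(q + 1, q + M + 1)]
    R = np.array(rows)
    rhs = r[q + 1: q + M + 1]
    a, *_ = np.linalg.lstsq(R, rhs, rcond=None)          # minimise the squared error
    return a
```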


(cont.)

 In an ARMA model, if A(z) = 1 then H(z) = B(z) and the model reduces to a moving-average (MA) process of order q.
 In an ARMA model, if B(z) = 1 then H(z) = 1/A(z) and the model reduces to an autoregressive (AR) process of order p.
Spectral Estimation by Autoregressive Moving Average (ARMA) (Parametric)

It has the following advantages:
 Suitable for short data lengths.
 Gives better frequency resolution.
 Avoids spectral leakage.
High-Resolution Spectral Estimation Based on Subspace Eigen-Analysis

 Eigen-analysis is used to partition the eigenvectors of the autocorrelation matrix of a noisy signal into two subspaces:
(a) the signal subspace, composed of the principal eigenvectors associated with the largest eigenvalues;
(b) the noise subspace, represented by the eigenvectors associated with the smallest eigenvalues.
 The decomposition of a noisy signal into a signal subspace and a noise subspace forms the basis of the eigen-analysis methods (see the sketch below).
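As an illustration (the slides do not name a specific algorithm), here is a minimal NumPy sketch that eigen-decomposes an estimated autocorrelation matrix, splits the eigenvectors into signal and noise subspaces, and forms a MUSIC-style pseudospectrum from the noise subspace:

```python
import numpy as np
from scipy.linalg import toeplitz

def subspace_pseudospectrum(x, m, num_signals, nfft=1024):
    """Eigen-decompose an m x m autocorrelation matrix estimate, split it
    into a signal subspace (largest eigenvalues) and a noise subspace
    (smallest eigenvalues), and return a MUSIC-style pseudospectrum."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    N = len(x)
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(m)])
    R = toeplitz(r)                                    # autocorrelation matrix estimate
    eigvals, eigvecs = np.linalg.eigh(R)               # eigenvalues in ascending order
    noise_subspace = eigvecs[:, : m - num_signals]     # eigenvectors of smallest eigenvalues
    f = np.arange(nfft) / (2.0 * nfft)
    # Steering vectors e(f) = [1, e^{-j2pi f}, ..., e^{-j2pi f (m-1)}]
    E = np.exp(-2j * np.pi * np.outer(f, np.arange(m)))
    # Large where e(f) is (nearly) orthogonal to the noise subspace
    denom = np.sum(np.abs(E @ noise_subspace) ** 2, axis=1)
    return f, 1.0 / denom
```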
Applications for parametric modeling include:
 speech and music synthesis,
 data compression,
 high-resolution spectral estimation,
 communications,
 manufacturing, and simulation.

Thank you ^_^
