Elena Punskaya
www-sigproc.eng.cam.ac.uk/~op205
A good text for the course: Monson H. Hayes, Statistical Digital Signal Processing and Modeling, Wiley, 1996
family of functions (a single function is identified by the outcome k; it is then just a function of, for example, time, where t = nT and T is the sampling interval)
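To make this family-of-functions picture concrete, here is a minimal Python sketch (my own illustration, not from the notes): each outcome of the experiment, here a random phase, selects one whole realisation.

    # Sketch (not from the notes): a random process as a family of functions.
    # Each outcome k -- here a random phase Theta -- selects one realisation
    # x_k[n]; the frequency omega0 is an arbitrary illustrative choice.
    import numpy as np

    rng = np.random.default_rng(0)
    n = np.arange(64)                  # sample index; t = n*T for sampling interval T
    omega0 = 0.3 * np.pi               # assumed frequency, chosen for illustration

    thetas = rng.uniform(-np.pi, np.pi, size=5)            # five outcomes k
    ensemble = np.cos(omega0 * n[None, :] + thetas[:, None])
    print(ensemble.shape)              # (5, 64): one row per member function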
Random Processes
[Fig. 1: ensemble view of a random process. Each outcome k = 1, 2, ... of the sample space selects one realisation; sampling the ensemble at times n1, n2, n3 gives a random vector. For the random-phase example, the pdf is f(θ) = 1/2π on [-π, π].]
Cross-Correlation function
Stationarity
Random vectors, see Fig.1
Joint pdf: f_{X_n1, X_n2, X_n3}(x_n1, x_n2, x_n3)
Strict-Sense Stationarity
Wide-Sense Stationarity
This only holds for finite-variance processes: an SSS process with infinite variance is not WSS
E[cos Θ] = ∫_{-π}^{π} cos θ · f(θ) dθ = ∫_{-π}^{π} cos θ · (1/2π) dθ = 0, since sin(π) = 0 and sin(-π) = 0
E[sin Θ] = ∫_{-π}^{π} sin θ · (1/2π) dθ = 0, since the integrand is an odd function
Using sin A sin B = (1/2)[cos(A - B) - cos(A + B)] and cos(A + B) = cos A cos B - sin A sin B, with B = 2Θ and A a fixed constant, simplify as before to show E[cos 2Θ] = E[sin 2Θ] = 0.
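A quick Monte Carlo sanity check of these expectations (my addition; the sample size is arbitrary):

    # For Theta uniform on [-pi, pi], all four sample averages should be
    # close to zero, matching the integrals above.
    import numpy as np

    rng = np.random.default_rng(1)
    theta = rng.uniform(-np.pi, np.pi, size=1_000_000)
    for name, vals in [("E[cos Theta]", np.cos(theta)),
                       ("E[sin Theta]", np.sin(theta)),
                       ("E[cos 2Theta]", np.cos(2 * theta)),
                       ("E[sin 2Theta]", np.sin(2 * theta))]:
        print(name, "~", vals.mean())      # each within ~1/sqrt(N) of 0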
Power Spectra
[Equation: power in the frequency band from ω_l T to ω_u T, with a factor of 2 for the matching negative-frequency band]
Recall from Part 1B: the Fourier series representation of a periodic function of ω, here with period 2π. The coefficient sequences C1_m = exp(jmω₀) and C2_m = exp(-jmω₀) each correspond to an impulse train of period 2π. E.g., for the impulse train located at ω = ω₀,
C2_m = (1/2π) ∫_{-π}^{π} δ(ω - ω₀) exp(-jmω) dω = (1/2π) exp(-jmω₀),
so the spectrum is a sum of two impulse trains of period 2π.
[Figure: the impulse train repeats with period 2π: impulses at ω₀ - 2π, ω₀, ω₀ + 2π, ...]
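A short sketch (my own construction; unit amplitude and ω₀ = 0.3π are assumptions) estimating the power spectrum of the random-phase sinusoid by averaging periodograms over realisations; the estimated power concentrates near ±ω₀, the discrete-time counterpart of the impulses above.

    import numpy as np

    rng = np.random.default_rng(2)
    N, R = 512, 200
    n = np.arange(N)
    omega0 = 0.3 * np.pi
    theta = rng.uniform(-np.pi, np.pi, size=(R, 1))
    x = np.cos(omega0 * n + theta)                     # R realisations, shape (R, N)
    S_est = (np.abs(np.fft.fft(x, axis=1)) ** 2 / N).mean(axis=0)
    k = np.argmax(S_est[:N // 2])
    print("peak at omega =", 2 * np.pi * k / N, "; expected", omega0)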
White Noise
[Figure: r_XX[m] against m, a single impulse at m = 0 of height σ_X² = c_XX[0]]
r_XX[m] = c_XX[m] = σ_X² δ[m]
f_{X_n}(x_n) = N(x_n | μ = 0, σ_X²)
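A minimal numerical check (my addition; σ_X = 2 is an arbitrary choice) that white Gaussian noise has r_XX[m] = σ_X² δ[m]:

    # The biased sample autocorrelation is ~sigma_X^2 at lag 0, ~0 elsewhere.
    import numpy as np

    rng = np.random.default_rng(3)
    sigma_X, N = 2.0, 100_000
    x = rng.normal(0.0, sigma_X, size=N)
    for m in range(5):
        r_m = np.dot(x[:N - m], x[m:]) / N             # biased estimator of r_XX[m]
        print(f"r[{m}] ~ {r_m:.3f}")                   # ~4.0 at m = 0, else ~0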
Statistical characteristics are the same irrespective of shifts along the time axis: an observer looking at the process from sample time n1 would not be able to tell any difference in its statistical characteristics after moving to a different time n2.
[Figure: autocorrelation values at lags m = -1, 0, 1]
Minima and maxima: H(z) has a zero at z = -b0/b1 = -1/0.9. Hence |H(e^{jω})| has a minimum at ω = π + 2πn and a maximum at ω = 0 + 2πn; the response is periodic in ω with period 2π.
[Figure: pole-zero plot in the z-plane]
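The min/max locations can be checked numerically; a sketch assuming taps b0 = 1, b1 = 0.9 (the zero then lies on the negative real axis, giving the same min/max pattern as above):

    import numpy as np
    from scipy.signal import freqz

    w, H = freqz(b=[1.0, 0.9], a=[1.0], worN=1024, whole=True)
    mag = np.abs(H)
    print("minimum at omega =", w[np.argmin(mag)])     # ~pi
    print("maximum at omega =", w[np.argmax(mag)])     # 0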
Example
Ergodic processes
Summary
- Looked at discrete-time random processes (most results for continuous-time random processes follow through almost directly)
- Defined correlation functions (auto- and cross-)
- Stationarity (strict sense, and wide sense with its 3 conditions)
- Power spectrum (and calculation of the autocorrelation function using the inverse DTFT)
- White noise (in particular, white Gaussian noise)
- Ergodic processes
- Linear systems and wide-sense stationary processes
- Revision: continuous-time random processes (see handouts; not covered during lectures)
Optimal Filtering
[Block diagram: the unobserved signal d_n plus noise v_n gives the observed x_n, which is passed through the filter H(z) to produce an estimate of d_n]
[Block diagram: x_n passed through H(z) gives the estimate; the error is the difference between the desired signal and the filter output]
S_dd(e^{jω}) and S_vv(e^{jω}) are real and positive, so 0 ≤ H(e^{jω}) ≤ 1:
H(e^{jω}) = S_dd(e^{jω}) / (S_dd(e^{jω}) + S_vv(e^{jω})) = 1 / (1 + 1/SNR(ω)),
where SNR(ω) = S_dd(e^{jω}) / S_vv(e^{jω}) is the signal-to-noise ratio.
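An illustrative sketch of this formula (the spectra S_dd and S_vv below are my own assumptions, not from the notes): the Wiener gain is near 1 where the SNR is high and near 0 where the noise dominates.

    import numpy as np

    omega = np.linspace(-np.pi, np.pi, 513)
    S_dd = 1.0 / (1.0 + (omega / 0.5) ** 2)            # assumed low-pass signal spectrum
    S_vv = 0.1 * np.ones_like(omega)                   # assumed white noise floor
    H = 1.0 / (1.0 + S_vv / S_dd)                      # = S_dd / (S_dd + S_vv)
    print(H[len(H) // 2], H[0])                        # ~0.91 at omega = 0, ~0.2 at -pi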
Example: AR Process
A first-order all-pole process, also known as an autoregressive (AR) process, is generated by passing zero-mean white noise through a first-order all-pole IIR filter:
H(z) = 1 / (1 - a z^-1), with driving white-noise variance σ²
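A short sketch generating such a process (a = 0.9 and unit driving-noise variance are my choices), checking that the lag-1 autocorrelation ratio is about a:

    # Generate x[n] = a*x[n-1] + w[n] by filtering white noise with
    # H(z) = 1/(1 - a z^-1), then compare r[1]/r[0] to a (theory: a^|m|).
    import numpy as np
    from scipy.signal import lfilter

    rng = np.random.default_rng(4)
    a = 0.9
    w = rng.normal(size=50_000)                        # zero-mean white noise
    x = lfilter([1.0], [1.0, -a], w)                   # single pole at z = a
    r0 = np.dot(x, x) / x.size
    r1 = np.dot(x[:-1], x[1:]) / x.size
    print("r[1]/r[0] =", r1 / r0)                      # ~0.9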
Example: Deconvolution
Consider the following process
Then take the DTFT of r_xd[m] and r_xx[m] to obtain the Wiener filter.
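A toy sketch of this recipe (the channel g and the white desired signal are my own choices, since the slide's process definition is not reproduced here): estimate r_xx and r_xd from data, take DTFTs, and form the Wiener filter.

    import numpy as np

    rng = np.random.default_rng(5)
    N = 1 << 15
    d = rng.normal(size=N)                             # desired signal (white here)
    g = np.array([1.0, 0.5, 0.25])                     # assumed distorting channel
    x = np.convolve(d, g)[:N]                          # observed signal

    def xcorr(u, v, maxlag):
        # biased estimate of r_uv[m] = E[u_n v_{n+m}], m = -maxlag..maxlag
        return np.array([np.dot(u[max(0, -m):N - max(0, m)],
                                v[max(0, m):N - max(0, -m)]) / N
                         for m in range(-maxlag, maxlag + 1)])

    L = 32
    S_xx = np.abs(np.fft.fft(xcorr(x, x, L), 512))     # DTFT samples (magnitudes)
    S_xd = np.abs(np.fft.fft(xcorr(x, d, L), 512))
    H = S_xd / S_xx                                    # ~|1/G(e^{j omega})|
    print(H[:4])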
Positive definite: the autocorrelation matrix R_xx is positive definite.
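Because R_xx is symmetric Toeplitz and positive definite, the normal equations R_xx h = r_xd can be solved with a fast Levinson-type routine; a sketch with made-up correlation values:

    import numpy as np
    from scipy.linalg import solve_toeplitz

    r_xx = np.array([1.0, 0.5, 0.25])                  # assumed autocorrelation, lags 0..2
    r_xd = np.array([0.9, 0.4, 0.1])                   # assumed cross-correlation, lags 0..2
    h = solve_toeplitz((r_xx, r_xx), r_xd)             # (first column, first row) of R_xx
    print("h =", h)

    R = np.array([[1.0, 0.5, 0.25],
                  [0.5, 1.0, 0.5],
                  [0.25, 0.5, 1.0]])
    print("positive definite:", np.all(np.linalg.eigvalsh(R) > 0))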
Example: AR process
Consider a first-order AR process, which has autocorrelation function
h = R_xx^-1 r_xd = (R_dd + σ_v² I)^-1 r_dd, where the diagonal entries of R_xx are 1 + σ_v² (taking r_dd[0] = 1).
x_n = d_n + v_n
Assume the amount of data is large. Also assume that a long segment of v_n is available during a silent section of the music/speech, from which r_vv[m], and hence S_vv(e^{jω}), can be estimated.
In fact, audio signals are non-stationary. Thus the filter needs to be applied to short quasi-stationary batches of data, one by one. A frequency-domain version can also be implemented successfully using the FFT, as sketched below.
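A rough sketch of this batch-wise, frequency-domain idea (all signals are synthetic; the simple Wiener-style gain max(0, 1 - S_vv/S_xx) and block length are my choices, and windowing/overlap is omitted for brevity):

    import numpy as np

    rng = np.random.default_rng(6)
    B = 256                                            # assumed block length
    t = np.arange(20 * B)
    d = np.sin(2 * np.pi * 0.05 * t)                   # toy "music/speech"
    x = d + 0.5 * rng.normal(size=t.size)              # observed noisy signal

    silence = 0.5 * rng.normal(size=10 * B)            # noise-only section
    S_vv = np.mean(np.abs(np.fft.rfft(silence.reshape(-1, B), axis=1)) ** 2,
                   axis=0) / B                         # noise spectrum estimate

    out = np.empty_like(x)
    for k in range(0, x.size, B):                      # quasi-stationary batches
        X = np.fft.rfft(x[k:k + B])
        S_xx = np.abs(X) ** 2 / B                      # per-block spectrum estimate
        H = np.maximum(0.0, 1.0 - S_vv / np.maximum(S_xx, 1e-12))
        out[k:k + B] = np.fft.irfft(H * X, n=B)

    print("noise power: in", np.mean((x - d) ** 2),
          "-> out", np.mean((out - d) ** 2))           # typically reduced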
ARMA modelling
Autoregressive models
AR model
Yule-Walker Equations
Numerical example: 2 x 2 symmetric Toeplitz (autocorrelation) matrix
    [ 6.14  3.08 ]
    [ 3.08  6.14 ]
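A generic sketch of solving the Yule-Walker equations (the AR(2) coefficients and simulated data below are mine, not the worked example): estimate the autocorrelation, build the Toeplitz system as in the matrix above, and solve R a = r for the AR parameters.

    import numpy as np
    from scipy.linalg import solve_toeplitz
    from scipy.signal import lfilter

    rng = np.random.default_rng(7)
    a_true = np.array([0.75, -0.5])                    # x[n] = 0.75 x[n-1] - 0.5 x[n-2] + w[n]
    x = lfilter([1.0], np.concatenate(([1.0], -a_true)),
                rng.normal(size=200_000))

    N, p = x.size, 2
    r = np.array([np.dot(x[:N - m], x[m:]) / N for m in range(p + 1)])
    a_hat = solve_toeplitz((r[:p], r[:p]), r[1:p + 1]) # Yule-Walker solve
    print("a_hat =", a_hat)                            # ~[0.75, -0.5]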
Thank you!