
Stochastic Processes

M. Sami Fadali
Professor of Electrical Engineering
University of Nevada, Reno

Outline

Stochastic (random) processes.
Autocorrelation.
Crosscorrelation.
Spectral density function.

Deterministic vs. Random Signals

Deterministic Signal: Exactly predictable.
Example: x(t) = cos(ωt), t real, or x(k) = cos(ωk), k integer.

Random Signal: Associated with a chance occurrence.
a) Continuous or discrete (time series).
b) May have a deterministic structure,
e.g. X(t) = A cos(ωt + θ) with random A and θ, such as A ~ N(0,1).
Example: No deterministic structure: a noise record X(t).
[Figure: sample noise waveform X(t) vs. t, roughly in the range -0.1 to 0.15]

Random Process

Map the elements of the sample space to the set of continuous time functions {X(t)}.
For a fixed time point, X(t1) = random variable.
Example: Measurement of any physical quantity (with additive noise) over time.

X(t, outcome fixed) = ordinary time function (sample function).
X(t fixed, outcome random) = random variable.
[Figure: one sample function of the process plotted against time]

Random Sequence

Map the elements of the sample space to the set of discrete time functions {X(k)}.
For a fixed time point, X(k1) = random variable.
Example: Samples of any physical quantity (with additive noise) over time.
Discrete random process, time series.

Example: Random Binary Signal

Random sequence of pulses s.t.
1. Rectangular pulses of fixed duration T.
2. Amplitude +1 or -1, equally likely.
3. Statistically independent amplitudes.
4. Start time for the sequence uniformly distributed in the interval [0, T].

Mathematical Description

X(t) = Σ_k A_k p(t - kT - D)

p(t) = unit amplitude pulse of duration T.
A_k = binary r.v. in {-1, +1} = amplitude of the k-th pulse.
D = random start time, uniformly distributed in [0, T].
[Figure: one sample binary waveform over several pulse intervals]
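The mathematical description above can be simulated directly. A minimal sketch in NumPy; the function name and interface are illustrative, not from the slides:

```python
import numpy as np

def random_binary_signal(t, T=1.0, rng=None):
    # One realization of X(t) = sum_k A_k p(t - kT - D):
    # A_k = +/-1 equally likely and independent, D ~ U[0, T].
    rng = np.random.default_rng() if rng is None else rng
    t = np.asarray(t, dtype=float)
    d = rng.uniform(0.0, T)                           # random start time D
    k = np.floor((t - d) / T).astype(int)             # index of the pulse containing each t
    k -= k.min()                                      # shift indices to start at 0
    amps = rng.choice([-1.0, 1.0], size=k.max() + 1)  # independent +/-1 amplitudes
    return amps[k]

t = np.linspace(0.0, 10.0, 2001)
x = random_binary_signal(t, T=1.0, rng=np.random.default_rng(0))
```

Each call draws a fresh D and a fresh amplitude sequence, i.e. one sample function of the process.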

Moments of Random Process

Fix time to obtain a random variable.
Obtain moments as a function of time, e.g. m_X(t) = E[X(t)].
Special Case: Constant moments (not functions of time).

Properties of Binary Signal

E[X(t)] = (+1)(1/2) + (-1)(1/2) = 0.
Second moment E[X²(t)] = 1 = variance (zero mean).

Joint Densities

Joint densities specify how fast X(t) changes with time.
Later: related to spectral content.
Higher order densities provide more information (hard to compute).

Statistically Independent Random Signals

f(x1, ..., xn) = f(x1) f(x2) ... f(xn)
for any choice of time points t1, ..., tn (and possibly jointly for two signals X and Y).

Gaussian Random Process

All density functions (any order) normal.
Multivariate normal density: completely specified by the mean and covariance matrix.

Autocorrelation

R_X(t1, t2) = E[X(t1) X(t2)]

Autocorrelation is an ensemble average using the joint density function.
Recall: for fixed t, random process = r.v.
Similarly define the autocovariance (same as the autocorrelation for zero mean):
C_X(t1, t2) = E[(X(t1) - m_X(t1))(X(t2) - m_X(t2))]

Autocorrelation of Binary Signal

Choose 2 time points t1, t2 with |t2 - t1| < T.
D = random start time, uniformly distributed in [0, T]: f_D(d) = 1/T, 0 ≤ d ≤ T.
Case 1: No switching point between t1 and t2; both points lie in the same pulse.
Case 2: A switching point falls between t1 and t2; the points lie in consecutive pulses.

Probabilities

For |t2 - t1| < T, a switching point falls between t1 and t2 only if D lands in an interval of length |t2 - t1|:
P(t1, t2 in the same pulse) = 1 - |t2 - t1|/T
P(t1, t2 in different pulses) = |t2 - t1|/T

Product of Amplitudes

Sign of the amplitude is the same if t1, t2 are in the same pulse: X(t1)X(t2) = +1.
Sign of the amplitude can be the same or different if t1, t2 are in consecutive pulses: two cases (+1, -1) of equal probability (0.5), so the conditional expectation of the product is 0.

Autocorrelation |t2 - t1| < T

R_X(t1, t2) = (+1) P(same pulse) + (0) P(different pulses) = 1 - |t2 - t1|/T

Autocorrelation |t2 - t1| > T

Independent amplitudes for |t2 - t1| > T.
Independent implies uncorrelated: R_X(t1, t2) = E[X(t1)] E[X(t2)] = 0.

Autocorrelation of Binary Signal

R_X(τ) = 1 - |τ|/T, |τ| ≤ T
R_X(τ) = 0, |τ| > T

Function of the separation τ = t2 - t1 only.
[Figure: triangular autocorrelation, peak 1 at τ = 0, zero for |τ| ≥ T]
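The triangular result can be checked by a Monte Carlo ensemble average over many realizations of the binary signal. A sketch, reusing the signal model above (the sampling helper, time point, and run count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1.0
t1 = 2.3                                  # arbitrary fixed time point
taus = np.linspace(0.0, 2.0, 21)          # separations, up to 2T
n_runs = 20000

def sample_at(times):
    # One realization of the random binary signal at the given times.
    d = rng.uniform(0.0, T)                           # D ~ U[0, T]
    k = np.floor((times - d) / T).astype(int)
    k -= k.min()
    amps = rng.choice([-1.0, 1.0], size=k.max() + 1)
    return amps[k]

acc = np.zeros_like(taus)
for _ in range(n_runs):                   # ensemble average over realizations
    x = sample_at(np.concatenate(([t1], t1 + taus)))
    acc += x[0] * x[1:]
R_est = acc / n_runs

R_theory = np.clip(1.0 - taus / T, 0.0, None)   # 1 - |tau|/T, zero beyond T
err = np.max(np.abs(R_est - R_theory))
```

The estimate matches the triangle to within Monte Carlo error and is 1 exactly at τ = 0.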

Stationary Random Process

Two definitions:
Strictly stationary random process.
Wide sense stationary random process (WSS).
(Strictly) Stationary implies wide-sense stationary.

Strictly Stationary

All pdfs describing the process are unaffected by any time shift: X(t) and X(t + s) are governed by the same pdfs for every shift s.

Wide Sense Stationary: Constant mean, shift-invariant autocorrelation.

Wide Sense Stationary Random Signal

Stationarity of the mean (constant): E[X(t)] = m.
Stationarity of the autocorrelation (depends on the separation τ = t2 - t1 only): R_X(t1, t2) = R_X(τ).

Nonstationary Signal

Y(t) = X + cos(t), X ~ N(0, 9), m_Y(t) = cos(t).
The mean depends on time, so the process is not stationary.
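The time-varying mean of Y(t) = X + cos(t) shows up immediately in an ensemble average. A quick sketch (time grid and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 2.0 * np.pi, 9)      # a few time points over one period
n_runs = 200_000
X = rng.normal(0.0, 3.0, size=n_runs)     # X ~ N(0, 9): standard deviation 3
Y = X[:, None] + np.cos(t)[None, :]       # each row is one realization of Y(t)
m_est = Y.mean(axis=0)                    # ensemble mean at each time point
```

The estimated mean tracks cos(t), confirming that m_Y(t) is not constant.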

Example: WSS only

Joint pdf: Assume four possible joint outcomes only (equally probable), with two sines or two cosines of the same sign.

Not Strict Sense Stationary

Take two time points (say t1 and t2) and compare the values at t1 with the values at t2.
The first-order distributions are different (even though their mean is the same), so the process is WSS but not strictly stationary.

Ergodic Random Processes

Single realization is enough.
Time average = ensemble average.
Ergodicity, like stationarity, is an idealization.
Can be approximately true in practice.
Ergodic signals tend to look random.

Stationarity Necessary

Explanation: A single realization has a single average for any property (all moments, autocorrelation, etc.). A time average can only equal the expected value if the expected value is constant.

Example: Stationary not Ergodic

Random constant amplitude: X(t) = A, with A a random variable.
Sample realization with amplitude A1: time average = mean of a constant = A1.
In general A1 ≠ E[A]: stationarity is not sufficient for ergodicity.

Ergodicity in Mean

Ergodicity in the mean: the time average
(1/2T) ∫ from -T to T of X(t) dt → m_X as T → ∞
For a zero mean process: the time average → 0 as T → ∞.
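A sketch of the random-constant example: the time average of one realization recovers that realization's amplitude, not the ensemble mean (record length is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
a1 = rng.normal(0.0, 1.0)        # amplitude of one sample realization, A ~ N(0, 1)
t = np.linspace(0.0, 100.0, 10001)
x = np.full_like(t, a1)          # X(t) = A: the realization is constant in time
time_avg = x.mean()              # time average over the whole record
ensemble_mean = 0.0              # E[X(t)] = E[A] = 0 for every t
```

No matter how long the record, the time average stays at a1 and never approaches the ensemble mean of 0.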

Ergodicity in Autocorrelation

Ergodicity in autocorrelation: Need the 4th moment.
The time autocorrelation
(1/2T) ∫ from -T to T of X(t) X(t + τ) dt → R_X(τ) as T → ∞

Example

Deterministic structure: X_A(t) = A sin(t), A ~ N(0, σ²) a random constant.
Sample realization: fix A = A1.
Compare the time autocorrelation & the (ensemble) autocorrelation.

Time Autocorrelation

lim (1/T) ∫ from 0 to T of X_A(t) X_A(t + τ) dt
= lim (1/T) ∫ from 0 to T of A1² sin(t) sin(t + τ) dt
= lim (A1²/2T) ∫ from 0 to T of [cos(τ) - cos(2t + τ)] dt
= (A1²/2) cos(τ)    (the second term is a finite integral divided by T → 0)

Autocorrelation

R_{X_A}(t, t + τ) = E[A²] sin(t) sin(t + τ) = (σ²/2) [cos(τ) - cos(2t + τ)]

Not a function of the time shift only.
Not equal to the time autocorrelation.
Not an ergodic process.
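The limiting value of the time autocorrelation can be checked numerically for one realization with A = A1 (window length and amplitude are arbitrary choices):

```python
import numpy as np

A1 = 1.7                                   # amplitude of the sample realization
T_avg = 1000.0                             # long averaging window
t = np.linspace(0.0, T_avg, 200_001)
x = A1 * np.sin(t)                         # X_A(t) = A1 sin(t)

taus = np.linspace(0.0, np.pi, 7)
# time autocorrelation (1/T) integral of x(t) x(t + tau) dt, as a discrete mean
V = np.array([np.mean(x * A1 * np.sin(t + tau)) for tau in taus])
V_theory = (A1**2 / 2.0) * np.cos(taus)    # limiting value (A1^2/2) cos(tau)
err = np.max(np.abs(V - V_theory))
```

The oscillatory cos(2t + τ) term averages out over the long window, leaving (A1²/2) cos(τ).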

Mean Square Value

From the autocorrelation: E[X²(t)] = R_X(t, t).
For X stationary: E[X²] = R_X(0).

Properties of Autocorrelation

Useful general properties used throughout the course.
Several apply to the stationary case only.
Assume real scalar processes.

Even Function

Stationary: R_X(τ) = E[X(t)X(t + τ)] = E[X(t + τ)X(t)] = R_X(-τ)
R_X = even function of τ.

Peak Value

Consider a zero-mean process w.l.o.g. (subtract the mean if not zero).
E[(X(t + τ) ± X(t))²] ≥ 0 gives |R_X(τ)| ≤ R_X(0).
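Both properties can be illustrated on a sample autocorrelation estimate; for the biased estimator below, evenness holds by construction and the peak-at-zero bound follows from the Cauchy-Schwarz inequality (record length and lag range are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=4096)            # a zero-mean noise record
n, m = x.size, 50

# biased sample autocorrelation at lags -m..m
R = np.array([np.dot(x[:n - abs(k)], x[abs(k):]) / n for k in range(-m, m + 1)])
R0 = R[m]                            # lag-zero value (the mean square)
```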

Periodic Component

X = zero-mean component + uncorrelated periodic component.
R_X = sum of uncorrelated terms.

Periodic Component (cont.)

If X(t) has a periodic component, R_X(τ) has a periodic component of the same period.
No information about the phase.

Zero-mean

Zero-mean, ergodic process with no periodic component: R_X(τ) → 0 as |τ| → ∞.

Autocorrelation for Vector X

R_X(t1, t2) = E[X(t1) X^H(t2)]
H = conjugate transpose (= transpose for real X).
Stationary: R_X(τ).

Crosscorrelation Function

R_XY(t1, t2) = E[X(t1) Y(t2)]
Stationary: R_XY(τ), skew-symmetric in the sense R_XY(τ) = R_YX(-τ).

Properties of Crosscorrelation

Unlike the autocorrelation, R_XY(τ) need not have its maximum at τ = 0.

Example: Cross-correlation

Zero at τ = 0 (zero mean, as shown earlier), yet nonzero at other separations: the value at τ = 0 is not the maximum.

Combination of Random Processes

Sum Z(t) = X(t) + Y(t) of two random processes:
R_Z(τ) = R_X(τ) + R_Y(τ) + R_XY(τ) + R_YX(τ)
Uncorrelated and at least one zero-mean: the cross terms vanish and
R_Z(τ) = R_X(τ) + R_Y(τ)

Time Delay Estimation

Transmit signal to object.
Receive signal.
Find the cross-correlation of the two signals.
Location of peak = time delay T_d.
[Figure: cross-correlation with its peak at τ = T_d]
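A sketch of the delay-estimation idea with synthetic signals; the sampling rate, noise level, and delay below are made-up values, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 100.0                             # samples per second (assumed)
n = 4000
true_delay = 73                        # delay in samples (0.73 s at fs = 100)

s = rng.normal(size=n)                 # "transmitted" signal: white noise
r = np.zeros(n)
r[true_delay:] = s[:n - true_delay]    # "received" signal: delayed copy ...
r += 0.5 * rng.normal(size=n)          # ... plus measurement noise

lags = np.arange(200)                  # search nonnegative lags only
Rsr = np.array([np.dot(s[:n - k], r[k:]) / n for k in lags])
est_delay = int(lags[np.argmax(Rsr)])  # location of the cross-correlation peak
```

The cross-correlation peaks where the shifted transmit signal lines up with the received copy, giving the delay est_delay / fs in seconds.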

Power Spectral Density Function (SDF)

Stationary Process: Fourier transform of the autocorrelation.
Wiener-Khinchine Relation:
S_X(ω) = ∫ from -∞ to ∞ of R_X(τ) e^{-jωτ} dτ

Example

Same autocorrelation as a Gauss-Markov process, but the autocorrelation does not tell the whole story.

Autocorrelation and PSD

R_X(τ) = (1/2π) ∫ from -∞ to ∞ of S_X(ω) e^{jωτ} dω

Mean-square Value of Output

For τ = 0:
E[X²] = R_X(0) = (1/2π) ∫ from -∞ to ∞ of S_X(ω) dω
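A numerical sketch of the mean-square relation, using the standard first-order Gauss-Markov pair R(τ) = σ² e^{-β|τ|} and S(ω) = 2σ²β / (ω² + β²); the parameter values and integration grid are arbitrary:

```python
import numpy as np

sigma2, beta = 2.0, 0.5
w = np.linspace(-500.0, 500.0, 1_000_001)      # frequency grid (rad/s)
S = 2.0 * sigma2 * beta / (w**2 + beta**2)     # PSD of a Gauss-Markov process
dw = w[1] - w[0]
mean_square = S.sum() * dw / (2.0 * np.pi)     # (1/2*pi) * integral of S(w) dw
```

The numerical integral recovers E[X²] = R(0) = σ², up to truncation of the tails of S.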

Power in a Finite Band

Power spectral density gives the distribution of power over frequencies.

Example

Evaluate the integral using integration tables.

Example: Mean Square

Later: Table 3.1 for the s-domain integral.

Cross Spectral Density Function

S_XY(ω) = Fourier transform of R_XY(τ).
Skew-symmetry of the cross-correlation, R_XY(τ) = R_YX(-τ), gives S_XY(ω) = S_YX(-ω).

Properties of SDF

The autocorrelation is real and even, so S_X(ω) is:
Real function.
Even function.
Nonnegative: its integral over a band gives the power in that band.

Spectral Density of Sum

Z = X + Y:
S_Z(ω) = S_X(ω) + S_Y(ω) + S_XY(ω) + S_YX(ω)
Zero cross-correlation (X & Y uncorrelated, and X or Y zero mean):
S_Z(ω) = S_X(ω) + S_Y(ω)
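A time-domain check that the cross terms cancel for independent, zero-mean signals; the colored-noise model below is an arbitrary illustration. Since the Fourier transform is linear, R_Z = R_X + R_Y implies S_Z = S_X + S_Y:

```python
import numpy as np

rng = np.random.default_rng(6)
n, m = 200_000, 20

def acorr(v, m):
    # biased sample autocorrelation at lags 0..m
    return np.array([np.dot(v[:v.size - k], v[k:]) / v.size for k in range(m + 1)])

x = rng.normal(size=n)                        # white noise
e = rng.normal(size=n + 2)
y = e[2:] + 0.8 * e[1:-1] + 0.3 * e[:-2]      # independent colored (MA) noise
z = x + y                                     # sum process

err = np.max(np.abs(acorr(z, m) - (acorr(x, m) + acorr(y, m))))
```

The residual err is just the sample cross-correlation between x and y, which shrinks toward zero as the record grows.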

Coherence Function

γ²_XY(ω) = |S_XY(ω)|² / (S_X(ω) S_Y(ω))
Used in applications (e.g. acoustics).

Conclusion

Probabilistic description of random signals.
Autocorrelation and crosscorrelation.
Power spectral density function.
Experimental determination: necessary but difficult.

References

R. G. Brown and P. Y. C. Hwang, Introduction to Random Signals and Applied Kalman Filtering, 3rd ed., J. Wiley, NY, 1997.
G. R. Cooper and C. D. McGillem, Probabilistic Methods of Signal and System Analysis, Oxford Univ. Press, NY, 1999.
The binary signal example is from K. S. Shanmugan & A. M. Breipohl, Detection, Estimation & Data Analysis, J. Wiley, NY, 1988, pp. 132-134.
