CME620 (2016) - Week 1
Lecture Guide
Topics
1. Course Introduction
2. Set Theory and Venn Diagrams
   Unions, Intersections, Complements, etc.
3. Probability Theory
   Probability Space and Probability Measure
   Axioms of Probability
   Conditional Probability
   Independence of Events (mutually exclusive events)
   Partition - Law of Total Probability
   Bayes' Rule
4. Random Variables
   1. Definition and Characterization of One Random Variable
      Probability Distribution Function (cdf) and its properties
      Probability Density Function (pdf) and its properties
      Probability Mass Function (pmf) and its properties
   2. Conditional distributions and densities
   3. Important Random Variables (Discrete and Continuous)
      Discrete: Binomial, Bernoulli, Poisson, Hypergeometric
      Continuous: Uniform, Exponential, Gaussian, Rayleigh, Nakagami, Chi-squared
(c) Prof. Okey Ugweje, Federal University of Technology, Minna
Set Theory - 1
Set Theory - 2
Definition:
A set is a collection of distinct objects called elements
- Usually written as a list of elements enclosed in braces { }
- Since elements must be distinct, 2 or more elements in a set cannot be the same
Example 1:
{1,2,3} is a valid set whereas {1,1,3} is not
- A set can be made up of elements which are themselves sets
- A set can be finite or infinite
Example 2:
The set of all positive integers {0,1,2,3,...} is countably infinite, whereas the set of all real numbers in [0,1] is uncountably infinite
- All sets are subsets of the sample space
Definition:
The union of two sets A and B (denoted A ∪ B) is the set that contains all elements in either A or B:
    A ∪ B = {x | x ∈ A or x ∈ B}
For more than two sets,
    ∪_{i=1}^n A_i = A_1 ∪ A_2 ∪ ... ∪ A_n
Set Theory - 3
Example 3:
If A = {1,2,4} and B = {1,3,5}, then A ∪ B = {1,2,3,4,5}.
Definition:
A set A is a subset of a set B (denoted A ⊆ B) if all the elements of A are also in B.
Example 5:
Set A = {1, 2} is a subset of set B = {1, 2, 3, 5}
Definition:
The intersection of two sets A and B (denoted A ∩ B) is the set that contains only the elements that appear in both sets:
    A ∩ B = {x | x ∈ A and x ∈ B}
For more than two sets,
    ∩_{i=1}^n A_i = A_1 ∩ A_2 ∩ ... ∩ A_n
Example 4:
If A = {1, 2, 4} and B = {1, 3, 5}, then A ∩ B = {1}.
Example 6:
If S = {1, 2, 3, 4, 5}, the complement of the set B = {1, 2, 3} is the set B^c = {4, 5}.
Set Theory - 4
    ∩_{i=1}^n A_i = A_1 ∩ A_2 ∩ ... ∩ A_n
Set Theory - 5
Set Theory - 6
Set Operators:
S = universal set
∅ = null set
∪ = union
∩ = intersection
⊂, ⊆ = subsets
∈ = element of
Venn Diagrams - 1
Venn Diagrams - 2
Department of Telecommunications Engineering
Union properties:
    ∪_{k=1}^n A_k = A_1 ∪ A_2 ∪ ... ∪ A_k ∪ ... ∪ A_n
    A ∪ ∅ = A
    A ∪ A = A
    A ∪ S = S
    A ∪ A^c = S
    (A ∪ B) ∪ C = A ∪ (B ∪ C)
    A ∪ B = A if B ⊂ A
Venn Diagrams - 3
Venn Diagrams - 4
Intersection (Product)
- Elements common to all sets
- Elements contained in all sets
- Events that occur in every experiment
- Series systems
[Venn diagram: sets A, B, C in S, with the regions A ∩ B and A ∩ B ∩ C shaded]
    ∩_{k=1}^n A_k = A_1 ∩ A_2 ∩ ... ∩ A_k ∩ ... ∩ A_n
Intersection properties:
    A ∩ A = A
    A ∩ S = A
    A ∩ ∅ = ∅
    (A ∩ B) ∩ C = A ∩ (B ∩ C)
Venn Diagrams - 5
[Venn diagram: a partition A_1, A_2, ..., A_i, A_j, ..., A_n of S with a set B overlapping several A_i]
Complement properties:
    A ∪ A^c = S
    A ∩ A^c = ∅
    (A^c)^c = A
    (A ∪ B)^c = A^c ∩ B^c
    (A ∩ B)^c = A^c ∪ B^c
    S^c = ∅,  ∅^c = S
Difference:
    A - B = A ∩ B^c
Venn Diagrams - 6
[Venn diagram: mutually exclusive sets, and the difference B - A, shown in S]
Venn Diagrams - 7
Subsets: B ⊂ A ⊂ S
De Morgan's Law
    (A ∪ B)^c = A^c ∩ B^c
    (A ∩ B)^c = A^c ∪ B^c
For n sets,
    (∪_{i=1}^n A_i)^c = ∩_{i=1}^n A_i^c;   (∩_{k=1}^∞ B_k)^c = ∪_{k=1}^∞ B_k^c
[Venn diagrams illustrating A ∩ B, A ∩ B ∩ C, E ∩ F, E ∪ G, F ∩ G, E ∩ F ∩ G, and (E ∪ G) ∩ (F ∪ G)]
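De Morgan's laws can be checked mechanically on small finite sets; a minimal sketch in Python using the built-in set type (the particular sets S, A, B are arbitrary choices for illustration):

```python
# Verify De Morgan's laws on finite sets with Python set operations.
S = set(range(10))          # universal set (illustrative)
A = {1, 2, 4, 7}
B = {2, 3, 5, 7, 9}

complement = lambda X: S - X

# (A ∪ B)^c == A^c ∩ B^c
print(complement(A | B) == complement(A) & complement(B))  # True
# (A ∩ B)^c == A^c ∪ B^c
print(complement(A & B) == complement(A) | complement(B))  # True
```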
Example 8
[Worked example on slides; the recoverable result is a probability 25/130 ≈ 0.192.]
Probability Theory - 1
Probability Theory - 2
Some Applications - 1
For example:
- Data sent through a communication system is random, since the outcome at the receiver is not certain
- Noise, interference and fading introduced by the channel are random processes and can only be modeled as such
- The measure of performance (e.g., Bit Error Rate) is probabilistic, since it is an estimate of the received signal compared to the transmitted signal
Some Applications - 2
Some Applications - 3
[Block diagram: Input s(t) + n(t) → System → Output Signal]
Noise n(t) is almost always random in nature and calls for the use of probabilistic methods even if the signal s(t) is not, e.g., thermal noise:
- Thermal motion of the conduction electrons in the amplifier input circuit
- Random variations in the number of electrons (or holes) passing through a transistor
Since there are millions of electrons, one cannot calculate the value of this kind of noise at every instant of time, but one can calculate its statistical properties.
Some Applications - 4
Some Applications - 5
Quality Control
An important method of improving system reliability is to improve the quality of the individual elements. This is often done by a sampling inspection process, since it would be too costly to inspect every element.
Some Applications - 6
Probability Concepts
"We see that the theory of probability is at heart only common sense reduced to calculations ..."
- Pierre-Simon Laplace
Probability Spaces
Tossing of 2 Dice
a) Dice are distinguishable
S1 = {(1,1), (1,2), ..., (1,6); (2,1), (2,2), ..., (2,6); (3,1), (3,2), ..., (3,6); (4,1), (4,2), ..., (4,6); (5,1), (5,2), ..., (5,6); (6,1), (6,2), ..., (6,6)}
   = 6+6+6+6+6+6 = 36 elements (or 6^2)
b) Dice are indistinguishable (each unordered pair is counted once)
S2 = {(1,1), (1,2), ..., (1,6); (2,2), ..., (2,6); (3,3), ..., (3,6); (4,4), (4,5), (4,6); (5,5), (5,6); (6,6)}
   = 6+5+4+3+2+1 = 21 elements
c) May also use the tabular method
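The two sample spaces can also be enumerated directly; a short Python sketch (the tuple representation is an illustrative choice):

```python
from itertools import product

# Distinguishable dice: all 6^2 = 36 ordered pairs
s1 = list(product(range(1, 7), repeat=2))

# Indistinguishable dice: keep each unordered pair once ((a, b) with a <= b)
s2 = [(a, b) for a, b in s1 if a <= b]

print(len(s1), len(s2))  # 36 21
```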
Event - 1
Definition:
An Event, A, is a set of outcomes; a subset of the sample space.
- An event is any possible outcome of an experiment. It is the simplest random phenomenon
- An event is usually known as the information space
- Each event has an associated quantity which characterizes the objective likelihood of occurrence of that event
- That quantity is the probability of the event
Example 11 - Events
Event - 2
Special events
There are two special events of interest:
1) Universal Set (Ω or S)
- The set containing all elements
- The totality of all elementary events ω_i, known a priori:
    S = {ω_1, ω_2, ..., ω_k, ...}
Definition of Probability
Axioms of Probability
"The theory of probability as a mathematical discipline can and should be developed from axioms in exactly the same way as geometry and algebra."
- Andrey Kolmogorov
Axioms of Probability - 1
Axioms of Probability - 2
Note:
(iii) states that if A and B are mutually exclusive (M.E.) events, the probability of their union is the sum of their probabilities, i.e.,
    P[A ∪ B] = P[A] + P[B], if A and B cannot occur simultaneously
For a countable collection of mutually exclusive events,
    P[∪_{k=1}^∞ A_k] = Σ_{k=1}^∞ P[A_k]
This is the minimum number of axioms required to establish the remaining concepts of probability. These axioms allow us to view events as objects with properties.
Axioms of Probability - 3
a) Since A ∪ A^c = S, P[A ∪ A^c] = P[S] = 1.
Because A and A^c are M.E. events, P[A ∪ A^c] = P[A] + P[A^c] = 1, hence
    P[A^c] = 1 - P[A].
Note also that A ∪ B = A ∪ (A^c ∩ B), where A and A^c ∩ B are clearly M.E. events.
Axioms of Probability - 4
b) Similarly, for any A, A ∪ ∅ = A.
Hence it follows that P[A] = P[A ∪ ∅] = P[A] + P[∅].
But A ∩ ∅ = ∅, and thus P[∅] = 0.
Axioms of Probability - 5
Corollary 1:
    P[A^c] = 1 - P[A]
since P[S] = P[A ∪ A^c] = P[A] + P[A^c] = 1.
Axioms of Probability - 6
Since B = (B ∩ A) ∪ (B ∩ A^c), and B ∩ A = A ∩ B and B ∩ A^c = A^c ∩ B are M.E. events,
    P[B] = P(A ∩ B) + P(A^c ∩ B)
Thus P[A^c ∩ B] = P(B) - P(A ∩ B), and using other relations,
    P[A ∪ B] = P(A) + P(B) - P(A ∩ B).
Corollary 2:
Axioms of Probability - 7
Corollary 3:
    P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
Corollary 4:
    A ∪ B = A ∪ (A^c ∩ B), so P[A ∪ B] = P[A] + P[A^c ∩ B]
    B = (A ∩ B) ∪ (A^c ∩ B), so P[B] = P[A ∩ B] + P[A^c ∩ B], i.e., P[A^c ∩ B] = P[B] - P[A ∩ B]
Substituting will yield
    P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
Axioms of Probability - 8
Corollary 5 (union bound):
    P[∪_{k=1}^n A_k] ≤ Σ_{k=1}^n P[A_k],  n ≥ 2
Axioms of Probability - 9
Axioms of Probability - 10
    P[A ∪ B ∪ C] = P[(A ∪ B) ∪ C]
                 = P[A ∪ B] + P[C] - P[(A ∪ B) ∩ C]
                 = P[A] + P[B] - P[A ∩ B] + P[C] - P[A ∩ C] - P[B ∩ C] + P[A ∩ B ∩ C]
Axioms of Probability - 11
Corollary 6:
In general, for n events,
    P[∪_{j=1}^n A_j] = Σ_k P[A_k] - Σ_{j<k} P[A_j ∩ A_k] + ... + (-1)^{n-1} P[A_1 ∩ A_2 ∩ ... ∩ A_n]
Corollary 7:
If A ⊂ B, then P[A] ≤ P[B]:
    B = A ∪ (A^c ∩ B), so P[B] = P[A] + P[A^c ∩ B] ≥ P[A], since P[A^c ∩ B] ≥ 0
These axioms and corollaries provide us with the rules (or laws) for computing the probability of events.
Example 12
Example 13
Probability Problems - 1
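The three-event inclusion-exclusion formula can be verified by enumeration on an equally likely sample space; a sketch in Python (the events A, B, C on two dice are hypothetical choices for illustration):

```python
from fractions import Fraction

# Equally likely sample space: the 36 outcomes of two dice.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]
A = {s for s in S if s[0] % 2 == 0}   # first die even (illustrative)
B = {s for s in S if sum(s) == 7}     # sum is 7
C = {s for s in S if s[0] == s[1]}    # doubles

P = lambda E: Fraction(len(E), len(S))

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(A & C) - P(B & C)
       + P(A & B & C))
print(lhs == rhs)  # True
```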
Probability Problems - 2
Probability Problems - 3
Probability Problems - 4
Example 14
For geometric probability, P[A] is a ratio of measures:
    P[A] = L(A)/L(S)   (Length)
    P[A] = A(A)/A(S)   (Area)
    P[A] = V(A)/V(S)   (Volume)
Solution
Let S = {all possible outcomes}, so |S| = 36.
Let E = {sum is 8}, F = {sum is 5}, T = {sum is 7}.
(a) Counting outcomes in the 6x6 table,
    P[E] = 5/36;  P[F] = 4/36;  P[T] = 6/36
Example 16
Let O = {sum is odd}. From the 6x6 table of outcomes (alternating odd/even sums), 18 of the 36 sums are odd, so
    P[O] = 18/36
Example 15
(b) Since F, T and E are mutually exclusive,
    P[F, T or E] = P[F ∪ T ∪ E] = P[F] + P[T] + P[E] = 4/36 + 6/36 + 5/36 = 15/36
Example (three binary signals)
The sample space has 8 equally likely outcomes, one for each combination of the bits X, Y, Z:

Outcome:  1  2  3  4  5  6  7  8
X:        0  0  0  0  1  1  1  1
Y:        0  0  1  1  0  0  1  1
Z:        0  1  0  1  0  1  0  1

With events A, B, C as indicated on the outcome table in the slides,
    P[A ∩ B] = 2/8;  P[B ∩ C] = 2/8;  P[A ∩ B ∩ C] = 1/8
Conditional Probability - 1
Conditional Probability Theory
We define
    P[A | B] = P[A ∩ B] / P[B],  provided P(B) ≠ 0,
satisfying all probability axioms.
Conditional Probability - 2
    P(A ∪ C | B) = P((A ∪ C) ∩ B) / P(B) = P(AB ∪ CB) / P(B)
But if AB ∩ CB = ∅, then P(AB ∪ CB) = P(AB) + P(CB). Hence
    P(A ∪ C | B) = P(AB)/P(B) + P(CB)/P(B) = P(A | B) + P(C | B)
Conditional Probability - 3
[Venn diagrams: events A, B, C, D with overlaps A ∩ C, A ∩ D, B ∩ C, B ∩ D; P[A|B] is large when A ∩ B nearly fills B, and P[A|B] is small when the overlap is small]
Properties:
1. If B ⊂ A, then A ∩ B = B, and
    P[A | B] = P[A ∩ B] / P[B] = P[B]/P[B] = 1, since A ∩ B = B.
Conditional Probability - 4
Conditional Probability - 5
2. If A ⊂ B, then A ∩ B = A, and
    P[A | B] = P[A ∩ B] / P[B] = P[A]/P[B] ≥ P[A],
since P[B] ≤ 1.
Example 17
Given P[A] = 1/2, P[B] = 1/3, and P[A ∩ B] = 1/4.
Solution
a) Find P[A|B]:
    P[A | B] = P[A ∩ B] / P[B] = (1/4)/(1/3) = 3/4
b) Find P[B|A]:
    P[B | A] = P[B ∩ A] / P[A] = (1/4)/(1/2) = 1/2
c) Find P[A ∪ B]:
    P[A ∪ B] = P[A] + P[B] - P[A ∩ B] = 1/2 + 1/3 - 1/4 = 7/12
d) Find P[A^c | B^c]:
    P[A^c | B^c] = P[A^c ∩ B^c] / P[B^c]
But P[B^c] = 1 - P[B] = 1 - 1/3 = 2/3, and since A^c ∩ B^c = (A ∪ B)^c,
    P[A^c ∩ B^c] = 1 - P[A ∪ B] = 1 - 7/12 = 5/12
so P[A^c | B^c] = (5/12)/(2/3) = 5/8.
e) Find P[B^c | A^c]:
    P[B^c | A^c] = P[B^c ∩ A^c] / P[A^c] = (5/12)/(1/2) = 5/6
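Example 17 can be reproduced with exact rational arithmetic; a sketch in Python:

```python
from fractions import Fraction

# Given values from Example 17.
PA, PB, PAB = Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)

P_A_given_B = PAB / PB                  # 3/4
P_B_given_A = PAB / PA                  # 1/2
P_AuB = PA + PB - PAB                   # 7/12
P_AcBc = 1 - P_AuB                      # 5/12, since (A ∪ B)^c = A^c ∩ B^c
P_Ac_given_Bc = P_AcBc / (1 - PB)       # 5/8
P_Bc_given_Ac = P_AcBc / (1 - PA)       # 5/6

print(P_A_given_B, P_B_given_A, P_AuB, P_Ac_given_Bc, P_Bc_given_Ac)
```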
Example 18
Let
    C+ = {has Cancer};  C- = {no Cancer};  R = {positive reaction}
Therefore,
    P[R|C+] = 0.9;  P[R|C-] = 0.05;  P[C+] = 0.01;  P[C-] = 0.99
By Bayes' rule, with R = (R ∩ C+) ∪ (R ∩ C-),
    P[C+ | R] = P[R|C+] P[C+] / (P[R|C+] P[C+] + P[R|C-] P[C-])
              = (0.9)(0.01) / ((0.9)(0.01) + (0.05)(0.99))
              ≈ 0.154
Independence
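Example 18 can be reproduced numerically; a sketch in Python using the same given probabilities:

```python
# Posterior probability of cancer given a positive test (Bayes' rule).
p_c = 0.01           # P[C+]
p_r_given_c = 0.90   # P[R | C+]
p_r_given_nc = 0.05  # P[R | C-]

p_r = p_r_given_c * p_c + p_r_given_nc * (1 - p_c)  # total probability
p_c_given_r = p_r_given_c * p_c / p_r               # Bayes' rule

print(round(p_c_given_r, 3))  # 0.154
```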
Independence - 1
Two events A and B are independent if
    P[A ∩ B] = P[A] P[B]
It is easy to show that if A, B are independent, then
    (A, B^c), (A^c, B), (A^c, B^c)
are all independent pairs.
Independence - 2
If A and B are independent,
    P[A | B] = P[A ∩ B]/P[B] = P[A]P[B]/P[B] = P[A]
Independence - 3
In general,
    P[A ∩ B] = P[A | B] P[B] = P[B | A] P[A]
so
    P[B | A] = P[A ∩ B] / P[A]
and
    P[A | B] = P[B | A] P[A] / P[B]   (Bayes' Theorem)
Example 19
[Deck of 52 cards: four suits (Club, Diamond, Heart, Spade), each with 13 cards {Ace, 2, ..., 10, Jack, Queen, King}]
Example
For each suit the sample space consists of ace, two, ..., ten, jack, queen, king, indicated as {1, 2, ..., 13}
Let A = {king is drawn}, B = {club is drawn}
Describe the events:
a) A ∪ B = {either king or club (or both, i.e., king of clubs)}
b) A ∩ B = {both king and club (king of clubs)}
c) Since B = {clubs}, B^c = {not club} = {hearts, diamonds, spades}
Example
Solution:
    P[A] = 4/52;  P[B] = 8/52;  P[C] = 13/52
Also
    P[A ∩ B] = 0;  P[A ∩ C] = 1/52;  P[B ∩ C] = 2/52
Therefore
    P[A] P[C] = (4/52)(13/52) = 1/52 = P[A ∩ C], so A and C are independent;
    P[B] P[C] = (8/52)(13/52) = 2/52 = P[B ∩ C], so B and C are independent;
    P[A] P[B] = 32/(52 · 52) ≠ 0 = P[A ∩ B], so A and B are not independent.
Partition Law
Bayes Rule
Law of Total Probability
Introduction to Markov Chains
Counting Techniques
Sampling of Different Kinds
1. Sampling with replacement and with ordering
2. Sampling without replacement and with ordering
3. Sampling without replacement and without ordering
4. Sampling with replacement and without ordering
Partition - 1
Partition (Law of Total Probability)
Partition - 2
Partition - 3
[Venn diagram: S partitioned into B1, B2, B3, ..., Bn-1, Bn]
If B1, B2, ..., Bn partition S, then for any event A,
    A = A ∩ S = A ∩ (B1 ∪ B2 ∪ ... ∪ Bn)
      = (A ∩ B1) ∪ (A ∩ B2) ∪ ... ∪ (A ∩ Bn)
Since P[A ∩ Bk] = P[A|Bk] P[Bk], we may write the probability as
    P[A] = P[A ∩ B1] + P[A ∩ B2] + ... + P[A ∩ Bn]
Partition - 4
Hence
    P[A] = Σ_{k=1}^n P[A ∩ Bk] = Σ_{k=1}^n P[A|Bk] P[Bk]
Example 20
    P[E] = P[E|E1]P[E1] + P[E|E2]P[E2] + P[E|E3]P[E3] + P[E|E4]P[E4]
         = (0.50)(0.3) + (0.30)(0.25) + (0.10)(0.25) + (0.02)(0.20)
         = 0.254
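Example 20's total-probability computation, as a sketch in Python (the two lists hold the conditional probabilities P[E|E_k] and the partition probabilities P[E_k] from the example):

```python
# Law of total probability: P[E] = sum_k P[E|E_k] P[E_k].
cond = [0.50, 0.30, 0.10, 0.02]    # P[E | E_k]
prior = [0.30, 0.25, 0.25, 0.20]   # P[E_k] (a partition: these sum to 1)

p_e = sum(c * p for c, p in zip(cond, prior))
print(round(p_e, 3))  # 0.254
```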
Bayes Rule
Bayes Rule - 1
Bayes Rule - 2
Bayes Rule:
If the events B1, B2, ..., Bn constitute a partition of the sample space S such that P[Bk] ≠ 0, k = 1, 2, ..., n, then for any event A in S such that P[A] ≠ 0,
    P[Bk | A] = P[A ∩ Bk] / P[A] = P[A|Bk] P[Bk] / Σ_{k=1}^n P[A|Bk] P[Bk]
Proof:
By the definition of conditional probability,
    P[Bk | A] = P[A ∩ Bk] / P[A]
and then using the partition law (law of total probability) for the denominator, we obtain
    P[Bk | A] = P[A ∩ Bk] / Σ_{k=1}^n P[A ∩ Bk]
Markov Chains
"Our brains are just not wired to do probability problems very well."
- Persi Diaconis
A Markov chain is described by its one-step transition probabilities
    Pij = P[X_{n+1} = j | X_n = i],  i, j = 0, 1, 2, ...
collected in the transition matrix
    P = [ P00  P01  ...  P0M ]
        [ P10  P11  ...  P1M ]
        [ ...              ]
        [ PM0  PM1  ...  PMM ]
with Σ_j Pij = 1 for each i = 0, 1, 2, ...
The Markov property states that
    P[X_n = j_n | X_{n-1} = j_{n-1}, ..., X_1 = j_1, X_0 = j_0] = P[X_n = j_n | X_{n-1} = j_{n-1}]
Example 28
Solution
The sample space of this experiment consists of sequences of 0s and 1s.
Each possible sequence corresponds to a path through the "trellis" diagram shown. The nodes in the diagram denote the box used in the nth sub-experiment, and the labels on the branches denote the outcome of a sub-experiment. Thus the path 0011 corresponds to the sequence: the coin toss was heads, so the first draw was from box 0; the outcome of the first draw was 0, so the second draw was from box 0; the outcome of the second draw was 1, so the third draw was from box 1; and the outcome of the third draw was 1, so the fourth draw is from box 1.
Find P[0011]:
    P[0011] = P[1|1] P[1|0] P[0|0] P[0]
            = (5/6)(1/3)(2/3)(1/2)
Counting Techniques
"But to us, probability is the very guide of life."
- Bishop J. Butler
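The trellis computation can be coded as a chain-rule product; a sketch in Python (the value P[0|1] = 1/6 is not stated above and is assumed here by normalization of the box-1 transitions):

```python
from fractions import Fraction

# P[0011] = P[0] * P[0|0] * P[1|0] * P[1|1] by the chain rule.
F = Fraction
p0 = F(1, 2)                                   # fair coin picks box 0 first
trans = {('0', '0'): F(2, 3), ('0', '1'): F(1, 3),
         ('1', '1'): F(5, 6), ('1', '0'): F(1, 6)}  # P[0|1] assumed = 1 - 5/6

seq = '0011'
p = p0
for prev, nxt in zip(seq, seq[1:]):
    p *= trans[(prev, nxt)]

print(p)  # 5/54
```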
Counting Techniques - 1
Counting Techniques - 2
Counting Techniques - 3
Counting Techniques - 4
Counting Techniques - 5
Counting Techniques - 6
If sub-experiment 1 has n1 outcomes {a11, a12, ..., a1n1}, ..., and sub-experiment k has nk outcomes {ak1, ak2, ..., aknk}, the combined experiment has n1 n2 ... nk outcomes (total elements in the 1st experiment times ... times total elements in the last experiment).
Counting Techniques - 7
Counting Techniques - 8
Permutations:
    P(n, n) = n(n-1)(n-2)...(n-n+1) = n!
    P(n, k) = n(n-1)(n-2)...(n-k+1)
            = [n(n-1)(n-2)...(n-k+1)](n-k)! / (n-k)!
            = n! / (n-k)!
Counting Techniques - 9
Counting Techniques - 10
Stirling's approximation:
    n! ~ (n/e)^n √(2πn),  or equivalently  n! ~ √(2π) n^{n+1/2} e^{-n}
i.e.,
    lim_{n→∞} n! / [(n/e)^n √(2πn)] = 1
By definition, 0! = 1.
Example:
    P(25, 2) = 25! / (25-2)! = 25 · 24 = 600
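The permutation count and Stirling's approximation can be checked with the standard library; a sketch in Python:

```python
import math

# P(25, 2) = 25!/23! = 25 * 24
print(math.perm(25, 2))  # 600

# Stirling's approximation n! ~ (n/e)^n * sqrt(2*pi*n), checked at n = 12
n = 12
stirling = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
print(round(math.factorial(n) / stirling, 4))  # ratio close to 1
```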
Counting Techniques - 11
Counting Techniques - 12
Multinomial Coefficient
Suppose n distinct elements are divided into k different groups (k ≥ 2); for j = 1, ..., k, the j-th group contains exactly n_j elements, where n1 + n2 + ... + nk = n.
We want to determine the number of ways in which the n elements can be divided into k groups, i.e., how many ways can n distinguishable balls be distributed into k different boxes so that there are n_j balls in box j?
Choosing the groups in sequence,
    (n choose n1)(n - n1 choose n2)(n - n1 - n2 choose n3) ... (n_{k-1} + n_k choose n_{k-1})(n_k choose n_k)
Hence
    (n choose n1, n2, ..., nk) = n! / (n1! n2! ... nk!)
Counting Techniques - 13
Definition (multinomial theorem): For any numbers x1, x2, ..., xm and any positive integer n,
    (x1 + x2 + ... + xm)^n = Σ [n! / (k1! k2! ... km!)] x1^{k1} x2^{k2} ... xm^{km}
where the sum is over all nonnegative integers k1, ..., km with k1 + k2 + ... + km = n.
Counting Techniques - 14
    (n choose n1, n2, ..., nk) = (n choose n1)(n - n1 choose n2) ... (n - n1 - ... - n_{k-1} choose nk) = n! / (n1! n2! ... nk!)
Counting Techniques - 15
Counting Techniques - 16
Combinations (order does not matter): choosing k of n items,
    nCk = C(n, k) = (n choose k) = P(n, k)/k! = n! / (k!(n-k)!)
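Combinations and the multinomial coefficient (as a product of binomials, exactly as derived above) can be sketched in Python; the helper name `multinomial` is an illustrative choice:

```python
import math

def multinomial(*groups):
    """n!/(n1!...nk!) computed as (n choose n1)(n-n1 choose n2)..."""
    coeff, remaining = 1, sum(groups)
    for g in groups:
        coeff *= math.comb(remaining, g)  # choose the next group
        remaining -= g
    return coeff

print(math.comb(5, 2))       # 10
print(multinomial(2, 1, 1))  # 4!/(2!1!1!) = 12
```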
Examples
Repeated Trials - 1
Bernoulli Trial - 2
Bernoulli Trial - 3
For n independent Bernoulli trials with success probability p (q = 1 - p),
    P(X = k) = (n choose k) p^k q^{n-k},  k = 0, 1, ..., n,  and  Σ_{k=0}^n (n choose k) p^k q^{n-k} = 1.
Suppose for a given n and p we want to find the most likely value of k. From the figure below, the most probable value of k is the number which maximizes Pn(k). Taking the ratio of successive terms,
    Pn(k-1) / Pn(k) = [n! p^{k-1} q^{n-k+1} / ((n-k+1)!(k-1)!)] · [(n-k)! k! / (n! p^k q^{n-k})] = (k / (n-k+1)) (q/p)
Thus Pn(k) ≥ Pn(k-1) if k(1-p) ≤ (n-k+1)p, i.e., if k ≤ (n+1)p.
Hence the most likely value is k_max = ⌊(n+1)p⌋.
[Figure: Pn(k) for n = 12, p = 1/2]
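The mode formula k_max = ⌊(n+1)p⌋ can be checked against a brute-force maximization of the pmf; a sketch in Python for the figure's n = 12, p = 1/2:

```python
import math

# Binomial pmf and its most likely value.
def pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 12, 0.5
brute = max(range(n + 1), key=lambda k: pmf(n, p, k))
print(brute, math.floor((n + 1) * p))  # 6 6
```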
Bernoullis Theorem - 1
Bernoullis Theorem - 2
Bernoullis Theorem - 3
Bernoullis Theorem - 4
Let p = P(A) and let k be the number of occurrences of A in n trials. Using the identities
    Σ_{k=0}^n k Pn(k) = np Σ_{i=0}^{n-1} [(n-1)! / ((n-1-i)! i!)] p^i q^{n-1-i} = np(p + q)^{n-1} = np
and
    Σ_{k=0}^n k^2 Pn(k) = Σ_{k=1}^n [n! / ((n-k)!(k-1)!)] k p^k q^{n-k} = n^2 p^2 + npq,
it follows that
    Σ_{k=0}^n (k - np)^2 Pn(k) = Σ k^2 Pn(k) - 2np Σ k Pn(k) + n^2 p^2 = npq.
Note that, for ε > 0, |k - np| ≥ nε is equivalent to (k - np)^2 ≥ n^2 ε^2, so
    npq = Σ_{k=0}^n (k - np)^2 Pn(k) ≥ Σ_{|k-np| ≥ nε} (k - np)^2 Pn(k) ≥ n^2 ε^2 P[|k - np| ≥ nε].
Hence
    P[|k/n - p| ≥ ε] ≤ pq / (n ε^2),
which tends to 0 as n → ∞ (Bernoulli's theorem, the weak law of large numbers).
Note
The expression
    (x + y)^n = Σ_{k=0}^n (n choose k) x^k y^{n-k}
is the binomial theorem. Useful identities for the binomial coefficients:
Symmetry:
    (n choose k) = (n choose n-k)
Factorial:
    (n choose k) = (n/k)(n-1 choose k-1)
Addition (Pascal's rule):
    (n choose r) = (n-1 choose r) + (n-1 choose r-1),  1 ≤ r ≤ n
Boundary values:
    (n choose 0) = (n choose n) = 1
Pascal's Triangle is built row by row from these identities:
    (0 choose 0)
    (1 choose 0) (1 choose 1)
    (2 choose 0) (2 choose 1) (2 choose 2)
    (3 choose 0) (3 choose 1) (3 choose 2) (3 choose 3)
Random Variable
"The degree of understanding a phenomenon is inversely proportional to the number of variables used for its description"
- Unknown Physicist
[Figure: a mapping of S = {HH, HT, TH, TT} into the real line R, e.g., by the number of heads]
[Figure: a set A ⊂ S of outcomes s1, s2, ..., sk maps under X into an interval I ⊂ R^1, with P[X ∈ I] = P[A]]
Definition:
Suppose that (S, F, P) is a probability space in which S is not necessarily countable. A Random Variable, X, defined on this space is a function from S into the real line such that the set {ω | X(ω) ≤ x} ∈ F for every real x.
A Random Variable, X, defined on the probability space is a function that assigns a real number X(ω) to every random outcome ω ∈ S.
Translated, a Random Variable is a real-valued function that associates a real number with each element in the sample space.
Note:
The function that assigns a value to each outcome is fixed and deterministic, e.g., the number of heads in three tosses of a coin. However, the outcome of the experiment is not known. No matter how carefully a process is run, an experiment is performed, or a measurement is taken, there will be variability when the action is repeated.
If the outcome ω is already a numerical value, then we can make the assignment X(ω) = ω.
Example 20
Example 21
Example 22
The cumulative distribution function (CDF) is
    FX(x) = P[X ≤ x]                              (continuous)
    FX(x) = Σ_k P[X = xk] u(x - xk)               (discrete)
[Figure: a continuous CDF rising smoothly from 0 to 1, and a discrete staircase CDF on SX = {0, 1, 2}]
Properties of CDF - 1
1) 0 ≤ FX(x) ≤ 1
2) FX(∞) = 1
3) FX(-∞) = 0
4) FX(x1) ≤ FX(x2) for x1 ≤ x2 (nondecreasing)
5) FX(x) is continuous from the right, i.e., for any b, and for h > 0,
    FX(b) = lim_{h→0} FX(b + h) = FX(b+)
Properties of CDF - 2
7) P[X = b] = FX(b) - FX(b-) = 0 if FX(x) is continuous at b
8) P[X > x] = 1 - FX(x)
Example 24
Given that
    FX(x) = 1 - e^{-2x}, x ≥ 0;   FX(x) = 0, x < 0
then, for instance, FX(0) = 0 and FX(1/2) = 1 - e^{-1}.
Example 23 (number of heads in two coin tosses)
    FX(x) = 0,    x < 0      (# of heads < 0)
          = 1/4,  0 ≤ x < 1  (# of heads = 0)
          = 3/4,  1 ≤ x < 2
          = 1,    x ≥ 2      (# of heads ≤ 2)
Note that FX(x+) = FX(x), i.e.,
    FX(x+) = P[X ≤ x+] = P[X ≤ x] = FX(x)
while FX(x-) = P[X < x] ≤ P[X ≤ x].
Example 25
Given
    FX(x) = 0, x < 0;  = x^2/16, 0 ≤ x ≤ 4;  = 1, x > 4
interval probabilities follow from the CDF:
1) P[a < X ≤ b] = FX(b) - FX(a)
2) P[a ≤ X ≤ b] = FX(b) - FX(a) + P[X = a]
3) P[a ≤ X < b] = FX(b) - FX(a) + P[X = a] - P[X = b]
4) P[a < X < b] = FX(b) - FX(a) - P[X = b]
5) P[X > a] = 1 - FX(a)
6) P[X ≥ a] = 1 - P[X < a] = 1 - FX(a-)
For instance, FX(3/2) = P[X ≤ 3/2] = (3/2)^2/16, and P[X > 3/2] = 1 - FX(3/2).
Example 26
    P[·] = 1 - e^{-1/4} ≈ 0.2212
Properties of PDF
The probability density function is the derivative of the CDF:
    fX(x) = dFX(x)/dx                                  (continuous)
    pX(x) = Σ_k P[X = xk] δ(x - xk)                    (discrete)
Continuous PDF:
1) 0 ≤ fX(x)
2) ∫_{-∞}^{∞} fX(x) dx = FX(∞) = 1
3) FX(x) = ∫_{-∞}^{x} fX(t) dt
4) P[a < X ≤ b] = ∫_a^b fX(x) dx
Discrete PMF:
1) 0 ≤ P[X = xk] ≤ 1
2) Σ_k P[X = xk] = 1
3) F(x) = Σ_{xk ≤ x} P[X = xk]
4) P[a < X ≤ b] = Σ_{a < xk ≤ b} P[X = xk]
Example 27
    fX(x) = |x|, -1 ≤ x ≤ 1;  0, else
Solution
Verify that the density integrates to 1:
    ∫_{-1}^{1} |x| dx = ∫_{-1}^{0} (-x) dx + ∫_{0}^{1} x dx = 1/2 + 1/2 = 1
Example 28
    fX(x) = c e^{-|x|}, -∞ < x < ∞
Solution
First find c:
    1 = ∫_{-∞}^{∞} c e^{-|x|} dx = 2 ∫_0^∞ c e^{-x} dx = 2c   ⇒   c = 1/2
Then
    P[|X| ≤ v] = 2 ∫_0^v (1/2) e^{-x} dx = 1 - e^{-v}
Note: for any real number a, a- < a < a+, with a-, a+ arbitrarily close to a. In terms of the PDF,
1) P[a < X ≤ b] = ∫_{a+}^{b+} f(x) dx
2) P[a ≤ X ≤ b] = ∫_{a-}^{b+} f(x) dx
3) P[a ≤ X < b] = ∫_{a-}^{b-} f(x) dx
4) P[a < X < b] = ∫_{a+}^{b-} f(x) dx
5) P[X = a] = ∫_{a-}^{a+} f(x) dx
Conditional Distribution
From the definition of conditional probability, we obtain the definition of the conditional CDF:
    FX(x|B) = P[X ≤ x | B] = P[{X ≤ x} ∩ B] / P[B] = P[A ∩ B] / P[B]
where A is the event {X ≤ x}.
Conditional Density
The conditional PDF follows by differentiation:
    fX(x|B) = dFX(x|B) / dx
Properties:
1) 0 ≤ F(x|B) ≤ 1
2) F(∞|B) = 1, and ∫_{-∞}^{∞} f(x|B) dx = 1
3) F(-∞|B) = 0, and F(x|B) = ∫_{-∞}^{x} f(y|B) dy
4) P[x1 < X ≤ x2 | B] = F(x2|B) - F(x1|B) = ∫_{x1}^{x2} f(y|B) dy, if x1 ≤ x2
Important Discrete Random Variables
(Bernoulli, Binomial, Geometric, Pascal, Poisson, Hypergeometric, Zeta)
Bernoulli RV, X ~ B(1, p): SX = {0, 1},
    pX(1) = p,  pX(0) = 1 - p
Binomial RV, X ~ B(n, p): the number of successes in n Bernoulli trials,
    pX(k) = (n choose k) p^k (1-p)^{n-k},  k = 0, 1, ..., n
Geometric RV, X ~ G(1, p): the number of trials up to and including the first success,
    pX(k) = P[X = k] = p(1-p)^{k-1},  k = 1, 2, ...
Pascal (negative binomial) RV: the number of trials up to and including the r-th success,
    pX(k) = (k-1 choose r-1) p^r (1-p)^{k-r},  k = r, r+1, ...;   pX(k) = 0,  k = 0, 1, ..., r-1
Poisson RV: with λ = average number of events per unit of time,
    pX(k) = (λ^k / k!) e^{-λ},  k = 0, 1, 2, ...
The Poisson pmf arises from the binomial pmf for n large and p small, with λ = np:
    P[X = k] = (n choose k) p^k (1-p)^{n-k}
             = [n! / ((n-k)! k!)] (λ/n)^k (1 - λ/n)^{n-k}
             = [n(n-1)...(n-k+1) / n^k] (λ^k / k!) (1 - λ/n)^{n-k}
Since (1 - λ/n)^n → e^{-λ}, (1 - λ/n)^{-k} → 1, and n(n-1)...(n-k+1)/n^k → 1 as n → ∞, hence
    P[X = k] → (λ^k / k!) e^{-λ}
The Poisson PDF and CDF can be written
    fX(x) = Σ_{k=0}^{∞} e^{-λ} (λ^k / k!) δ(x - k),   FX(x) = Σ_{k=0}^{∞} e^{-λ} (λ^k / k!) u(x - k)
Zeta RV:
    pX(k) = C / k^{α+1},  k = 1, 2, ...,  where C = [Σ_{k=1}^{∞} (1/k)^{α+1}]^{-1}
Hypergeometric RV, X ~ h(x; n, a, b):
    pX(x) = (a choose x)(b choose n-x) / (a+b choose n),  x = 0, 1, ..., a
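The Poisson limit can be seen numerically; a sketch in Python comparing the binomial and Poisson pmfs for large n and small p (the particular values n = 1000, p = 0.003 are illustrative):

```python
import math

# Poisson approximation to the binomial: lam = n * p.
n, p = 1000, 0.003
lam = n * p

for k in range(5):
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    print(k, round(binom, 5), round(poisson, 5))  # nearly equal columns
```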
Summary of discrete distributions:
1. Bernoulli: P(X=0) = q, P(X=1) = p.
2. Binomial: X ~ B(n,p): P(X=k) = (n choose k) p^k q^{n-k}, k = 0, 1, ..., n.
3. Poisson: X ~ P(λ): P(X=k) = e^{-λ} λ^k / k!, k = 0, 1, 2, ...
4. Hypergeometric: P(X=k) = (m choose k)(N-m choose n-k) / (N choose n), max(0, m+n-N) ≤ k ≤ min(m, n).
5. Geometric: X ~ g(p): P(X=k) = p q^k, k = 0, 1, 2, ..., with q = 1 - p.
6. Negative binomial (Pascal): P(X=k) = (k-1 choose r-1) p^r q^{k-r}, k = r, r+1, ...
7. Discrete-Uniform: P(X=k) = 1/N, k = 1, 2, ..., N.
Important Continuous Random Variables:
Uniform RV, Gaussian (Normal) RV, Cauchy RV, Rayleigh RV, Nakagami RV, Beta RV, Chi-squared RV, Pareto RV, Exponential RV, Gamma RV, Laplacian RV, Rician RV, Weibull RV, Log-normal RV, Erlang RV, Student's t and F distributions, etc.
Uniform RV
A uniform RV on [a, b] is given by
    fX(x) = 1/(b-a), a ≤ x ≤ b;  0, otherwise
    FX(x) = 0, x < a;  (x-a)/(b-a), a ≤ x ≤ b;  1, b < x
Exponential RV
    fX(x) = λ e^{-λ(x-a)}, x ≥ a;  0, x < a
    FX(x) = 1 - e^{-λ(x-a)}, x ≥ a;  0, x < a
The exponential RV is memoryless (take a = 0):
    P[X > s + t | X > t] = P[X > s]
Proof:
From the conditional probability definition, one obtains
    P[X > s + t | X > t] = P[X > s + t, X > t] / P[X > t]
                         = P[X > s + t] / P[X > t]
                         = P[X > s] P[X > t] / P[X > t]
                         = P[X > s]
Rayleigh RV
    fX(x) = λ(x-a) e^{-λ(x-a)^2/2}, x ≥ a;  0, x < a
    FX(x) = 1 - e^{-λ(x-a)^2/2}, x ≥ a;  0, x < a
Notes (for a = 0):
- The Rayleigh RV with parameter λ = 1 corresponds to the Chi-squared RV with 2 degrees of freedom
- The square of a Rayleigh RV with parameter λ corresponds to the exponential RV with parameter 1/(2λ)
- The Rayleigh PDF and CDF are commonly used in communications
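The memoryless property can be checked numerically from the survival function P[X > x] = e^{-λx}; a sketch in Python (the parameter values λ, s, t are illustrative):

```python
import math

# Memorylessness of the exponential RV (a = 0): P[X > s+t | X > t] = P[X > s].
lam, s, t = 0.7, 1.5, 2.0
survival = lambda x: math.exp(-lam * x)   # P[X > x]

lhs = survival(s + t) / survival(t)       # conditional probability
rhs = survival(s)
print(abs(lhs - rhs) < 1e-12)  # True
```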
Rician RV
    fX(x) = (x/σ^2) exp[-(x^2 + a^2)/(2σ^2)] I_0(ax/σ^2), x ≥ 0;  0, x < 0
where I_0 is the zeroth-order modified Bessel function of the first kind. The CDF can be written with the Marcum Q-function:
    FX(x) = 1 - Q(a/σ, x/σ),  where  Q(a, x) = ∫_x^∞ t exp[-(t^2 + a^2)/2] I_0(at) dt
Nakagami-m RV
    fX(x) = (2/Γ(m)) (m/Ω)^m x^{2m-1} exp(-m x^2/Ω), x ≥ 0;  0, x < 0
    FX(x) = P[X ≤ x] = G(m, m x^2/Ω),  where G(·,·) is the incomplete gamma function
Special cases:
    m = 1: Rayleigh PDF
    m = 0.5: one-sided Gaussian
Cauchy RV
    fX(x) = (1/π) α/(α^2 + x^2), -∞ < x < ∞,  α > 0
    FX(x) = 1/2 + (1/π) tan^{-1}(x/α)
Note that
    Γ(x) = ∫_0^∞ t^{x-1} e^{-t} dt  (Gamma function),  with  Γ(m+1) = m!, m = 1, 2, ...,  and  Γ(1/2) = √π
Laplacian RV
    fX(x) = (α/2) e^{-α|x-a|}, -∞ < x < ∞
    FX(x) = (1/2) e^{α(x-a)}, x < a;   1 - (1/2) e^{-α(x-a)}, x ≥ a
Gamma RV
    fX(x) = [λ^α / Γ(α)] x^{α-1} e^{-λx}, x ≥ 0
Erlang RV (Gamma with integer α = n)
    FX(x) = 1 - e^{-λx} Σ_{k=0}^{n-1} (λx)^k / k!
Weibull RV
    fX(x) = αβ(x-a)^{β-1} exp[-α(x-a)^β], x ≥ a, α > 0, β > 0;  0, x < a
    FX(x) = 1 - exp[-α(x-a)^β], x ≥ a;  0, x < a
Beta RV
    fX(x) = [Γ(α+β) / (Γ(α)Γ(β))] x^{α-1} (1-x)^{β-1}, 0 < x < 1;  0, otherwise
Chi-squared RV (n degrees of freedom)
    fX(x) = [1 / (2^{n/2} Γ(n/2))] x^{n/2 - 1} exp(-x/2), x ≥ 0
    FX(x) = G(n/2, x/2),  where G(·,·) is the incomplete gamma function
F distribution (m, n degrees of freedom)
    fX(x) = [Γ((m+n)/2) m^{m/2} n^{n/2} / (Γ(m/2) Γ(n/2))] x^{m/2 - 1} (n + mx)^{-(m+n)/2}, x ≥ 0
Pareto Variable
    fX(x) = α / x^{α+1}, x ≥ 1;  0, otherwise
Student's t RV (n degrees of freedom)
    fX(x) = [Γ((n+1)/2) / (√(nπ) Γ(n/2))] (1 + x^2/n)^{-(n+1)/2}, -∞ < x < ∞
Gaussian (Normal) RV - 1
[Figure: bell-shaped histogram of scores]
Gaussian (Normal) RV - 3
    fX(x) = [1/√(2πσ^2)] exp[-(x-μ)^2/(2σ^2)], -∞ < x < ∞, σ > 0
written X ~ N(μ, σ^2).
[Figure: the normal curve with about 68% of the probability within 1 standard deviation of the mean (34% on each side), about 95% within 2 standard deviations (13.5% in each band between 1σ and 2σ), 2.4% between 2σ and 3σ on each side, and 0.1% beyond ±3σ]
Gaussian (Normal) RV - 4
Gaussian (Normal) RV - 5
[Figure: Gaussian curves with μ1 ≠ μ2 and σ1 ≠ σ2]
- It is the most important of all densities and models more different random occurrences than any other PDF
- It is the most widely used model of noise in communication systems
Gaussian (Normal) RV - 6
Gaussian (Normal) RV - 7
The Gaussian CDF is
    FX(x) = P[X ≤ x] = ∫_{-∞}^{x} [1/√(2πσ^2)] exp[-(t-μ)^2/(2σ^2)] dt
Gaussian (Normal) RV - 8
Special cases: X ~ N(μ, σ^2), X ~ N(0, σ^2), and X ~ N(0, 1) (the standard normal, μ = 0, σ = 1).
Gaussian (Normal) RV - 10
For the standard normal, define
    Φ(x) = ∫_{-∞}^{x} (1/√(2π)) exp(-y^2/2) dy
with (by a change of variable y → -y)
    Φ(-x) = 1 - Φ(x)
Gaussian (Normal) RV - 11
Hence, for X ~ N(μ, σ^2),
    FX(a) = P[X ≤ a] = P[(X-μ)/σ ≤ (a-μ)/σ] = Φ((a-μ)/σ)
Gaussian (Normal) RV - 12
The Q function is the complementary CDF of the standard normal:
    Q(x) = 1 - FX(x) = 1 - Φ(x) = ∫_{x}^{∞} (1/√(2π)) exp(-y^2/2) dy
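Since Q(x) = (1/2) erfc(x/√2), the Q function can be evaluated with the standard library; a sketch in Python:

```python
import math

# Gaussian Q function via the complementary error function.
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

print(round(Q(0), 4), round(Q(1), 4), round(Q(1.2), 4))  # 0.5 0.1587 0.1151
```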
Gaussian (Normal) RV - 13
Gaussian (Normal) RV - 14
Properties of the Q function (for X ~ N(μ, σ^2)):
    Q(-x) = 1 - Q(x);  Q(x) = 1 - Φ(x);  Q(0) = 1/2
1) P[X > a] = Q((a-μ)/σ)
2) P[X ≤ a] = 1 - P[X > a] = 1 - Q((a-μ)/σ)
3) P[a < X ≤ b] = FX(b) - FX(a) = Q((a-μ)/σ) - Q((b-μ)/σ)
Gaussian (Normal) RV - 15
Lognormal RV
    fX(x) = [1 / (√(2π) σ (x-a))] exp{-[ln(x-a) - b]^2 / (2σ^2)}, x ≥ a;  0, x < a
    FX(x) = Φ([ln(x-a) - b]/σ), x ≥ a;  0, x < a
Notes:
- The sum of squares of n independent N(0,1) RVs is Chi-squared with n degrees of freedom
- The ratio of two independent unit Gaussian RVs, N(0,1), is the standard Cauchy
- The sample mean of n independent and identically distributed RVs, each with mean m and variance σ^2, tends to be Gaussian distributed with mean m and variance σ^2/n, as n → ∞
Example 29a
Let X ~ N(3, 9), i.e., μ = 3, σ = 3.
(a)
    P[2 < X ≤ 5] = P[(2-3)/3 < z ≤ (5-3)/3] = P[-1/3 < z ≤ 2/3]
                 = Φ(2/3) - Φ(-1/3) = Φ(2/3) - [1 - Φ(1/3)] = Φ(2/3) + Φ(1/3) - 1
(using Φ(-x) = 1 - Φ(x), or equivalently Q(x) = 1 - Φ(x))
b)
    P[X > 0] = P[(X-3)/3 > (0-3)/3] = P[z > -1] = 1 - Φ(-1) = Φ(1) = 0.8413
c)
    P[|X - 3| > 6] = P[X - 3 > 6] + P[X - 3 < -6]
                   = P[z > 2] + P[z < -2] = [1 - Φ(2)] + [1 - Φ(2)] = 2[1 - Φ(2)]
                   = 2(1 - 0.9772) = 0.0456
Example 29b
    P[-1 < z ≤ 1.2] = 1 - Q(1) - Q(1.2) = 1 - 0.1587 - 0.1151 = 0.7262
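Parts (b) and (c) of Example 29a can be checked with the standard library's normal distribution; a sketch in Python:

```python
from statistics import NormalDist

# Example 29a with X ~ N(3, 9), i.e., mu = 3, sigma = 3.
X = NormalDist(mu=3, sigma=3)

p_b = 1 - X.cdf(0)                  # P[X > 0]
p_c = (1 - X.cdf(9)) + X.cdf(-3)    # P[|X - 3| > 6]
print(round(p_b, 4), round(p_c, 4))  # 0.8413 0.0455
```

The slides' 0.0456 comes from the rounded table value Φ(2) ≈ 0.9772; the unrounded answer is ≈ 0.0455.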
Statistical Properties of RV
Statistical Properties of Random Variables
"Rowe's Rule: the odds are six to five that the light at the end of the tunnel is the headlight of an oncoming train."
- Paul Dickson
Expectation of a RV - 2
The expected value (mean) of X is
    m_x = E[X] = ∫_{-∞}^{∞} x fX(x) dx      (continuous)
    m_x = E[X] = Σ_{k=1}^{n} xk p(xk)        (discrete)
[Figures (a), (b): discrete pmf values f(x1), ..., f(x5) with E[X] marked as the balance point]
Expectation of a RV - 3
Expectation of a RV - 4
1) E[c] = c, c is a constant
2) E[cX] = c E[X]
3) E[X + c] = E[X] + c
4) E[X + Y] = E[X] + E[Y]
5) E[X] ≤ E[Y], if P[X ≤ Y] = 1
6) |E[X]| ≤ E[|X|], since |∫ x fX(x) dx| ≤ ∫ |x| fX(x) dx
7) E[X1 + X2 + ... + XN] = E[N] E[X] (for a random number N of i.i.d. terms)
Expectation of a Function
Given E[X] = ∫ x fX(x) dx, what is E[Y] = E[g(X)]?
- First find the PDF of Y and then use the definition to find E[Y], or
- Calculate the expectation directly using the definition:
    E[g(X)] = ∫_{-∞}^{∞} g(x) fX(x) dx      (continuous)
    E[g(X)] = Σ_k g(xk) P[X = xk]            (discrete)
Example 30a
For X uniform on (a, b):
    E[X] = (1/(b-a)) ∫_a^b x dx = (1/(b-a)) (b^2 - a^2)/2 = (a+b)/2
Also
    E[X^2] = ∫ x^2 fX(x) dx = (1/(b-a)) ∫_a^b x^2 dx = (1/(b-a)) (b^3 - a^3)/3
Example 30b
    fX(x) = 1/10, 0 ≤ x ≤ 10;  0, else
    E[X] = ∫ x fX(x) dx = ∫_0^10 x (1/10) dx = [x^2/20]_0^10 = 5
Example 31
    fX(x) = K e^{-bx}, x > 0;  0, x ≤ 0
First find K:
    ∫_0^∞ K e^{-bx} dx = K/b = 1   ⇒   K = b
Then, integrating by parts,
    E[X] = ∫_0^∞ x b e^{-bx} dx = [-x e^{-bx}]_0^∞ + ∫_0^∞ e^{-bx} dx = 1/b
Example 32
Solution
    E[X] = m_X = Σ_k xk p(xk)
The mean-square value is
    E[X^2] = ∫_{-∞}^{∞} x^2 fX(x) dx      (continuous)
    E[X^2] = Σ_{k=1}^{n} xk^2 p(xk)        (discrete)
and the root-mean-square value is
    X_RMS = √(E[X^2])
Example 33
The n-th moment is
    E[X^n] = ∫_{-∞}^{∞} x^n fX(x) dx
and the n-th central moment is
    E[(X - m_x)^n] = ∫_{-∞}^{∞} (x - m_x)^n fX(x) dx
Special cases:
- when n = 1, the first central moment is zero
- when n = 2, the 2nd central moment is called the variance, i.e., the variance is the second central moment:
    σ_x^2 = var[X] = E[(X - m_x)^2] = ∫ (x - m_x)^2 fX(x) dx    (continuous)
                   = Σ_k (xk - m_x)^2 P[X = xk]                  (discrete)
Solution: for fX(x) = λ e^{-λx}, x ≥ 0, using the reduction formula
    ∫ x^m e^{-ax} dx = -(x^m / a) e^{-ax} + (m/a) ∫ x^{m-1} e^{-ax} dx,
    E[X] = ∫_0^∞ x λ e^{-λx} dx = 1/λ
    E[X^2] = ∫_0^∞ x^2 λ e^{-λx} dx = 2/λ^2
    var[X] = E[X^2] - (E[X])^2 = 2/λ^2 - 1/λ^2 = 1/λ^2
Properties of Variance - 1

\mathrm{var}[X] = \sigma_X^2 = E[(X - m_X)^2]
= E[X^2 - 2 m_X X + m_X^2]
= E[X^2] - 2 m_X E[X] + m_X^2
= E[X^2] - (E[X])^2

If E[X] = \mu, then E[aX + b] = a\mu + b, and

\mathrm{Var}[aX + b] = E[(aX + b - a\mu - b)^2] = E[a^2 (X - \mu)^2] = a^2\, E[(X - \mu)^2] = a^2\, \mathrm{Var}[X]

Standard deviation: \sigma_X = \sqrt{\mathrm{var}[X]}
Properties of Variance - 2

For independent X_1 and X_2, \mathrm{var}[X_1 + X_2] = \mathrm{var}[X_1] + \mathrm{var}[X_2].

Proof:
Suppose that n = 2, E[X_1] = \mu_1, E[X_2] = \mu_2; then E[X_1 + X_2] = \mu_1 + \mu_2 and

\mathrm{var}[X_1 + X_2] = E[(X_1 + X_2 - \mu_1 - \mu_2)^2]
= E[(X_1 - \mu_1)^2] + E[(X_2 - \mu_2)^2] + 2E[(X_1 - \mu_1)(X_2 - \mu_2)]
= \mathrm{var}[X_1] + \mathrm{var}[X_2] + 2E[(X_1 - \mu_1)(X_2 - \mu_2)]

By independence, E[(X_1 - \mu_1)(X_2 - \mu_2)] = E[X_1 - \mu_1]\,E[X_2 - \mu_2] = (\mu_1 - \mu_1)(\mu_2 - \mu_2) = 0, hence the variances add.

Example 34
For X uniform on (a, b), use \mathrm{Var}[X] = E[X^2] - (E[X])^2 with E[X^2] = \frac{b^3 - a^3}{3(b-a)} computed earlier:

\mathrm{Var}[X] = \frac{b^3 - a^3}{3(b-a)} - \left(\frac{a+b}{2}\right)^2
Example 34 (continued)
You can also use the brute-force method:

\mathrm{Var}[X] = E[(X - m_X)^2] = \int_a^b \left(x - \frac{a+b}{2}\right)^2 \frac{1}{b-a}\,dx
= \frac{1}{3(b-a)} \left(x - \frac{a+b}{2}\right)^3 \Big|_a^b
= \frac{1}{3(b-a)} \left[\left(\frac{b-a}{2}\right)^3 - \left(-\frac{b-a}{2}\right)^3\right]
= \frac{1}{3(b-a)} \cdot \frac{2(b-a)^3}{8} = \frac{(b-a)^2}{12}

Example 35

f_X(x) = \begin{cases} K e^{-bx}, & x \ge 0 \\ 0, & x < 0 \end{cases}

(so K = b from Example 31). Find the variance.
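A quick numerical sketch of the uniform-variance result (not from the slides; a = 1, b = 4 are arbitrary):

```python
import numpy as np

# For X ~ Uniform(a, b): Var[X] = E[X^2] - (E[X])^2 = (b-a)^2/12.
rng = np.random.default_rng(1)
a, b = 1.0, 4.0

var_formula = (b - a) ** 2 / 12                                   # 0.75
var_moments = (b**3 - a**3) / (3 * (b - a)) - ((a + b) / 2) ** 2  # same value
var_mc = rng.uniform(a, b, 1_000_000).var()                       # Monte Carlo

print(var_formula, var_moments, var_mc)
```

The moment-based and brute-force expressions agree exactly; the Monte Carlo estimate matches to a few decimal places.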
Example 35 (solution)
E[X] = \int_0^{\infty} x\, b\, e^{-bx}\,dx = \frac{1}{b} (from Example 31), and integrating by parts twice,

E[X^2] = b\int_0^{\infty} x^2 e^{-bx}\,dx = \left[-x^2 e^{-bx} - \frac{2x}{b} e^{-bx} - \frac{2}{b^2} e^{-bx}\right]_0^{\infty} = \frac{2}{b^2}

Hence \mathrm{Var}[X] = E[X^2] - (E[X])^2 = \frac{2}{b^2} - \frac{1}{b^2} = \frac{1}{b^2}.
Characteristic Function - 1

\Phi_X(\omega) = E[e^{j\omega X}] = \begin{cases} \int_{-\infty}^{\infty} e^{j\omega x} f_X(x)\,dx, & \text{continuous} \\ \sum_k e^{j\omega x_k}\, p_X(x_k), & \text{discrete} \end{cases}

- The characteristic function will exist only if the integral or the sum specified above converges
- \Phi_X(\omega) can be interpreted as the expectation of a function of X, namely Y = e^{j\omega X}, with \omega left unspecified
- \Phi_X(\omega) can also be interpreted as the Fourier Transform (FT) of the PDF f_X(x) of the random variable X with the sign of \omega reversed

Characteristic Function - 2
Inversion (a Fourier pair with the sign reversed):

f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \Phi_X(\omega)\, e^{-j\omega x}\,d\omega, \qquad f_X(x) \leftrightarrow \Phi_X(\omega)

compare with the ordinary pair X(f) = F\{x(t)\} = \int x(t) e^{-j2\pi ft}\,dt and x(t) = F^{-1}\{X(f)\} = \int X(f) e^{j2\pi ft}\,df.

Characteristic Function - 3
Moment generation:

\frac{d}{d\omega}\Phi_X(\omega)\Big|_{\omega=0} = \int jx\, f_X(x)\, e^{j\omega x}\,dx\Big|_{\omega=0} = j\int x f_X(x)\,dx = j\,E[X]

\frac{d^2}{d\omega^2}\Phi_X(\omega)\Big|_{\omega=0} = \int (jx)^2 f_X(x)\,dx = j^2\, E[X^2]

and in general

\frac{d^n}{d\omega^n}\Phi_X(\omega)\Big|_{\omega=0} = j^n \int x^n f_X(x)\,dx = j^n\, E[X^n]
Characteristic Function - 4

E[X] = \frac{1}{j}\left[\frac{d}{d\omega}\Phi_X(\omega)\right]_{\omega=0}, \qquad
E[X^2] = \frac{1}{j^2}\left[\frac{d^2}{d\omega^2}\Phi_X(\omega)\right]_{\omega=0}

and in general

E[X^n] = \frac{1}{j^n}\left[\frac{d^n}{d\omega^n}\Phi_X(\omega)\right]_{\omega=0}

Example 37

f_X(x) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0 \\ 0, & x < 0 \end{cases}
\qquad \Phi_X(\omega) = \int_0^{\infty} \lambda e^{-\lambda x} e^{j\omega x}\,dx = \frac{\lambda}{\lambda - j\omega}

Hence, E[X] = \frac{1}{j}\,\Phi_X'(0) = \frac{1}{\lambda}.
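The moment theorem of Example 37 can be sketched numerically: differentiate \Phi_X(\omega) = \lambda/(\lambda - j\omega) at \omega = 0 by central finite differences and recover the first two moments (lam = 3 is an arbitrary choice):

```python
import numpy as np

# E[X^n] = (1/j^n) d^n Phi / dw^n at w = 0, for the exponential pdf.
lam = 3.0
phi = lambda w: lam / (lam - 1j * w)

h = 1e-4
d1 = (phi(h) - phi(-h)) / (2 * h)              # Phi'(0)
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2    # Phi''(0)

EX = (d1 / 1j).real        # should be 1/lam = 0.3333...
EX2 = (d2 / 1j**2).real    # should be 2/lam^2 = 0.2222...
print(EX, EX2)
```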
242
R|z e
S
|T e
M X (t ) E e
tX
-
n
tX
tX k
k 1
f X ( x)dx, t 0 continuous
P X xk
discrete
Hence,
t 2 E[ X 2 ] t 3 E[ X 3 ]
2!
3!
M X (0) 1
M Xn (0) E[ X n ]
(c) Prof. Okey Ugweje
OP
Qt 0
M X (t )
Property:
Y aX b MY (t ) ebt M X at
X M X (0) M 1X (0)
2
LM d
Ndt
E X
243
244
Example 38
Probability Generating Function (for a nonnegative integer-valued X):

G_X(z) = E[z^X] = \sum_{x=0}^{\infty} p_X(x)\, z^x, \qquad G_X(1) = \sum_{k=0}^{\infty} P[X = k] = 1

E[X] = \left[\frac{d}{dz} G_X(z)\right]_{z=1} = \sum_{x=0}^{\infty} x\, P[X = x]

Example 39

\left[\frac{d^2}{dz^2} G_X(z)\right]_{z=1} = \sum_{x=0}^{\infty} x(x-1)\, P[X = x] = E[X(X-1)] = E[X^2] - E[X]

so that

E[X^2] = \left[\frac{d^2}{dz^2} G_X(z)\right]_{z=1} + \left[\frac{d}{dz} G_X(z)\right]_{z=1}

More generally, \left[\frac{d^n}{dz^n} G_X(z)\right]_{z=1} = E[X(X-1)\cdots(X-n+1)] (the nth factorial moment).
Laplace Transform

L_X(s) = E[e^{-sX}] = \int_0^{\infty} e^{-sx} f_X(x)\,dx   (for X \ge 0)

Example 40
Moments follow from derivatives at s = 0:

E[X^n] = (-1)^n \left[\frac{d^n}{ds^n} L_X(s)\right]_{s=0}
Tail Inequalities - 1

"It is always better to be approximately right, than precisely wrong." - Unknown Engineer

Tail Inequalities
1. Markov Inequality
2. Chebyshev's Inequality
3. Chernoff Inequality

Tail Inequalities - 2
1. Markov Inequality:
If X is a RV that takes nonnegative values, then for any value k > 0

P[X \ge k] \le \frac{E[X]}{k}   (a first-order bound)

Proof:

E[X] = \int_0^{\infty} x f_X(x)\,dx \ge \int_k^{\infty} x f_X(x)\,dx \ge \int_k^{\infty} k f_X(x)\,dx = k\,P[X \ge k]

Hence P[X \ge k] \le E[X]/k.

Tail Inequalities - 3
2. Chebyshev's Inequality:
If X has mean \mu and variance \sigma^2, then for any k > 0

P[|X - \mu| \ge k] \le \frac{\sigma^2}{k^2}

or, with k replaced by k\sigma,

P[|X - \mu| \ge k\sigma] \le \frac{1}{k^2}, \qquad P[|X - \mu| < k\sigma] \ge 1 - \frac{1}{k^2}

Tail Inequalities - 4, 5
Proof:
Chebyshev's inequality is a consequence of the Markov inequality, applied to the nonnegative RV (X - \mu)^2 with threshold k^2:

P[|X - \mu| \ge k] = P[(X - \mu)^2 \ge k^2] \le \frac{E[(X - \mu)^2]}{k^2} = \frac{\sigma^2}{k^2}
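A minimal empirical sketch of both bounds (not from the slides) on an exponential sample with rate 1, so \mu = \sigma = 1; the threshold k = 3 is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(1.0, size=1_000_000)
k = 3.0

markov_bound = x.mean() / k                      # E[X]/k bounds P[X >= k]
p_tail = (x >= k).mean()                         # true tail ~ e^{-3} ~ 0.050

cheb_bound = x.var() / k**2                      # sigma^2/k^2 bounds P[|X-mu| >= k]
p_dev = (np.abs(x - x.mean()) >= k).mean()       # true deviation prob ~ e^{-4}

print(p_tail, markov_bound, p_dev, cheb_bound)
```

Both empirical probabilities fall well under their bounds, illustrating that Markov and Chebyshev are loose but universally valid.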
Tail Inequalities - 6
3. Chernoff Inequality:
If X is a RV with MGF M_X(t), then for any value k

P[X \ge k] \le e^{-kt} M_X(t), \; t > 0 \;(k > E[X]), \qquad
P[X \le k] \le e^{-kt} M_X(t), \; t < 0 \;(k < E[X])

since P[X \ge k] = P[e^{tX} \ge e^{tk}] \le M_X(t)/e^{tk} by the Markov inequality.

Law of Large Numbers
For iid X_k with mean \mu and S_n = X_1 + \cdots + X_n:

P\left[\lim_{n\to\infty} \left|\frac{S_n}{n} - \mu\right| < \varepsilon\right] = 1 \;(\text{strong}), \qquad
\lim_{n\to\infty} P\left[\left|\frac{S_n}{n} - \mu\right| < \varepsilon\right] = 1 \;(\text{weak})

Central Limit Theorem
Let X_1, X_2, \ldots, X_N be a sequence of iid RVs, each with finite mean \mu and finite variance \sigma^2. Let S_n = X_1 + X_2 + \cdots + X_n, n \ge 1, and let Z_n be a sequence of zero-mean, unit-variance RVs defined as

Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}}

Then

\lim_{n\to\infty} P[Z_n \le z] = \int_{-\infty}^{z} \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\,dx, \quad \text{i.e. } Z_n \to N(0,1)
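A short CLT demonstration (not from the slides): standardized sums of n = 50 iid Uniform(0,1) terms (\mu = 1/2, \sigma^2 = 1/12) behave like N(0,1):

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 50, 200_000
mu, sigma = 0.5, np.sqrt(1 / 12)

s = rng.uniform(0, 1, size=(trials, n)).sum(axis=1)
z = (s - n * mu) / (sigma * np.sqrt(n))   # Z_n = (S_n - n*mu)/(sigma*sqrt(n))

p_half = (z <= 0).mean()                  # should approach Phi(0) = 0.5
print(p_half, z.mean(), z.var())          # ~0.5, ~0, ~1
```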
Transformation of a
Random Variable

"The laws of probability, so true in general, so fallacious in particular." - Edward Gibbon

[Figure: X with pdf f_X(x) passed through a mapping g(\cdot) produces Y with pdf f_Y(y); the map g: X \to Y carries the sample space S_X of (S, F, P) into S_Y, with Y = g(X) and X = g^{-1}(Y).]
1. Linear Function: Y = aX + b

F_Y(y) = P[Y \le y] = P[aX + b \le y]

Case 1: a > 0

F_Y(y) = P\left[X \le \frac{y-b}{a}\right] = F_X\left(\frac{y-b}{a}\right)

Case 2: a < 0

F_Y(y) = P\left[X \ge \frac{y-b}{a}\right] = 1 - F_X\left(\frac{y-b}{a}\right)

Steps:
1) Solve for x in the given equation in terms of y
2) Substitute into the above equation
Summarizing the linear case,

F_Y(y) = \begin{cases} F_X\left(\frac{y-b}{a}\right), & a > 0 \\ 1 - F_X\left(\frac{y-b}{a}\right), & a < 0 \end{cases}

2. Square Function: Y = X^2

Case 1: y \ge 0

F_Y(y) = P[-\sqrt{y} \le X \le \sqrt{y}] = F_X(\sqrt{y}) - F_X(-\sqrt{y})

Case 2: y < 0
There is no value of X for which X^2 \le y, hence F_Y(y) = P[\varnothing] = 0. Therefore

F_Y(y) = \begin{cases} 0, & y < 0 \\ F_X(\sqrt{y}) - F_X(-\sqrt{y}), & y \ge 0 \end{cases}
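A quick check of the square-function CDF (not from the slides): for X standard normal, F_Y(y) = F_X(\sqrt{y}) - F_X(-\sqrt{y}) is the chi-square(1) CDF, compared here against the empirical CDF of squared samples at the arbitrary point y = 1.5:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)

Phi = lambda t: 0.5 * (1 + erf(t / sqrt(2)))   # standard normal CDF

y = 1.5
F_exact = Phi(sqrt(y)) - Phi(-sqrt(y))         # ~0.7794
F_emp = (x**2 <= y).mean()
print(F_exact, F_emp)
```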
PDF Method

[Figure: X with pdf f_X(x) passed through g(x) produces Y with pdf f_Y(y).]

If y = g(x) has roots x_1, x_2, \ldots, then

f_Y(y) = \sum_n \frac{f_X(x_n)}{\left|\dfrac{dy}{dx}\right|_{x = x_n}} = \sum_n f_X(x_n)\left|\frac{dx}{dy}\right|_{x = x_n} = \sum_n \frac{f_X(x_n)}{|g'(x_n)|}

Steps:
1) Given y = g(x), solve for x in the given equation in terms of y
2) Find \frac{dx}{dy} (or \frac{dy}{dx}) and substitute

Example: Y = aX + b
x = \frac{y-b}{a} and \frac{dy}{dx} = a, so

f_Y(y) = \frac{1}{|a|}\, f_X\left(\frac{y-b}{a}\right)

Example: Y = aX^2
x = \pm\sqrt{y/a} and \frac{dy}{dx} = 2ax = 2\sqrt{ay}, so

f_Y(y) = \begin{cases} \dfrac{1}{2\sqrt{ay}}\left[f_X\left(\sqrt{y/a}\right) + f_X\left(-\sqrt{y/a}\right)\right], & y \ge 0 \\ 0, & y < 0 \end{cases}

Example: Y = \cos X, with X uniform on (0, 2\pi)
The roots in (0, 2\pi) are x_1 = \cos^{-1}(y) and x_2 = 2\pi - \cos^{-1}(y), with \left|\frac{dy}{dx}\right| = |\sin x| = \sqrt{1 - y^2}, so

f_Y(y) = \frac{f_X(\cos^{-1}(y)) + f_X(2\pi - \cos^{-1}(y))}{\sqrt{1 - y^2}} = \frac{1}{\pi\sqrt{1 - y^2}}, \quad |y| < 1

Example: Y = e^{aX}
x = \frac{1}{a}\ln y and \frac{dx}{dy} = \frac{1}{ay}, so

f_Y(y) = \frac{1}{ay}\, f_X\left(\frac{1}{a}\ln y\right), \quad y > 0
[Figure: a function y = g(x) crossing a level y at multiple roots x_0, x_1, x_2, x_3, x_4.]

Example: Y = \tan X, with X uniform on (-\pi/2, \pi/2)
x_n = \tan^{-1}(y) and \frac{dy}{dx} = \frac{1}{\cos^2 x} = 1 + y^2, so

f_Y(y) = \frac{1}{1 + y^2} \sum_n f_X(x_n) = \frac{1}{\pi(1 + y^2)}   (a Cauchy density)

Example: Y = a\sin X
x = \sin^{-1}(y/a) and \frac{dy}{dx} = a\cos x = \sqrt{a^2 - y^2}, so

f_Y(y) = \frac{1}{\sqrt{a^2 - y^2}} \sum_n f_X(x_n), \quad |y| < a

For a = 1 with X uniform over one period this gives f_Y(y) = \frac{1}{\pi\sqrt{1 - y^2}}, |y| < 1, and by integration

F_Y(y) = \begin{cases} 0, & y < -1 \\ \dfrac{1}{2} + \dfrac{\sin^{-1}(y)}{\pi}, & -1 \le y \le 1 \\ 1, & y > 1 \end{cases}
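A numerical sketch of the tangent example (not from the slides): a histogram of \tan(X) for X uniform on (-\pi/2, \pi/2) should match the Cauchy density 1/(\pi(1+y^2)):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(-np.pi / 2, np.pi / 2, 2_000_000)
y = np.tan(x)

edges = np.linspace(-3, 3, 61)
counts, _ = np.histogram(y, bins=edges)
width = edges[1] - edges[0]
dens = counts / (len(y) * width)          # empirical density over [-3, 3]

centers = (edges[:-1] + edges[1:]) / 2
f_formula = 1 / (np.pi * (1 + centers**2))
max_err = np.max(np.abs(dens - f_formula))
print(max_err)
```

Note the density is normalized by the full sample size, since the Cauchy density places mass outside [-3, 3] as well.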
Two Random Variables

- When there is more than one RV, we talk about joint events from the same sample space
- Any ordered pair of numbers (x, y) can be considered as a point in the xy plane

[Figure: outcomes s_1, s_2 in S mapped through X(\cdot) and Y(\cdot) to points (X(s), Y(s)) in the joint sample space S_J.]

Let A = \{X \le x\} and B = \{Y \le y\}
- Events A and B refer to the sample space S, while events \{X \le x\} and \{Y \le y\} refer to the joint sample space S_J
- \{X \le x\} \cap \{Y \le y\} = \{X \le x, Y \le y\}
Joint CDF

F_{XY}(x, y) = P[X \le x, Y \le y]   (continuous or discrete)

Properties:
The properties of the joint CDF are similar to those of the single-variable CDF:
1) 0 \le F_{XY}(x, y) \le 1
2) F_{XY}(x, y) is a nondecreasing function of both x and y
3) F_{XY}(-\infty, y) = F_{XY}(x, -\infty) = 0
4) F_{XY}(\infty, \infty) = 1
5) F_{XY}(x, \infty) = F_X(x)
6) F_{XY}(\infty, y) = F_Y(y)

[Figure: rectangle with corners (a_1, b_1), (a_2, b_1), (a_1, b_2), (a_2, b_2) in the xy plane.]

Rectangle probabilities:
1) P[X \le a, Y \le b] = F_{XY}(a, b)
2) P[X > a, Y > b] = 1 - F_X(a) - F_Y(b) + F_{XY}(a, b)
3) P[a_1 < X \le a_2, b_1 < Y \le b_2] = F_{XY}(a_2, b_2) - F_{XY}(a_1, b_2) - F_{XY}(a_2, b_1) + F_{XY}(a_1, b_1)
4) P[X \le a, b_1 < Y \le b_2] = F_{XY}(a, b_2) - F_{XY}(a, b_1)

Note:
- The first 5 properties are just the 2-dimensional extension of properties of one random variable
- Properties 3, 4, and 5 may be used to test whether a given function is a valid joint CDF
- As in the case of a single RV, the joint CDF can be used to compute probabilities of unions and intersections of semi-infinite rectangles
- When an endpoint such as X = a_1 is to be included or excluded, limits of the form \lim_{n\to\infty} F_{XY}(a - \frac{1}{n}, b) and point-mass terms such as P[X = a_1, b_1 < Y \le b_2] must be added or subtracted
Marginal CDFs from the joint CDF/PDF:

F_X(x) = F_{XY}(x, \infty) = \int_{-\infty}^{x}\int_{-\infty}^{\infty} f_{XY}(u, v)\,dv\,du
F_Y(y) = F_{XY}(\infty, y) = \int_{-\infty}^{\infty}\int_{-\infty}^{y} f_{XY}(u, v)\,dv\,du

Joint PDF

f_{XY}(x, y) = \frac{\partial^2}{\partial x\,\partial y} F_{XY}(x, y)

It is assumed that X & Y are jointly continuous, else the derivative may not exist. It follows that

F_{XY}(x, y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(u, v)\,dv\,du
Properties:
1) f_{XY}(x, y) \ge 0
2) \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = F_{XY}(\infty, \infty) = 1
3) F_{XY}(x, y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(u, v)\,dv\,du
4) F_X(x) = \int_{-\infty}^{x}\int_{-\infty}^{\infty} f_{XY}(u, v)\,dv\,du
5) F_Y(y) = \int_{-\infty}^{\infty}\int_{-\infty}^{y} f_{XY}(u, v)\,dv\,du
6) P[a_1 < X \le a_2, b_1 < Y \le b_2] = \int_{b_1}^{b_2}\int_{a_1}^{a_2} f_{XY}(x, y)\,dx\,dy
7) f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy
8) f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx

Event probabilities follow by integrating over the corresponding region, e.g.

P[a \le X \le b, c \le Y \le d] = \int_{a}^{b}\int_{c}^{d} f_{XY}(x, y)\,dy\,dx

with one or both limits extended to \pm\infty for one-sided events such as P[X \le a, c \le Y \le d].

Note:
Properties 1 and 2 are sufficient to test the validity of a joint PDF.
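A numerical sketch of properties 2 and 7 (not from the slides) on a standard textbook joint pdf, f_XY(x, y) = x + y on the unit square, whose marginal is f_X(x) = x + 1/2:

```python
import numpy as np

dx = 1e-3
g = np.arange(dx / 2, 1, dx)              # midpoints of a 1000x1000 grid
X, Y = np.meshgrid(g, g, indexing="ij")
f = X + Y                                 # joint pdf on [0,1]^2

total = f.sum() * dx * dx                 # property 2: integrates to 1
fx = f.sum(axis=1) * dx                   # property 7: f_X(x) = int f dy
err = np.max(np.abs(fx - (g + 0.5)))      # compare with x + 1/2
print(total, err)
```

The midpoint rule is exact for this linear integrand, so both checks hold to floating-point precision.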
Joint PMF

p_{XY}(x, y) = P[X = x, Y = y]

Marginal PDFs by differentiating the marginal CDFs:

f_X(x) = \frac{d}{dx} F_{XY}(x, \infty) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy
f_Y(y) = \frac{d}{dy} F_{XY}(\infty, y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx
1. Independence of X and Y

P[X \le x, Y \le y] = P[X \le x]\,P[Y \le y], \qquad \text{i.e. } F_{XY}(x, y) = F_X(x)F_Y(y) \text{ and } f_{XY}(x, y) = f_X(x)f_Y(y)

Joint moments:

m_{ij} = E[X^i Y^j] = \begin{cases} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^i y^j f_{XY}(x, y)\,dx\,dy, & \text{continuous} \\ \sum_n \sum_k x_n^i y_k^j\, p_{XY}(x_n, y_k), & \text{discrete} \end{cases}

The sum i + j is called the order of the moments.

Given a function z = g(x, y), we can first compute the PDF of Z and then compute the mean of Z as E[Z] = \int z f_Z(z)\,dz, or we can compute directly as follows:

E[g(X, Y)] = \begin{cases} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\, f_{XY}(x, y)\,dx\,dy, & \text{continuous} \\ \sum_n \sum_k g(x_n, y_k)\, p_{XY}(x_n, y_k), & \text{discrete} \end{cases}

Correlation of X and Y - 1

m_{11} = R_{XY} = E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{XY}(x, y)\,dx\,dy

Thus, if X and Y are independent, R_{XY} = E[X]E[Y] (and more generally E[X^i Y^j] = E[X^i]E[Y^j]).

Correlation of X and Y - 2
Note:
- When R_{XY} = E[X]E[Y], X and Y are said to be uncorrelated
- Independence \Rightarrow uncorrelatedness
- Uncorrelatedness \not\Rightarrow independence (not always)
Joint central moments:

\mu_{ij} = E[(X - \mu_X)^i (Y - \mu_Y)^j] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \mu_X)^i (y - \mu_Y)^j f_{XY}(x, y)\,dx\,dy

Conditional PMF - 1
From conditional probability, P[A|B] = \frac{P[A \cap B]}{P[B]}. For discrete RVs, the conditional pmf of X given that Y = y_j is

P[X = x_i \,|\, Y = y_j] = \frac{P[X = x_i, Y = y_j]}{P[Y = y_j]} = \frac{p_{XY}(x_i, y_j)}{p_Y(y_j)}

and similarly

P[Y = y_j \,|\, X = x_i] = \frac{P[X = x_i, Y = y_j]}{P[X = x_i]} = \frac{p_{XY}(x_i, y_j)}{p_X(x_i)}

Conditional PMF - 2
If X and Y are independent,

P[X = x_i \,|\, Y = y_j] = \frac{P[X = x_i]\,P[Y = y_j]}{P[Y = y_j]} = P[X = x_i] = p_X(x_i)
P[Y = y_j \,|\, X = x_i] = P[Y = y_j] = p_Y(y_j)

Similarly, the conditional CDF is

F_Y(y \,|\, x_i) = P[Y \le y \,|\, X = x_i] = \frac{P[Y \le y, X = x_i]}{P[X = x_i]}
Conditional Density - 1
For continuous RVs, P[Y = y_j] = P[X = x_i] = 0. Hence the pmf-style definitions above are undefined for continuous RVs. Fortunately, the numerators are also zero, so we treat them as limiting cases:

F_{XY}(x \,|\, y) = \lim_{\Delta y \to 0} P[X \le x \,|\, y < Y \le y + \Delta y]

For X and Y jointly continuous,

P[X \le x \,|\, y < Y \le y + \Delta y] = \frac{\int_{-\infty}^{x}\int_{y}^{y+\Delta y} f_{XY}(u, v)\,dv\,du}{\int_{y}^{y+\Delta y} f_Y(v)\,dv} \approx \frac{\Delta y \int_{-\infty}^{x} f_{XY}(u, y')\,du}{\Delta y\, f_Y(y'')}, \quad y \le y', y'' \le y + \Delta y

using the mean value theorem, \int_a^b g(z)\,dz = (b - a)\,g(c) for some a \le c \le b. Hence

F_{XY}(x \,|\, y) = \frac{\int_{-\infty}^{x} f_{XY}(u, y)\,du}{f_Y(y)}

Conditional Density - 2
Consequently,

f_{XY}(x \,|\, y) = \frac{d}{dx} F_{XY}(x \,|\, y) = \frac{f_{XY}(x, y)}{f_Y(y)}, \qquad
f_{XY}(y \,|\, x) = \frac{d}{dy} F_{XY}(y \,|\, x) = \frac{f_{XY}(x, y)}{f_X(x)}

Also (a continuous form of Bayes' rule),

f_{XY}(x \,|\, y) = \frac{f_{XY}(y \,|\, x)\, f_X(x)}{f_Y(y)}, \qquad
f_{XY}(y \,|\, x) = \frac{f_{XY}(x \,|\, y)\, f_Y(y)}{f_X(x)}
Conditional Density - 3
If X and Y are independent, f_{XY}(x \,|\, y) = f_X(x) and f_{XY}(y \,|\, x) = f_Y(y).

Conditional Expectation - 1

E[Y \,|\, x] = \begin{cases} \int_{-\infty}^{\infty} y\, f_Y(y \,|\, x)\,dy, & \text{continuous} \\ \sum_j y_j\, p_Y(y_j \,|\, x), & \text{discrete} \end{cases}

Conditional Expectation - 2
Theorem 1:
For any random variables X and Y, E[E[Y|X]] = E[Y].

Proof:

E[E[Y|X]] = \int E[Y \,|\, x]\, f_X(x)\,dx = \iint y\, f_{XY}(y \,|\, x)\, f_X(x)\,dy\,dx = \iint y\, f_{XY}(x, y)\,dx\,dy = E[Y]
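A Monte Carlo sketch of Theorem 1 (not from the slides) for a case where E[Y|X] is known in closed form: X ~ Uniform(0,1) and, given X = x, Y is exponential with rate x + 1, so E[Y|X] = 1/(X+1) and E[Y] = E[1/(X+1)] = ln 2:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2_000_000
x = rng.uniform(0, 1, n)
y = rng.exponential(1.0 / (x + 1.0))      # NumPy's scale parameter = 1/rate

lhs = (1.0 / (x + 1.0)).mean()            # Monte Carlo E[E[Y|X]]
rhs = y.mean()                            # Monte Carlo E[Y]
print(lhs, rhs, np.log(2))
```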
A) Covariance:

\mathrm{Cov}[X, Y] = E[(X - \mu_X)(Y - \mu_Y)]
= E[XY - \mu_Y X - \mu_X Y + \mu_X \mu_Y]
= E[XY] - \mu_Y E[X] - \mu_X E[Y] + \mu_X \mu_Y
= E[XY] - \mu_X \mu_Y = R_{XY} - \mu_X \mu_Y

Note:
- If X and Y are either independent or uncorrelated, then E[XY] = E[X]E[Y] and \mathrm{Cov}[X, Y] = 0
- If X and Y are orthogonal, then R_{XY} = 0

B) Correlation Coefficient (\rho)
By definition,

\rho_{XY} = \frac{\mathrm{Cov}[X, Y]}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1

where \sigma_X^2 = \mathrm{Var}[X] and \sigma_Y^2 = \mathrm{Var}[Y].
Transformation in Two
Dimension

"If nature has taught us anything it is that the impossible is probable." - Ilyas Kassam

Let Z = g(X, Y). Then

F_Z(z) = P[Z \le z] = P[g(X, Y) \in R_Z] = \iint_{R_Z} f(x, y)\,dx\,dy, \qquad
f_Z(z)\,dz = P[z < Z \le z + dz]

1. Sum of two RVs: Z = aX + bY

F_Z(z) = P[aX + bY \le z] = \int_{-\infty}^{\infty}\int_{-\infty}^{(z - ax)/b} f(x, y)\,dy\,dx

If X and Y are independent, differentiating with respect to z gives

f_Z(z) = \frac{1}{|b|}\int_{-\infty}^{\infty} f_X(x)\, f_Y\!\left(\frac{z - ax}{b}\right) dx \quad \ldots (A1)
For a = b = 1, (A1) is the convolution f_Z(z) = \frac{d}{dz}F_Z(z) = \int f_X(x)\, f_Y(z - x)\,dx.

Examples of Convolution
- Signal + noise: the receiver output pdf is the convolution of the signal pdf with the noise pdf passed through h(t)
- Two uniform pdfs f_X on [a, b] and f_Y on [c, d] convolve into a trapezoidal pdf f_Z (triangular when the widths match) supported on [a + c, b + d]
- In the transform domain, a(t) * b(t) \leftrightarrow A(f)\,B(f)
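A numerical sketch of the convolution example (not from the slides): two Uniform(0,1) densities convolve into the triangular density on [0, 2]:

```python
import numpy as np

dx = 1e-3
x = np.arange(0, 1, dx)
fx = np.ones_like(x)                      # Uniform(0,1) pdf on its support

fz = np.convolve(fx, fx) * dx             # numerical convolution f_X * f_Y
z = np.arange(len(fz)) * dx
tri = np.where(z <= 1, z, 2 - z)          # triangular pdf on [0, 2]

max_err = np.max(np.abs(fz - tri))        # O(dx) discretization error
print(max_err)
```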
Transformation in 2 Dimensions
(1 function of 2 RVs)

Let Z = X + Y. Then

E[Z] = E[X + Y] = \iint (x + y)\, f(x, y)\,dx\,dy = \iint x f(x, y)\,dx\,dy + \iint y f(x, y)\,dx\,dy = \int x f(x)\,dx + \int y f(y)\,dy

Hence E[Z] = E[X + Y] = E[X] + E[Y].

For the variance, with \mu_Z = \mu_X + \mu_Y,

\mathrm{Var}[Z] = E[(Z - \mu_Z)^2] = E[\{(X - \mu_X) + (Y - \mu_Y)\}^2]

Expanding,

\mathrm{Var}[Z] = E[(X - \mu_X)^2] + E[(Y - \mu_Y)^2] + 2E[(X - \mu_X)(Y - \mu_Y)]

That is,

\mathrm{Var}[Z] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\,\mathrm{Cov}[X, Y]
\sigma_Z^2 = \sigma_X^2 + \sigma_Y^2 + 2 C_{XY} = \sigma_X^2 + \sigma_Y^2 + 2 \sigma_X \sigma_Y \rho_{XY}
Characteristic function and MGF of a sum Z = aX + bY:

\Phi_Z(\omega) = E[e^{j\omega(aX + bY)}] = \Phi_{XY}(a\omega, b\omega)

and if X and Y are independent,

\Phi_Z(\omega) = E[e^{ja\omega X}]\,E[e^{jb\omega Y}] = \Phi_X(a\omega)\,\Phi_Y(b\omega)

Similarly,

M_Z(t) = E[e^{t(ax + by)}] = \iint e^{t(ax + by)} f(x, y)\,dx\,dy

and for independent X and Y (with a = b = 1),

M_Z(t) = \int e^{tx} f(x)\,dx \int e^{ty} f(y)\,dy = M_X(t)\,M_Y(t)

Note
The technique for the sum of two RVs is applicable to the difference of two RVs.
Let X_1 & X_2 be jointly continuous RVs with joint PDF f_{X_1 X_2}(x_1, x_2). We want to find the PDF of the random variable Z = X_1 + X_2. Define two new RVs Y_1 and Y_2 as functions of X_1 and X_2:

Y_1 = g_1(x_1, x_2); \quad Y_2 = g_2(x_1, x_2)
x_1 = h_1(y_1, y_2); \quad x_2 = h_2(y_1, y_2)

with Jacobian

J(y_1, y_2) = \begin{vmatrix} \dfrac{\partial h_1}{\partial y_1} & \dfrac{\partial h_1}{\partial y_2} \\ \dfrac{\partial h_2}{\partial y_1} & \dfrac{\partial h_2}{\partial y_2} \end{vmatrix} \ne 0

so that f_{Y_1 Y_2}(y_1, y_2) = f_{X_1 X_2}(h_1, h_2)\,|J|.

Sum: z = x + y = g_1(x, y), w = x = g_2(x, y), so x = w = h_1(w, z) and y = z - x = z - w = h_2(w, z). Then

J = \begin{vmatrix} 1 & 0 \\ -1 & 1 \end{vmatrix} = 1

f_Z(z) = \int f_{XY}(w, z - w)\,dw = \int f_{XY}(x, z - x)\,dx

and for independent X and Y,

f_Z(z) = \int f_X(x)\, f_Y(z - x)\,dx = \int f_X(z - y)\, f_Y(y)\,dy

Product: z = xy = g_1(x, y), w = x = g_2(x, y), so x = w = h_1(w, z) and y = z/w = h_2(w, z). Then |J| = 1/|w| and

f_{WZ}(w, z) = \frac{1}{|w|}\, f_{XY}\!\left(w, \frac{z}{w}\right), \qquad
f_Z(z) = \int \frac{1}{|x|}\, f_{XY}\!\left(x, \frac{z}{x}\right) dx
For independent X and Y, the product formula becomes

f_Z(z) = \int \frac{1}{|x|}\, f_X(x)\, f_Y\!\left(\frac{z}{x}\right) dx = \int \frac{1}{|y|}\, f_X\!\left(\frac{z}{y}\right) f_Y(y)\,dy

Quotient: Z = X/Y
z = x/y = g_1(x, y), w = y = g_2(x, y), so x = zw = h_1(w, z) and y = w = h_2(w, z). Then

J = \begin{vmatrix} z & w \\ 1 & 0 \end{vmatrix} = -w, \qquad |J| = |w|

f_{WZ}(w, z) = |w|\, f_{XY}(zw, w)
Hence

f_Z(z) = \int |y|\, f_{XY}(zy, y)\,dy

Example: if X and Y are independent Rayleigh RVs with

f_X(x) = \frac{x}{\sigma^2}\, e^{-x^2/2\sigma^2}, \; x \ge 0, \qquad
f_Y(y) = \frac{y}{\sigma^2}\, e^{-y^2/2\sigma^2}, \; y \ge 0

then f_Z(z) = \int_0^{\infty} y\, f_X(zy)\, f_Y(y)\,dy gives the pdf of the ratio.

Maximum Function: Z = \max(X, Y)
F_Z(z) = P[X \le z, Y \le z] = F_{XY}(z, z)

Minimum Function: Z = \min(X, Y)
F_Z(z) = F_X(z) + F_Y(z) - F_{XY}(z, z)

If X and Y are jointly Gaussian, then the joint PDF is

f_{XY}(x, y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1 - \rho_{xy}^2}} \exp\left\{-\frac{1}{2(1 - \rho_{xy}^2)}\left[\left(\frac{x - \mu_x}{\sigma_x}\right)^2 - 2\rho_{xy}\left(\frac{x - \mu_x}{\sigma_x}\right)\left(\frac{y - \mu_y}{\sigma_y}\right) + \left(\frac{y - \mu_y}{\sigma_y}\right)^2\right]\right\}

where |\rho_{xy}| \le 1. When \rho_{xy} = 0 this factors as f_{XY}(x, y) = f_X(x)\, f_Y(y), i.e. uncorrelated jointly Gaussian RVs are independent.
Random Vectors: \mathbf{X} = [X_1, X_2, \ldots, X_N]

For N random variables X_1, X_2, ..., X_N, the joint CDF is defined as

F_{\mathbf{X}}(x_1, \ldots, x_N) = P[X_1 \le x_1, X_2 \le x_2, \ldots, X_N \le x_N]   (continuous or discrete)

Properties include:
1) F_{\mathbf{X}}(x_1, \ldots, x_{N-1}) = F_{\mathbf{X}}(x_1, \ldots, x_{N-1}, \infty)
2) F_{\mathbf{X}}(\infty, \infty, \ldots, \infty) = 1

Joint PDF and marginals:

f_{\mathbf{X}}(x_1, \ldots, x_N) = \frac{\partial^N F_{\mathbf{X}}(x_1, \ldots, x_N)}{\partial x_1\, \partial x_2 \cdots \partial x_N}

f_X(x_1) = \int \cdots \int f_{\mathbf{X}}(x_1, \ldots, x_N)\,dx_2 \cdots dx_N

F_{\mathbf{X}}(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} \cdots \int_{-\infty}^{x_n} f_{\mathbf{X}}(u_1, \ldots, u_n)\,du_1 \cdots du_n

e.g. f_X(x_2, x_4) = \iint f_X(x_1, x_2, x_3, x_4)\,dx_1\,dx_3

Independence:
The N random variables X_1, X_2, ..., X_N are independent if the events \{X_1 \le x_1\}, \{X_2 \le x_2\}, ..., \{X_N \le x_N\} are independent. This implies that

F_X(x_1, \ldots, x_N) = F_X(x_1)\,F_X(x_2) \cdots F_X(x_N)
f_X(x_1, \ldots, x_N) = f_X(x_1)\,f_X(x_2) \cdots f_X(x_N)
p_X(x_1, \ldots, x_N) = p_X(x_1)\,p_X(x_2) \cdots p_X(x_N)

It follows that any subset of the x_i is a set of independent random variables. For example, for N = 3 with x_1, x_2, x_3 independent,

f_X(x_1, x_2) = f_X(x_1)f_X(x_2), \quad f_X(x_1, x_3) = f_X(x_1)f_X(x_3), \quad f_X(x_2, x_3) = f_X(x_2)f_X(x_3)

and f_X(x_1, x_2, x_3) = f_X(x_1)f_X(x_2)f_X(x_3); but pairwise factorization alone does not imply the triple factorization.
Conditional densities generalize as

f_X(x_N, \ldots, x_{k+1} \,|\, x_k, \ldots, x_1) = \frac{f_X(x_1, \ldots, x_k, \ldots, x_N)}{f_X(x_1, \ldots, x_k)}

This implies that we can use the chain rule to write the joint pdf as

f_X(x_1, \ldots, x_N) = f_X(x_N \,|\, x_1, \ldots, x_{N-1})\, f_X(x_1, \ldots, x_{N-1})
= f_X(x_N \,|\, x_1, \ldots, x_{N-1})\, f_X(x_{N-1} \,|\, x_1, \ldots, x_{N-2})\, f_X(x_1, \ldots, x_{N-2})
= f_X(x_N \,|\, x_1, \ldots, x_{N-1}) \cdots f_X(x_2 \,|\, x_1)\, f_X(x_1)

Correspondingly, F_X(x_N, \ldots, x_{k+1} \,|\, x_k, \ldots, x_1) is obtained by integrating the conditional density over (x_{k+1}, \ldots, x_N).

Expectations:

E[g(x_1, x_2, \ldots, x_N)] = \int \cdots \int g(x_1, \ldots, x_N)\, f_X(x_1, \ldots, x_N)\,dx_1 \cdots dx_N

For N random variables X_1, X_2, \ldots, X_N, the (n_1 + n_2 + \cdots + n_N)-order joint moments are defined by

E[X_1^{n_1} X_2^{n_2} \cdots X_N^{n_N}] = \int \cdots \int x_1^{n_1} x_2^{n_2} \cdots x_N^{n_N}\, f_X(x_1, \ldots, x_N)\,dx_1 \cdots dx_N
Joint characteristic function:

\Phi_X(\omega_1, \omega_2, \ldots, \omega_N) = E\left[e^{j(\omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_N X_N)}\right]

If the X_i are independent,

\Phi_X(\omega_1, \ldots, \omega_N) = E[e^{j\omega_1 X_1}]\,E[e^{j\omega_2 X_2}] \cdots E[e^{j\omega_N X_N}] = \Phi_{X_1}(\omega_1)\,\Phi_{X_2}(\omega_2) \cdots \Phi_{X_N}(\omega_N)
Linear combinations:

E\left[\sum_{k=1}^{N} a_k X_k\right] = \sum_{k=1}^{N} a_k E[X_k]

\mathrm{Var}\left[\sum_{i=1}^{N} a_i X_i\right] = E\left[\sum_{j=1}^{N} a_j (X_j - E[X_j]) \sum_{k=1}^{N} a_k (X_k - E[X_k])\right] = \sum_{j=1}^{N}\sum_{k=1}^{N} a_j a_k\, \mathrm{Cov}(X_j, X_k)
= \sum_{k=1}^{N} a_k^2\, \mathrm{Var}(X_k) + \sum_{j \ne k} a_j a_k\, \mathrm{Cov}(X_j, X_k)

If independence is assumed, the covariance terms vanish and \mathrm{Var}\left[\sum a_k X_k\right] = \sum a_k^2\, \mathrm{Var}(X_k).

Covariance matrix:

C = \begin{bmatrix} C_{11} & C_{12} & \cdots & C_{1N} \\ C_{21} & C_{22} & \cdots & C_{2N} \\ \vdots & & \ddots & \vdots \\ C_{N1} & C_{N2} & \cdots & C_{NN} \end{bmatrix}, \qquad
C_{ij} = \begin{cases} \sigma_{X_i}^2, & i = j \\ \mathrm{Cov}(X_i, X_j), & i \ne j \end{cases}

Joint MGF: M_X(t_1, t_2, \ldots, t_N) = E[e^{t_1 X_1 + t_2 X_2 + \cdots + t_N X_N}]; if independent,

M_X(t_1, t_2, \ldots, t_N) = E[e^{t_1 X_1}]\,E[e^{t_2 X_2}] \cdots E[e^{t_N X_N}] = M_X(t_1)\,M_X(t_2) \cdots M_X(t_N)

Jointly Gaussian random vector: with \mathbf{x} = [x_1, x_2, \ldots, x_N]^T, mean vector \mathbf{m}, and covariance matrix C,

f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{N/2}\,|C|^{1/2}} \exp\left[-\frac{1}{2}(\mathbf{x} - \mathbf{m})^T C^{-1} (\mathbf{x} - \mathbf{m})\right]
Sample Mean
For N iid RVs X_k with mean \mu_X and variance \sigma_X^2, define

\hat{\mu}_N = \frac{1}{N}(X_1 + X_2 + \cdots + X_N) = \frac{1}{N}\sum_{k=1}^{N} X_k

Its mean is E[\hat{\mu}_N] = \frac{1}{N}\sum_{k=1}^{N} E[X_k] = \mu_X, and its variance is

\mathrm{Var}[\hat{\mu}_N] = E\left[\left(\frac{1}{N}\sum_{k=1}^{N} X_k - \mu_X\right)^2\right] = \frac{1}{N^2}\sum_{j=1}^{N}\sum_{k=1}^{N} E[X_j X_k] - \mu_X^2

but E[X_j^2] = \sigma_X^2 + \mu_X^2 and, for j \ne k, E[X_j X_k] = E[X_j]E[X_k] = \mu_X^2. Substituting,

\mathrm{Var}[\hat{\mu}_N] = \frac{1}{N^2}\left[N(\sigma_X^2 + \mu_X^2) + N(N-1)\mu_X^2\right] - \mu_X^2 = \frac{\sigma_X^2}{N}

This means that the variance of \hat{\mu}_N is 1/N times the variance of the RV X_k, so \mathrm{Var}[\hat{\mu}_N] \to 0 as N \to \infty. By Chebyshev's inequality,

P[|\hat{\mu}_N - \mu_X| \ge a] \le \frac{\sigma_X^2}{N a^2}

This implies that the probability that \hat{\mu}_N deviates from the true mean by more than a approaches zero as N becomes larger and larger.
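A quick empirical sketch of Var[\hat{\mu}_N] = \sigma^2/N (not from the slides; standard normal terms give \sigma^2 = 1, and N = 25 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
N, trials = 25, 400_000
means = rng.standard_normal((trials, N)).mean(axis=1)  # 400k sample means

var_hat = means.var()
print(var_hat, 1 / N)                                  # both ~0.04
```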
Stochastic Processes
(a.k.a. Random Processes)

"It is remarkable that a science which began with the consideration of games of chance should have become the most important object of human knowledge." - Pierre Simon Laplace, 1812

Definitions

Stochastic Processes - 1 to 4

[Figure: an ensemble of sample functions x_1(t), x_2(t), ..., x_n(t), one per outcome s \in S, observed over an interval containing t_k and t_{k+1}. Each waveform is a realization, sample path, or sample function.]

- X(t_j, s) = X(t_j) is a random variable (time fixed)
- X(t, s) = X(t) is a random process: X_j(t, s), j = 1, 2, \ldots, n (both free)
- X(t_i, s_j) is a real number (both fixed)
Stochastic Processes - 5 to 10

Sampling a random process at times t_1, \ldots, t_k gives the RVs

X_1 = X(t_1, s), \; X_2 = X(t_2, s), \; \ldots, \; X_k = X(t_k, s)

First-order distribution:

F_X(x, t) = P[X(t) \le x], \qquad f_X(x, t) = \frac{\partial}{\partial x} F_X(x, t)

kth-order density:

f_X(x_1, \ldots, x_k; t_1, \ldots, t_k) = \frac{\partial^k F_X(x_1, \ldots, x_k; t_1, \ldots, t_k)}{\partial x_1\, \partial x_2 \cdots \partial x_k}

and for a discrete-valued process, p_X(x_1, \ldots, x_k) = P[X_1 = x_1, X_2 = x_2, \ldots, X_k = x_k].

A random process may be classified by its state space and time index; e.g. a discrete RP is a countable collection of RVs.

Stochastic Processes - 11
Second-order density:

f_X(x_1, x_2; t_1, t_2) = \frac{\partial^2 F_X(x_1, x_2; t_1, t_2)}{\partial x_1\, \partial x_2}

As with random variables, the marginal cdf and pdf of a RP are given by

f_X(x_1; t_1) = \int_{-\infty}^{\infty} f_X(x_1, x_2; t_1, t_2)\,dx_2
A. Mean of X(t)
First-order averages (i.e., functions of one random process at one time):
The mean of a random process X(t) is given by

m_X(t) = \overline{X(t)} = E[X(t)] = \int_{-\infty}^{\infty} x\, f_X(x, t)\,dx

The corresponding time averages of a single sample function are

\langle X(t) \rangle = \frac{1}{2T}\int_{-T}^{T} x(t)\,dt \qquad \text{or} \qquad \langle X(t) \rangle = \frac{1}{T}\int_{-T/2}^{T/2} x(t)\,dt

B. Variance of X(t)
The variance of a random process X(t) is given by

\sigma_X^2(t) = \mathrm{Var}[X(t)] = E\left[\left(X(t) - \overline{X(t)}\right)^2\right] = E[X^2(t)] - (E[X(t)])^2

C. Autocorrelation of X(t)
The Autocorrelation Function (ACF) of a RP X(t) is denoted as either R_{XX}(t_1, t_2) or R_X(t_1, t_2) or R_{XX}(t, t+\tau):

R_{XX}(t_1, t_2) = E[X(t_1)\, X^*(t_2)] = \iint x_1 x_2\, f(x_1, x_2; t_1, t_2)\,dx_1\,dx_2

It follows that

R_{XX}(t_1, t_2) = \left(E[X(t_2)\, X^*(t_1)]\right)^*, \qquad R_{XX}(t, t) = E[|X(t)|^2] \ge 0
D. Autocovariance of X(t)
The autocovariance of a RP X(t) is given by

C_X(t_1, t_2) = E\left[\left(X(t_1) - \overline{X(t_1)}\right)\left(X(t_2) - \overline{X(t_2)}\right)\right] = E[X(t_1)X(t_2)] - \overline{X(t_1)}\,\overline{X(t_2)}

Thus

C_X(t_1, t_2) = R_X(t_1, t_2) - \overline{X(t_1)}\,\overline{X(t_2)}

E. Correlation Coefficient
The correlation coefficient of a RP X(t) is given by

\rho_X(t_1, t_2) = \frac{C_X(t_1, t_2)}{\sqrt{C_X(t_1, t_1)\, C_X(t_2, t_2)}}, \qquad |\rho_X(t_1, t_2)| \le 1

Spectral description: the Fourier pair X(f) = \int x(t) e^{-j2\pi ft}\,dt and x(t) = \int X(f) e^{j2\pi ft}\,df cannot be computed for realistic samples of all RPs; a limited definition is required, assuming an ergodic process.
Power Spectral Density (PSD)
Truncate each sample function:

x_T(t) = \begin{cases} x(t), & -T \le t \le T \\ 0, & \text{else} \end{cases}, \qquad
X_T(f) = \int_{-T}^{T} x_T(t)\, e^{-j2\pi ft}\,dt

S_X(f) = \lim_{T\to\infty} \frac{1}{2T}\, E\left[|X_T(f)|^2\right]

Wiener-Khintchine Theorem: R_X(\tau) \leftrightarrow S_X(f):

S_X(f) = \begin{cases} \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j2\pi f\tau}\,d\tau, & \text{continuous} \\ \sum_k R_X(k)\, e^{-j2\pi kf}, & \text{discrete} \end{cases}

Conversely,

R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j2\pi f\tau}\,df = F^{-1}[S_X(f)]

In particular,

R_X(0) = E[X^2(t)] = \int_{-\infty}^{\infty} S_X(f)\,df

This is the area under the PSD curve. It is also known as the Average Power. Conversely, S_X(0) = \int R_X(\tau)\,d\tau. Since R_X(\tau) is even,

S_X(f) = \int R_X(\tau)\left[\cos 2\pi f\tau - j\sin 2\pi f\tau\right]d\tau = \int R_X(\tau)\cos 2\pi f\tau\,d\tau

(the sine integral vanishes).
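A numerical sketch of the Wiener-Khintchine pair (not from the slides): for R_X(\tau) = e^{-|\tau|} the PSD is S_X(f) = 2/(1 + (2\pi f)^2), and the forward integral can be evaluated on a truncated \tau grid:

```python
import numpy as np

dt = 1e-3
tau = np.arange(-30, 30, dt)              # e^{-30} tail is negligible
R = np.exp(-np.abs(tau))                  # R_X(tau)

f = np.array([0.0, 0.1, 0.5])
S_num = np.array([(R * np.cos(2 * np.pi * fi * tau)).sum() * dt for fi in f])
S_exact = 2 / (1 + (2 * np.pi * f) ** 2)
print(S_num, S_exact)
```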
Stationarity

A process is strict-sense stationary (SSS) if its statistics are invariant to a shift of the time origin. For the first order,

f_{X(t)}(x, t) = f_{X(t+c)}(x, t + c) = f_X(x)

and in general

f_X(x_1, x_2, \ldots, x_n;\; t_1, t_2, \ldots, t_n) = f_X(x_1, x_2, \ldots, x_n;\; t_1 + c, t_2 + c, \ldots, t_n + c)

where the left side represents the joint pdf of the RVs X_n = X(t_n) and the right side that of X_n = X(t_n + c), for all t_i, i = 1, 2, \ldots, n, any n = 1, 2, \ldots, and any c.

Consequences:

E[X(t)] = E[X(t + c)] = \text{constant}, \qquad E[X(t)\,X(t + \tau)] \text{ depends only on } \tau
Note:
[Diagram: within all stochastic processes, the SSS class sits inside the WSS class; a WSS process is characterized by a constant mean and an autocorrelation function depending only on \tau.]

Properties of the ACF of a WSS process:

1) R_X(0) = E[X^2(t)]   (the mean-square value)

2) R_X(\tau) = R_X(-\tau)   (even function)

Proof:
R_X(\tau) = E[X(t)\,X(t + \tau)]; but with t' = t + \tau,

R_X(\tau) = E[X(t' - \tau)\,X(t')] = E[X(t')\,X(t' - \tau)] = R_X(-\tau)
3) |R_X(\tau)| \le R_X(0)

Proof: Consider

E\left[\left(X(t) \pm X(t + \tau)\right)^2\right] = E[X^2(t)] + E[X^2(t + \tau)] \pm 2E[X(t)X(t + \tau)] = 2R_X(0) \pm 2R_X(\tau) \ge 0

Hence R_X(0) \ge |R_X(\tau)|.

4) If X(t) has a constant (dc) component A, then R_X(\tau) contains the constant A^2, since for that component R_X(\tau) = E[X(t)X(t + \tau)] = E[A^2] = A^2.

5) If X(t) has a periodic component, then R_X(\tau) will also have a periodic component with the same period:

m_X(t) = E[X(t + kT)], \qquad R_X(\tau) = R_X(\tau + kT)
Ergodicity
1. Ergodic in the mean (and in higher moments): time averages equal ensemble averages,

\langle X^n(t) \rangle = E[X^n(t)]

2. Ergodic in Autocorrelation:

\langle X(t_1)\,X(t_2) \rangle = R_X(t_1, t_2)

Independent-Increment Processes
If for t_1 < t_2 < \cdots < t_k the increments

X(t_2) - X(t_1), \; X(t_3) - X(t_2), \; \ldots, \; X(t_k) - X(t_{k-1})

are independent RVs, the process has independent increments (e.g. the Poisson process and the Wiener process).

Independent Processes
If X(t) and Y(t) are such that the RVs X(t_1), \ldots, X(t_n) and Y(t_1), \ldots, Y(t_n) are mutually independent, then the processes are independent.

Markov Processes

P[X(t_k) \le x_k \,|\, x_{k-1}, \ldots, x_1] = P[X(t_k) \le x_k \,|\, x_{k-1}]
Properties of CCF - 1
Cross-correlation functions:

R_{XY}(\tau) = E[X(t)\,Y(t + \tau)], \qquad R_{YX}(\tau) = E[Y(t)\,X(t + \tau)]

Thus R_{XY}(\tau) = R_{YX}(-\tau).
Note that the above equation simply indicates symmetry. It does not necessarily indicate that the CCF is even. The ACF of a RP is even but the CCF is not.

Properties of CCF - 2
From E\left[(X(t) \pm Y(t + \tau))^2\right] \ge 0,

E[X^2(t)] + E[Y^2(t + \tau)] \pm 2E[X(t)Y(t + \tau)] = R_X(0) + R_Y(0) \pm 2R_{XY}(\tau) \ge 0

so that

|R_{XY}(\tau)| \le \frac{1}{2}\left[R_X(0) + R_Y(0)\right]

Note that in the ACF the value at zero equals the mean-square value, but in the CCF the value at zero has no special significance.

Properties of CCF - 3
A tighter (geometric-mean) bound also holds:

|R_{XY}(\tau)| \le \sqrt{R_X(0)\, R_Y(0)}

Properties of CCF - 4, 5
If X(t) and Y(t) are independent,

R_{XY}(\tau) = E[X(t)]\,E[Y(t + \tau)] = \overline{X}\,\overline{Y} \quad \text{for all } \tau

and if in addition either process is zero mean, the processes are orthogonal (R_{XY}(\tau) = 0).
Cross-Covariance (CC)
The cross-covariance of two processes X(t) & Y(t) is defined as

C_{XY}(t_1, t_2) = E\left[\left(X(t_1) - m_X(t_1)\right)\left(Y(t_2) - m_Y(t_2)\right)\right] = R_{XY}(t_1, t_2) - m_X(t_1)\, m_Y(t_2)

The processes are uncorrelated when C_{XY}(t_1, t_2) = 0, and equal in the mean-square sense when E[(X(t) - Y(t))^2] = 0 for all t.

Time-averaged cross-correlations:

\mathcal{R}_{XY}(\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,y(t + \tau)\,dt, \qquad
\mathcal{R}_{YX}(\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} y(t)\,x(t + \tau)\,dt

Hence, for jointly ergodic processes, \mathcal{R}_{XY} = R_{XY} and \mathcal{R}_{YX} = R_{YX}.

Cross Power Spectral Density

S_{XY}(f) = \begin{cases} \int_{-\infty}^{\infty} R_{XY}(\tau)\, e^{-j2\pi f\tau}\,d\tau, & \text{continuous} \\ \sum_k R_{XY}(k)\, e^{-j2\pi kf}, & \text{discrete} \end{cases}

"The most important questions of life are, for the most part, really only problems of probability." - Laplace
Linear Network

[Table: input x(t) or x[n], with transforms X(e^{j\omega}), X(f), X(z) and spectra R_X, S_X(f); system h(t) or h[n], with H(e^{j\omega}), H(f), H(z); output y(t) or y[n], with Y(e^{j\omega}), Y(f), Y(z) and spectra R_Y, S_Y(f). A system may equivalently be described by its time function, difference equation, pole-zero plot, or H-function, driven by a random process.]

y(t) = h(t) * x(t) = \int_{-\infty}^{\infty} h(\tau)\, x(t - \tau)\,d\tau = \int_{-\infty}^{\infty} h(t - \tau)\, x(\tau)\,d\tau
For a random input, Y(t) = \int h(\tau)\, X(t - \tau)\,d\tau.

Mean:

E[Y(t)] = E\left[\int h(\tau)\, X(t - \tau)\,d\tau\right] = \int h(\tau)\, E[X(t - \tau)]\,d\tau = m_x \int h(\tau)\,d\tau

using E[X(t - \tau)] = E[X(t)] = m_x for a WSS input.

Mean-square value:

E[Y^2(t)] = E\left[\int X(t - s)h(s)\,ds \int X(t - r)h(r)\,dr\right] = \iint E[X(t - s)X(t - r)]\, h(s)h(r)\,ds\,dr

But E[X(t - s)X(t - r)] = R_X(t - s - t + r) = R_X(r - s), hence E[Y^2(t)] = \iint R_X(r - s)\, h(s)h(r)\,ds\,dr.

Autocorrelation Function and output PSD:

S_Y(f) = \int R_Y(\tau)\, e^{-j2\pi f\tau}\,d\tau = \iiint h(s)h(r)\, R_X(\tau + s - r)\, e^{-j2\pi f\tau}\,ds\,dr\,d\tau

If we let u = \tau + s - r, we obtain

S_Y(f) = \iiint h(s)h(r)\, R_X(u)\, e^{-j2\pi f(u - s + r)}\,ds\,dr\,du
= \int h(s)\, e^{j2\pi fs}\,ds \int h(r)\, e^{-j2\pi fr}\,dr \int R_X(u)\, e^{-j2\pi fu}\,du
= H^*(f)\, H(f)\, S_X(f) = |H(f)|^2\, S_X(f)

Cross-correlation of input and output:

R_{XY}(\tau) = E[X(t)\,Y(t + \tau)] = E\left[X(t)\int X(t + \tau - r)\,h(r)\,dr\right] = \int R_X(\tau - r)\,h(r)\,dr = R_X(\tau) * h(\tau)

so that S_{XY}(f) = H(f)\, S_X(f). Since R_{YX}(\tau) = R_{XY}(-\tau), we obtain

S_{YX}(f) = S_{XY}^*(f) = H^*(f)\, S_X(f)
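A discrete-time sketch of the output-power relation implied by S_Y(f) = |H(f)|^2 S_X(f) (not from the slides): white noise (flat S_X = \sigma^2) through a 2-tap moving-average filter h = [0.5, 0.5] gives output power R_Y(0) = \sigma^2 \sum_n h[n]^2:

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.standard_normal(4_000_000)        # white input, sigma^2 = 1
h = np.array([0.5, 0.5])
y = np.convolve(x, h, mode="valid")       # filtered output

ry0_hat = y.var()                         # empirical R_Y(0)
ry0_theory = (h**2).sum()                 # 0.5, the area under |H(f)|^2 S_X(f)
print(ry0_hat, ry0_theory)
```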