
UCLA EE131A (KY) 1

EE 131A Probability
Professor Kung Yao
Electrical Engineering Department
University of California, Los Angeles
Lecture 4-2
Conditional Probability is a Probability
P(A|B) satisfies all the axioms of probability.
Axiom I. 0 ≤ P(A|B) ≤ 1.
Axiom II. P(S|B) = 1.
Axiom III. If A_1, A_2, … are mutually exclusive (or disjoint) (i.e., A_i ∩ A_j = ∅ for all i ≠ j), then

P(∪_{i=1}^∞ A_i | B) = Σ_{i=1}^∞ P(A_i | B).

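As a quick numerical sanity check (a sketch, not part of the slides), the three axioms can be verified for a small finite sample space, using the definition P(A|B) = P(A ∩ B)/P(B):

```python
from itertools import combinations

# Finite sample space: a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}
P = lambda E: len(E) / len(S)          # uniform probability measure

def cond(A, B):
    """P(A|B) = P(A ∩ B) / P(B), assuming P(B) > 0."""
    return P(A & B) / P(B)

B = {2, 4, 6}                          # condition on "even"

# Axiom I: 0 <= P(A|B) <= 1 for every event A.
assert all(0 <= cond(set(A), B) <= 1
           for r in range(len(S) + 1) for A in combinations(S, r))

# Axiom II: P(S|B) = 1.
assert cond(S, B) == 1

# Axiom III (finite form): additivity over disjoint events.
A1, A2 = {2}, {4, 6}                   # disjoint events
assert abs(cond(A1 | A2, B) - (cond(A1, B) + cond(A2, B))) < 1e-12
```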
Conditional Probability (1)
Ex. 1. Binary Symmetric Channel (BSC)
Consider the simplest model for a binary digital communication channel, transmitting an input of either a 0 or a 1. Denote A_0 for a channel input of a 0 and A_1 for an input of a 1. The output can be either a 0 or a 1. Denote B_0 for a channel output of a 0 and B_1 for an output of a 1. Since the channel is imperfect (i.e., due to noise and other disturbances), sending any A_i, i = 0, 1, can result in any B_j, j = 0, 1.
Conditional Probability (2)
Let the probability of receiving B_0, given A_1 was sent, be denoted by the conditional probability

P(B_0|A_1) = c, (c small). (1)

The probability of receiving B_1 given A_1 was sent then becomes

P(B_1|A_1) = P(B_0^C|A_1) = 1 - c. (2)

The channel is symmetric if the probability of receiving B_1 given A_0 is

P(B_1|A_0) = c. (3)

Then the probability of receiving B_0 given A_0 becomes

P(B_0|A_0) = P(B_1^C|A_0) = 1 - c. (4)
[Figure: BSC diagram. Inputs A_0 = 0 and A_1 = 1; outputs B_0 = 0 and B_1 = 1. Each input is received correctly with probability 1-c and flipped with probability c.]
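The crossover probability c can be interpreted as a relative frequency, which a short simulation illustrates (a sketch with an illustrative value of c, not from the slides):

```python
import random

random.seed(0)
c = 0.05                     # true crossover probability (illustrative value)
n = 100_000                  # number of transmitted bits

def bsc(bit, c):
    """Binary symmetric channel: flip the bit with probability c."""
    return bit ^ 1 if random.random() < c else bit

# Send n zeros (input A0) and count how many arrive as 1 (output B1).
flips = sum(bsc(0, c) for _ in range(n))
c_hat = flips / n            # relative-frequency estimate of P(B1|A0)

assert abs(c_hat - c) < 0.01 # estimate should be near the true c
```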
Theorem on total probability (1)
Let {A_1, A_2, …, A_n} be n mutually exclusive events whose union is the sample space S. This set of events forms a partition of S.
For any event B,

B = B ∩ S = B ∩ (A_1 ∪ A_2 ∪ … ∪ A_n)
  = (B ∩ A_1) ∪ (B ∩ A_2) ∪ … ∪ (B ∩ A_n) = ∪_{i=1}^n (B ∩ A_i).

[Figure: a sample space S partitioned into A_1, A_2, …, A_8, …, A_n, with an event B overlapping A_3 through A_7.]

In the figure, B has intersection only with {A_3, A_4, A_5, A_6, A_7}, so

B = B ∩ S = (B ∩ A_3) ∪ (B ∩ A_4) ∪ (B ∩ A_5) ∪ (B ∩ A_6) ∪ (B ∩ A_7).
Theorem on total probability (2)
Since all the (B ∩ A_i) are mutually exclusive, we can use Axiom III to obtain

P(B) = P(∪_{i=1}^n (B ∩ A_i)) = Σ_{i=1}^n P(B ∩ A_i)
     = P(B ∩ A_1) + … + P(B ∩ A_n). (1)

Equation (1) is called the theorem on total probability. This result is needed in the development of the Bayes rule, to be discussed next.
Theorem on total probability (3)
Ex. 1. There are three urns. The first one has a white balls and b black balls. The second one has c white balls and d black balls. The third one has only one white ball. A ball is picked randomly from one of the urns. What is the probability that this ball will be white?
Denote the event W = {A white ball shows up}.
Let the event U_1 = {The first urn is picked};
U_2 = {The second urn is picked};
U_3 = {The third urn is picked}.
Theorem on total probability (4)
P(U_1) = P(U_2) = P(U_3) = 1/3; P(W|U_1) = a/(a+b);
P(W|U_2) = c/(c+d); P(W|U_3) = 1.
From the theorem on total probability,

P(W) = P(W|U_1)P(U_1) + P(W|U_2)P(U_2) + P(W|U_3)P(U_3)
     = (1/3)(a/(a+b)) + (1/3)(c/(c+d)) + (1/3)(1)
     = (1/3)(a/(a+b) + c/(c+d) + 1).
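The urn formula can be checked against a Monte Carlo simulation (a sketch with illustrative urn contents, not values from the slides):

```python
import random
from fractions import Fraction

random.seed(1)
a, b, c, d = 3, 2, 1, 4      # illustrative urn contents (not from the slides)

# Theorem on total probability: P(W) = sum_i P(W|U_i) P(U_i).
p_w = Fraction(1, 3) * (Fraction(a, a + b) + Fraction(c, c + d) + 1)

# Monte Carlo check: pick an urn uniformly, then a ball uniformly from it.
urns = [[1] * a + [0] * b, [1] * c + [0] * d, [1]]   # 1 = white, 0 = black
n = 200_000
hits = sum(random.choice(random.choice(urns)) for _ in range(n))

assert abs(hits / n - float(p_w)) < 0.01
```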
Bayes Rule (1)
Let {A_1, A_2, …, A_n} be a partition of a sample space S. Then Bayes' rule shows how the conditional prob. P(A_j|B) can be obtained from P(A_j) and P(B|A_j) as

P(A_j|B) = P(A_j ∩ B) / P(B)                           : Def. of cond. prob.
         = P(B|A_j)P(A_j) / P(B)                       : Def. of cond. prob.
         = P(B|A_j)P(A_j) / Σ_{k=1}^n P(B ∩ A_k)       : Th. of total prob.
         = P(B|A_j)P(A_j) / Σ_{k=1}^n P(B|A_k)P(A_k)   : Def. of cond. prob. (nothing new)
Bayes Rule (2)
Suppose {A_1, A_2, …, A_n} is a partition of all outcomes of an experiment. Furthermore (from prior experiments' relative-frequency data), we know P(A_j), j = 1, …, n. We call the P(A_j)'s the prior (or a priori) probabilities. Suppose we perform another experiment and observe an event B. Furthermore, from this experiment, we are able to evaluate P(B|A_j), j = 1, …, n. Bayes' rule gives the ability to find the reversed conditional probability P(A_j|B), called the a posteriori probability, using P(A_j) and P(B|A_j):

P(A_j|B) = P(A_j ∩ B)/P(B) = P(B|A_j)P(A_j)/P(B)
         = P(B|A_j)P(A_j) / Σ_{k=1}^n P(B|A_k)P(A_k).
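This prior-to-posterior computation is mechanical enough to write as a small helper (a sketch; the function name and example numbers are illustrative, not from the slides):

```python
def posteriors(priors, likelihoods):
    """Bayes' rule: P(A_j|B) = P(B|A_j)P(A_j) / sum_k P(B|A_k)P(A_k).

    priors      -- P(A_j) for a partition {A_1, ..., A_n}
    likelihoods -- P(B|A_j) for the observed event B
    """
    joint = [p * l for p, l in zip(priors, likelihoods)]
    p_b = sum(joint)                    # theorem on total probability
    return [j / p_b for j in joint]

# Example: two equally likely hypotheses; B is far more likely under A_2.
post = posteriors([0.5, 0.5], [0.1, 0.9])
assert abs(sum(post) - 1) < 1e-12       # posteriors form a distribution
assert abs(post[1] - 0.9) < 1e-12       # 0.45 / 0.5 = 0.9
```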
Bayes Rule (3)
Ex. 2. BSC (previous Ex. 1 continued).
Let the input data A_0 = 0 and A_1 = 1 be the partition of S with equal prior prob. P(A_0) = P(A_1) = 1/2. Suppose we can measure the quality of the BSC by sending, say, 10^6 0's and seeing how many show up as 1's. Suppose it is 350. Then we have

P(B_1|A_0) = 350/10^6 = c; P(B_0|A_0) = 1 - c.
Bayes Rule (4)
Ex. 2. BSC (continued).
If the channel is symmetric, then

P(B_0|A_1) = c; P(B_1|A_1) = 1 - c.

In practice, we can send 10^6 1's and see how many 0's show up. If we receive, say, 360 0's, we can probably assume the channel is symmetric. Now,

P(B_1) = P(B_1 ∩ S) = P(B_1 ∩ (A_0 ∪ A_1)) = P(B_1 ∩ A_0) + P(B_1 ∩ A_1)
       = P(B_1|A_0)P(A_0) + P(B_1|A_1)P(A_1) = c/2 + (1-c)/2 = 1/2.
P(B_0) = P(B_0|A_0)P(A_0) + P(B_0|A_1)P(A_1) = (1-c)/2 + c/2 = 1/2.
Bayes Rule (5)
Ex. 2. (BSC) Evaluate the two a posteriori probabilities:

P(A_0|B_1) = P(A_0 ∩ B_1)/P(B_1) = P(B_1|A_0)P(A_0)/P(B_1) = (c/2)/(1/2) = c.
P(A_1|B_1) = P(A_1 ∩ B_1)/P(B_1) = P(B_1|A_1)P(A_1)/P(B_1) = ((1-c)/2)/(1/2) = 1 - c.

A Posteriori Prob. Decision Rule
If B_1 is received, decide A_1 was sent if
P(A_1|B_1) > P(A_0|B_1).
Otherwise, decide A_0 was sent if
P(A_0|B_1) > P(A_1|B_1).
In Ex. 2, if c << 1/2, then clearly
P(A_1|B_1) = 1 - c >> c = P(A_0|B_1).
Thus, if B_1 is received, decide A_1 was sent.
Bayes Rule (6)
Ex. 3. Consider a radar system in the detection of either signal plus noise (S+N) or noise (N) only, where S denotes the event that the detector declares a signal. Denote P(S+N) = p; P(N) = 1 - p; P(S|S+N) = p_1; P(S|N) = p_2. What is the probability of S+N given S?

P(S+N|S) = P((S+N) ∩ S)/P(S) = P(S|S+N)P(S+N)/P(S)
         = P(S|S+N)P(S+N) / [P(S ∩ N) + P(S ∩ (S+N))]
         = P(S|S+N)P(S+N) / [P(S|N)P(N) + P(S|S+N)P(S+N)]
         = p_1 p / (p_2(1-p) + p_1 p).