
# EE302 Division 1

Homework 2 Solutions.
Problem 1. Prof. Pollak is flying from LA to Paris with two plane changes, in New York and London.
The probability of losing a piece of luggage is the same, p, in LA, NY, and London. Having arrived in
Paris, Prof. Pollak discovers that his suitcase is lost. Calculate the conditional probabilities that the
suitcase was lost in LA, NY, and London, respectively.
Solution. There are four outcomes in the sample space of this experiment carried out by Prof. Pollak:
NL: Suitcase was not lost
LD: Suitcase was lost in London
NY: Suitcase was lost in New York
LA: Suitcase was lost in LA
They can be described by a sequential tree, shown in Fig. 1: at each of the three stops the suitcase is lost with probability p and carried on with probability 1 - p, so the four leaves have probabilities P({LA}) = p, P({NY}) = (1 - p)p, P({LD}) = (1 - p)^2 p, and P({NL}) = (1 - p)^3.

Figure 1: Sequential tree description for Problem 1.

We can see that
P({suitcase was lost}) = 1 - P({suitcase was not lost}) = 1 - P({NL}) = 1 - (1 - p)^3 = p^3 - 3p^2 + 3p = p(p^2 - 3p + 3).
Since the event {suitcase was lost in LA} is a subset of the event {suitcase was lost}, the intersection
of these two events is {suitcase was lost in LA}. The same reasoning applies to the events {suitcase
was lost in NY} and {suitcase was lost in London}. Using the definition of conditional probability, we
therefore have that the conditional probabilities of having lost the suitcase in LA, NY, and London, are:
P({suitcase was lost in LA | suitcase was lost}) = P({LA}) / P({suitcase was lost}) = p / (1 - (1 - p)^3) = 1 / (p^2 - 3p + 3);

P({suitcase was lost in NY | suitcase was lost}) = P({NY}) / P({suitcase was lost}) = (1 - p)p / (1 - (1 - p)^3) = (1 - p) / (p^2 - 3p + 3);

P({suitcase was lost in London | suitcase was lost}) = P({LD}) / P({suitcase was lost}) = (1 - p)^2 p / (1 - (1 - p)^3) = (1 - p)^2 / (p^2 - 3p + 3).
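As a sanity check, these conditional probabilities can be verified with a small Monte Carlo simulation. This is only a sketch: the function name, the value p = 0.3, the trial count, and the seed are all arbitrary choices, not part of the problem.

```python
import random

def lost_city_frequencies(p, trials=200_000, seed=0):
    """Simulate the three stops and tally where the suitcase was lost,
    conditioned on it being lost at all."""
    rng = random.Random(seed)
    counts = {"LA": 0, "NY": 0, "LD": 0}
    for _ in range(trials):
        for city in ("LA", "NY", "LD"):  # order of the stops
            if rng.random() < p:         # lost at this stop
                counts[city] += 1
                break
    lost = sum(counts.values())
    return {city: n / lost for city, n in counts.items()}

p = 0.3                                  # arbitrary value for the check
est = lost_city_frequencies(p)
denom = p**2 - 3*p + 3
exact = {"LA": 1/denom, "NY": (1-p)/denom, "LD": (1-p)**2/denom}
```

The empirical frequencies in `est` should agree with the closed-form values in `exact` to within Monte Carlo noise.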

Problem 2. To encourage Elmer's promising tennis career, his father offers him a prize if he wins two
tennis sets in a row in a three-set series to be played with his father and the club champion alternately:
father-champion-father, or champion-father-champion, according to Elmer's choice. The champion is
a better player than Elmer's father. Which series should Elmer choose? (Assume that the results of
the three tennis sets are independent.)
Solution. There are 2^3 = 8 outcomes in the sample space of all possible results of a three-set series:

Ω = {WWW, WWL, WLW, WLL, LWW, LWL, LLW, LLL}.

The event {win two sets in a row} = {WWW, WWL, LWW}. Suppose Elmer has probabilities f and
c of winning against his father and the champion, respectively. Since the champion is a stronger player,
f > c. If Elmer chooses to play the father-champion-father series, the probability of winning the prize is:

P({win two sets in a row}) = P({WWW, WWL, LWW})
= P({WWW}) + P({WWL}) + P({LWW})
= fcf + fc(1 - f) + (1 - f)cf
= fc(2 - f).

Similarly, if he chooses to play champion-father-champion, the probability of winning is cf(2 - c),
which is larger than fc(2 - f) because c < f implies 2 - c > 2 - f. Therefore he should choose the
champion-father-champion series.
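The algebra above is easy to double-check numerically. The sketch below recomputes both winning probabilities by summing over the favorable outcomes and confirms the comparison; the particular values of f and c are arbitrary illustrations with f > c.

```python
def win_prob(p1, p2, p3):
    # P(win two sets in a row) = P(WWW) + P(WWL) + P(LWW)
    return p1*p2*p3 + p1*p2*(1 - p3) + (1 - p1)*p2*p3

for f in (0.4, 0.6, 0.9):          # vs. father (easier opponent)
    for c in (0.1, 0.3):           # vs. champion (harder opponent)
        fcf = win_prob(f, c, f)    # father-champion-father
        cfc = win_prob(c, f, c)    # champion-father-champion
        assert abs(fcf - f*c*(2 - f)) < 1e-12
        assert abs(cfc - c*f*(2 - c)) < 1e-12
        assert cfc > fcf           # champion-father-champion is always better
```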
Problem 3. The face EGH of the tetrahedron FEGH is painted in three colors: red, green, and blue.
The face EFH is painted red. The face HFG is painted green. The face GFE is painted blue. Define
the following events:

Ar = {a face picked at random has red on it}
Ag = {a face picked at random has green on it}
Ab = {a face picked at random has blue on it}
Are Ar , Ag , Ab pairwise independent? Are they independent?
Solution. Since each of the three colors appears on two out of the four equally likely faces, we have:
P(Ar ) = P(Ag ) = P(Ab ) = 2/4 = 0.5.

To check for pairwise independence, consider pairwise intersections of the three events. The probability
that a randomly picked face will have any two colors on it is the same as the probability that this face
is EGH because EGH has all three colors, and it is the only face with more than one color. Therefore,
P(Ar ∩ Ag) = P({EGH}) = 1/4 = P(Ar)P(Ag),
P(Ar ∩ Ab) = P({EGH}) = 1/4 = P(Ar)P(Ab),
P(Ab ∩ Ag) = P({EGH}) = 1/4 = P(Ab)P(Ag).
Hence these events are pairwise independent. However, they are not independent because
P(Ar ∩ Ag ∩ Ab) = P({EGH}) = 1/4 ≠ 1/8 = P(Ar)P(Ag)P(Ab).
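These independence checks can also be done by brute-force enumeration of the four equally likely faces. A sketch; the face/color table below is read directly off the problem statement, while the helper names are made up for the example.

```python
from itertools import combinations

# colors painted on each face (EGH carries all three)
faces = {
    "EGH": {"red", "green", "blue"},
    "EFH": {"red"},
    "HFG": {"green"},
    "GFE": {"blue"},
}

def prob(event):
    # each of the four faces is equally likely to be picked
    return sum(1 for f in faces if f in event) / len(faces)

A = {c: {f for f, cols in faces.items() if c in cols}
     for c in ("red", "green", "blue")}

# pairwise independence: P(Ai ∩ Aj) = P(Ai)P(Aj) for every pair
for c1, c2 in combinations(A, 2):
    assert prob(A[c1] & A[c2]) == prob(A[c1]) * prob(A[c2])

# but not mutual independence: the triple intersection has probability 1/4, not 1/8
assert prob(A["red"] & A["green"] & A["blue"]) == 0.25
assert prob(A["red"]) * prob(A["green"]) * prob(A["blue"]) == 0.125
```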

Problem 4. A box contains two fair coins and one biased coin. For the biased coin, the probability
that any flip will result in a head is 1/3. Al draws two coins from the box at random, flips each of
them once, observes one head and one tail, and returns the coins to the box. Bo then draws one coin
from the box at random and flips it. The result is a tail. Determine the probability that neither Al
nor Bo removed the biased coin from the box.
Solution. The two experiments performed by Al and Bo are independent. We can therefore calculate
the probability that the biased coin is not removed by Al and that it is not removed by Bo, and then
multiply the two probabilities. Both individual probabilities are computed using Bayes' rule. For Al's
experiment, we need the conditional probability that both coins he picked were fair, given that he
observed one head and one tail:
P({both coins are fair | H and T}) = P({H and T | both coins are fair}) P({both coins are fair}) / P({H and T})
= (1/2 · 1/3) / (1/2) = 1/3.   (1)

The explanation for these numbers is:

- Regardless of which two coins Al picked, his probability of observing one H and one T is 1/2. If both
coins are fair, then the probability that the first coin results in H and the second one in T is 1/4,
and the probability that the first one results in T and the second one in H is also 1/4, producing
a total probability of 1/4 + 1/4 = 1/2 of getting one H and one T. If one of the coins is biased,
the probability that it will result in H and the other coin will result in T is 1/3 · 1/2 = 1/6;
the probability that the biased coin will result in T and the other in H is 2/3 · 1/2 = 1/3. This
also gives a total probability of 1/6 + 1/3 = 1/2 of observing one H and one T. Therefore,
P({H and T | both coins are fair}) = P({H and T}) = 1/2.
- There are three equally likely ways of choosing two coins out of a set of three, because each coin
is equally likely not to be chosen. The probability that both of Al's coins are fair is the probability
that the biased coin is not chosen, which is 1/3.

Bo's probability is computed similarly:

P({fair | T}) = P({T | fair}) P({fair}) / P({T})
= P({T | fair}) P({fair}) / [P({T | fair}) P({fair}) + P({T | biased}) P({biased})]
= (1/2 · 2/3) / (1/2 · 2/3 + 2/3 · 1/3) = 3/5.   (2)

Multiplying the two probabilities computed in Eqs. (1) and (2), we get that the conditional probability
that the biased coin was not removed, given the observations, is 1/3 · 3/5 = 1/5.
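A Monte Carlo check of the full experiment (Al's two draws and flips, then Bo's draw and flip, conditioning jointly on the two observations) should land near 1/5. This is a sketch; the trial count and seed are arbitrary.

```python
import random

def trial(rng):
    heads_prob = [0.5, 0.5, 1/3]        # coin index 2 is the biased one
    al = rng.sample(range(3), 2)        # Al draws two coins at random
    flips = [rng.random() < heads_prob[i] for i in al]
    if sum(flips) != 1:                 # condition: exactly one head, one tail
        return None
    bo = rng.randrange(3)               # coins are returned; Bo draws one
    if rng.random() < heads_prob[bo]:   # condition: Bo observes a tail
        return None
    return 2 not in al and bo != 2      # was the biased coin never removed?

rng = random.Random(1)
outcomes = [r for r in (trial(rng) for _ in range(400_000)) if r is not None]
estimate = sum(outcomes) / len(outcomes)
```

`estimate` should be close to the exact answer 1/5 = 0.2.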
Problem 5. Die A has five olive faces and one lavender face; die B has three faces of each of these
colors. A fair coin is flipped once. If it falls heads, the game continues by throwing die A alone; if
it falls tails, die B alone is used to continue the game. However awful their face colors may be, it is
known that both dice are fair.
(a) Determine the probability that the n-th throw of the die results in olive.
(b) Determine the probability that both the n-th and (n + 1)-st throws of the die result in olive.
(c) If olive readings result from all the first n throws, determine the conditional probability of an
olive outcome on the (n + 1)-st throw. Interpret your result for large values of n.
Solution. Let H = {coin falls heads}, T = {coin falls tails}, and On = {n-th throw of the die is olive}.
(a) By the total probability theorem, we have:
P(On) = P(On | H)P(H) + P(On | T)P(T).   (3)

Given that H happens, die A is being thrown and P(On |H) = 5/6. Otherwise T must occur
and P(On |T ) = 1/2 because die B is being thrown. Therefore it follows from Eq. (3) that
P(On) = (5/6)(1/2) + (1/2)(1/2) = 2/3.

(b) This is another application of the total probability theorem:

P(On ∩ On+1) = P(On ∩ On+1 | H)P(H) + P(On ∩ On+1 | T)P(T).   (4)

Given the result of the coin flip, a die is chosen and the subsequent throws of that die are conditionally
independent. This conditional independence means that

P(On ∩ On+1 | H) = P(On | H)P(On+1 | H) = (5/6)^2,
P(On ∩ On+1 | T) = P(On | T)P(On+1 | T) = (1/2)^2.

From Eq. (4), we have:

P(On ∩ On+1) = (5/6)^2 (1/2) + (1/2)^2 (1/2) = 17/36.
Note that it would be wrong to compute this probability by squaring the result from Part (a): while
getting an olive on the n-th throw and an olive on the (n + 1)-st throw are two events that are conditionally independent, conditioned on our knowledge of which die we are throwing, these events are not
unconditionally independent. Indeed, if we do not know whether we are throwing die A or die B, an olive
reading makes it more likely that we are throwing die A, and therefore it becomes more likely that the
result of the next throw will also be olive.

(c) We need to calculate the following conditional probability:

P(On+1 | O1 ∩ ··· ∩ On) = P(O1 ∩ ··· ∩ On+1) / P(O1 ∩ ··· ∩ On).   (5)

By the total probability theorem and conditional independence, for any m > 0 we have:

P(O1 ∩ ··· ∩ Om) = P(O1 ∩ ··· ∩ Om | H)P(H) + P(O1 ∩ ··· ∩ Om | T)P(T)
= (5/6)^m (1/2) + (1/2)^m (1/2).

Substituting this into the right-hand side of Eq. (5), we get (the factors of 1/2 cancel):

P(On+1 | O1 ∩ ··· ∩ On) = [(5/6)^(n+1) + (1/2)^(n+1)] / [(5/6)^n + (1/2)^n].

To interpret this for large values of n, we multiply both the numerator and the denominator by
(6/5)^n and let n go to infinity:

lim_{n→∞} P(On+1 | O1 ∩ ··· ∩ On) = lim_{n→∞} [5/6 + (1/2)(6/10)^n] / [1 + (6/10)^n] = 5/6.

This result makes sense because if we see a long sequence of olives, it is much more likely that
we are throwing die A, which would make the probability that the next reading is olive 5/6.
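Both the closed form in Part (c) and its limit can be checked numerically. The sketch below also re-estimates the conditional probability for n = 3 by direct simulation; the function names, trial count, and seed are arbitrary choices for the check.

```python
import random

def cond_olive(n):
    # closed form: P(O_{n+1} | first n throws all olive)
    return ((5/6)**(n+1) + (1/2)**(n+1)) / ((5/6)**n + (1/2)**n)

def cond_olive_mc(n, trials=300_000, seed=2):
    rng = random.Random(seed)
    given, also_next = 0, 0
    for _ in range(trials):
        p = 5/6 if rng.random() < 0.5 else 1/2   # heads -> die A, tails -> die B
        throws = [rng.random() < p for _ in range(n + 1)]
        if all(throws[:n]):                      # first n throws are olive
            given += 1
            also_next += throws[n]               # is the (n+1)-st olive too?
    return also_next / given

assert abs(cond_olive(0) - 2/3) < 1e-12          # n = 0 recovers Part (a)
assert abs(cond_olive(50) - 5/6) < 1e-4          # approaches the limit 5/6
assert abs(cond_olive_mc(3) - cond_olive(3)) < 0.01
```

The conditional probability increases with n toward 5/6, matching the interpretation that a long olive run makes die A ever more likely.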

Problem 6. Each link of the communication network connecting towns A and B has a probability of 0.5 of being out of service. Towns A and B can communicate as long as they are
connected in the communication network by at least one path which contains only in-service links.
Determine, in an efficient manner, the probability that A and B can communicate.
Solution. As discussed in class, a convenient method of solving such problems is to break down
the system into smaller subsystems and apply the rules for parallel and series connections which are
illustrated in Fig. 2. The probabilities p1, ..., pn in the figure are the failure probabilities of the corresponding links.
Figure 2: (a) Parallel and (b) series connections of n links with failure probabilities p1, ..., pn (Problem 6).
For the parallel case, the probability of failure of the entire link is the probability that all the links
fail, which is:

p = p1 p2 ··· pn.

For the series case, the probability that the entire link is OK is the probability that all the links are
OK:

1 - p = (1 - p1)(1 - p2) ··· (1 - pn),

Figure 3: Reduction process of Problem 6, panels (a) through (f). Each number in the figure is the failure probability of the corresponding equivalent link; the intermediate values include 1 - (1 - 0.5)(1 - 0.5^2) = 5/8, then 0.5 · 5/8 = 5/16, 51/128, and finally 51/256.

where p is the probability of failure of the entire link. Based on the two equations above, the original
network in this problem can be reduced step by step, as illustrated in Fig. 3, from (a) to (f). Each
number in the figure is the probability of failure of the corresponding link. For example, in order to
get from the original network depicted in Fig. 3(a) to the equivalent network of Fig. 3(b), we replace
the two parallel connections of links enclosed in dashed boxes with equivalent links. The first one
contains two parallel links whose probabilities of failure are 1/2, which means that the probability of
failure of the entire link is 1/4. Similarly, the second one contains three parallel links, to result in the
probability of failure 1/8. To go from Fig. 3(b) to Fig. 3(c), we replace the boxed series connection with
an equivalent link. Since the probabilities of failure of the two links are 1/2 and 1/4, the probabilities
of success are 1/2 and 3/4, respectively. In order for the whole link to be OK, both components have
to be OK, which means that the probability of success of the equivalent link is 1/2 · 3/4 = 3/8, and
its probability of failure is 1 - 3/8 = 5/8.
Proceeding to combine the links in a similar fashion, we finally get to Fig. 3(f) which gives us the
probability of failure of the entire link from A to B, which is equal to 51/256. Therefore the probability
of success is:
1 - 51/256 = 205/256 ≈ 0.8008.
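The two reduction rules are one-liners, and the intermediate values quoted above can be checked with them. This is a sketch: the helper names are made up, and only the steps explicitly mentioned in the text are reproduced, since the full network topology lives in the (lost) figure.

```python
from math import prod

def parallel(*p_fail):
    # parallel links fail only if all of them fail
    return prod(p_fail)

def series(*p_fail):
    # a series chain works only if every link works
    return 1 - prod(1 - p for p in p_fail)

assert parallel(0.5, 0.5) == 1/4           # Fig. 3(a) -> (b), two parallel links
assert parallel(0.5, 0.5, 0.5) == 1/8      # Fig. 3(a) -> (b), three parallel links
assert series(0.5, 0.25) == 5/8            # Fig. 3(b) -> (c)
assert 1 - 51/256 == 205/256               # success probability of the whole network
```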

Problem 7. A hiker leaves the point O, choosing one of the roads OB, OC, OD, OE at random. At
each subsequent crossroads he again chooses a road at random. What is the probability of the hiker
arriving at the point A?

Solution. By the total probability theorem:
P(A) = P(A|B)P(B) + P(A|C)P(C) + P(A|D)P(D) + P(A|E)P(E)
= (1/3)(1/4) + (1/2)(1/4) + 1 · (1/4) + (2/5)(1/4)
= 10/120 + 15/120 + 30/120 + 12/120
= 67/120.
(Here, we denote the event that the hiker visits the point A, B, C, D, E, by the same letter as the
point itself.)
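The arithmetic can be confirmed with exact fractions. A sketch; the four conditional probabilities are read off the solution above, with each first road chosen with probability 1/4.

```python
from fractions import Fraction as F

# P(A | first road chosen), taken from the solution above
cond = {"B": F(1, 3), "C": F(1, 2), "D": F(1, 1), "E": F(2, 5)}
p_A = sum(F(1, 4) * p for p in cond.values())   # each road picked w.p. 1/4
assert p_A == F(67, 120)
```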