Chapter 4
Probability
LEARNING OBJECTIVES
The main objective of Chapter 4 is to help you understand the basic principles of
probability, specifically enabling you to
1.
2.
3.
4.
Solve problems using the laws of probability, including the law of addition, the law of multiplication, and the law of conditional probability.
5.
An attempt has been made to differentiate the several types of probabilities so that students
can sort out the various types of problems.
In teaching students how to construct a probability matrix, emphasize that it is
usually best to place only one variable along each of the two dimensions of the matrix.
(That is, place Mastercard with yes/no on one axis and Visa with yes/no on the other
instead of trying to place Mastercard and Visa along the same axis.)
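The layout advice can be illustrated with a small sketch. The joint probabilities below are hypothetical, chosen only to show the one-variable-per-axis layout and how the marginals fall out of it.

```python
# Probability matrix with one variable per axis: Mastercard (yes/no) on
# one dimension, Visa (yes/no) on the other.  The joint probabilities
# here are hypothetical, chosen only to illustrate the layout.
matrix = {
    ("MC yes", "Visa yes"): 0.12,
    ("MC yes", "Visa no"):  0.18,
    ("MC no",  "Visa yes"): 0.13,
    ("MC no",  "Visa no"):  0.57,
}

# Marginal probabilities fall out of the layout: sum across a row or a column.
p_mc = sum(p for (mc, _), p in matrix.items() if mc == "MC yes")
p_visa = sum(p for (_, v), p in matrix.items() if v == "Visa yes")
print(round(p_mc, 2), round(p_visa, 2))   # 0.3 0.25
```

With both cards forced onto one axis, these row and column sums would have no probabilistic meaning, which is the point of the advice above.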
This particular chapter is very amenable to the use of visual aids. Students enjoy
rolling dice, tossing coins, and drawing cards as a part of the class experience.
Of all the chapters in the book, it is most imperative that students work a lot of
problems in this chapter. Probability problems are so varied and individualized that a
significant portion of the learning comes in the doing. Experience is an important factor
in working probability problems.
Section 4.8 on Bayes' theorem can be skipped in a one-semester course without
losing any continuity. This section is a prerequisite to the Chapter 18 presentation of
revising probabilities in light of sample information (Section 18.4).
CHAPTER OUTLINE
4.1 Introduction to Probability
4.2
4.3 Structure of Probability
    Experiment
    Event
    Elementary Events
    Sample Space
    Unions and Intersections
    Mutually Exclusive Events
    Independent Events
    Collectively Exhaustive Events
    Complementary Events
    Counting the Possibilities
    The mn Counting Rule
    Sampling from a Population with Replacement
    Combinations: Sampling from a Population Without Replacement
4.4
4.5 Addition Laws
    Probability Matrices
    Complement of a Union
    Special Law of Addition
4.6 Multiplication Laws
    General Law of Multiplication
    Special Law of Multiplication
4.7 Conditional Probability
    Independent Events
4.8 Bayes' Rule
KEY TERMS
A Priori
Bayes' Rule
Classical Method of Assigning Probabilities
Collectively Exhaustive Events
Combinations
Complement of a Union
Complementary Events
Conditional Probability
Elementary Events
Event
Experiment
Independent Events
Intersection
Joint Probability
Marginal Probability
mn Counting Rule
Mutually Exclusive Events
Probability Matrix
Relative Frequency of Occurrence
Sample Space
Set Notation
Subjective Probability
Union
Union Probability
4.1
The sample space for selecting two parts from {D1, D2, D3, A4, A5, A6}:

D1 D2,  D1 D3,  D2 D3,
D1 A4,  D2 A4,  D3 A4,
D1 A5,  D2 A5,  D3 A5,
D1 A6,  D2 A6,  D3 A6,
A4 A5,  A4 A6,  A5 A6

There are 15 members of the sample space. Nine of them contain exactly one
defective part (each Di paired with each Aj), so the probability of selecting
exactly one defect out of two is:

9/15 = .60
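The enumeration can be verified with a short script; this is a sketch using Python's `itertools`, with the D/A labels mirroring the solution above.

```python
from itertools import combinations

# Three defective (D) and three acceptable (A) parts, as in 4.1.
parts = ["D1", "D2", "D3", "A4", "A5", "A6"]

sample_space = list(combinations(parts, 2))
print(len(sample_space))        # 15 members

# Exactly one defect: the pair contains one D and one A.
one_defect = [s for s in sample_space
              if sum(p.startswith("D") for p in s) == 1]
print(len(one_defect), len(one_defect) / len(sample_space))   # 9 0.6
```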
4.2
4.3
If A = {2, 6, 12, 24} and the population is the positive even numbers through 30, then the complement of A is

A' = {4, 8, 10, 14, 16, 18, 20, 22, 26, 28, 30}
4.4
6(4)(3)(3) = 216
4.5
The sample space for selecting three parts from {D1, D2, A1, A2, A3, A4}:

D1 D2 A1,  D1 D2 A2,  D1 D2 A3,  D1 D2 A4,
D1 A1 A2,  D1 A1 A3,  D1 A1 A4,
D1 A2 A3,  D1 A2 A4,  D1 A3 A4,
D2 A1 A2,  D2 A1 A3,  D2 A1 A4,
D2 A2 A3,  D2 A2 A4,  D2 A3 A4,
A1 A2 A3,  A1 A2 A4,  A1 A3 A4,  A2 A3 A4

Combinations are used to count the sample space because sampling is done
without replacement.

6C3 = 6!/(3!3!) = 20

There are 20 members of the sample space and 12 of them have exactly 1 defective part.

12/20 = 3/5 = .60
4.6
4.7
20C6 = 20!/(6!14!) = 38,760
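Both combination counts can be checked directly; `math.comb` computes nCr. The exactly-one-defect count from 4.5 is included as a cross-check.

```python
from math import comb

# Combination counts from 4.5 and 4.7.
print(comb(6, 3))     # 20
print(comb(20, 6))    # 38760

# 4.5 cross-check: exactly one defective among 3 drawn from 2 D's and 4 A's.
favorable = comb(2, 1) * comb(4, 2)   # choose the one D, then two A's
print(favorable / comb(6, 3))         # 0.6
```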
4.9
        D     ?     ?
       12     ?     ?     25
       10     ?     ?     20
        1     ?     ?     15
       23    16    21     60

(Only the D column and the margins survived extraction; the D cell in the third row is recovered from the column total, 23 - 12 - 10 = 1. The remaining cells cannot be recovered.)
Chapter 4: Probability
4.10
a)
b)
c)
d)
4.11
        .10   .03    .13
        .04   .12    .16
        .27   .06    .33
        .31   .07    .38
        .72   .28   1.00

P(T) = .28
4.13
P(M) = .78
P(M ∩ L) = .61
Chapter 4: Probability
P(T ∩ F) = .35

                 F
              Yes     No
   T   Yes   .35     .09     .44
       No    .19     .37     .56
             .54     .46    1.00
4.15
        D     E     F
A      11    16    13     40
B       3     5     9     17
       14    21    22     57

a) P(A ∩ E) = 16/57 = .2807

b) P(D ∩ B) = 3/57 = .0526

c) P(D ∩ E) = .0000 (D and E are mutually exclusive)

d) P(A ∩ B) = .0000 (A and B are mutually exclusive)
4.16
        D      E      F
A      .12    .13    .08     .33
B      .18    .09    .04     .31
C      .06    .24    .06     .36
       .36    .46    .18    1.00

a) P(E ∩ B) = .09

b) P(C ∩ F) = .06

c) P(E ∩ D) = .00 (E and D are mutually exclusive)
4.17
a) (without replacement)

P(D1 ∩ D2) = P(D1) P(D2|D1) = (6/50)(5/49) = 30/2450 = .0122

b) (with replacement)

P(D1 ∩ D2) = P(D1) P(D2) = (6/50)(6/50) = 36/2500 = .0144

4.18
Let U = Urban
I = care for Ill relatives
a) P(U ∩ I) = P(U) P(I|U)

P(U) = .78
P(I) = .15
P(I|U) = .11

P(U ∩ I) = (.78)(.11) = .0858

b) P(U ∩ NI) = P(U) P(NI|U), but P(I|U) = .11,
so P(NI|U) = 1 - .11 = .89 and P(U ∩ NI) = P(U) P(NI|U) = (.78)(.89) = .6942
c)
                   U
                Yes      No
   I   Yes    .0858   .0642    .15
       No     .6942   .1558    .85
              .78     .22     1.00
The answer to a) is found in the Yes-Yes cell. To compute this cell, take 11%
(.11) of the total (.78) of people in urban areas: (.11)(.78) = .0858, which belongs in
the Yes-Yes cell. The answer to b) is found in the Yes for U, No for I cell.
It can be determined by taking the marginal, .78, less the answer for a): .78 - .0858 = .6942.

d) P(NU ∩ I) is found in the No for U column and the Yes for I row (first row,
second column). Take the marginal, .15, minus the Yes-Yes cell: .15 - .0858 = .0642.
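The cell-by-cell construction described above can be sketched as follows, using only the givens: the marginal P(U), the conditional P(I|U), and the marginal P(I).

```python
# Rebuilding the 4.18 matrix from the givens: marginal P(U) = .78,
# conditional P(I|U) = .11, and marginal P(I) = .15.
p_u, p_i, p_i_given_u = 0.78, 0.15, 0.11

p_u_and_i = p_u * p_i_given_u            # multiplication law: Yes-Yes cell
p_u_and_ni = p_u - p_u_and_i             # Yes for U, No for I
p_nu_and_i = p_i - p_u_and_i             # No for U, Yes for I
p_nu_and_ni = 1 - (p_u_and_i + p_u_and_ni + p_nu_and_i)

print(round(p_u_and_i, 4), round(p_nu_and_ni, 4))   # 0.0858 0.1558
```

Every cell comes from one multiplication and subtractions against the margins, which is exactly the hand computation in parts a) through d).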
4.19
Let S = stockholder
Let C = college
P(S) = .43
P(C) = .37
P(C|S) = .75
P(P) = .52
P(P|F) = .91
4.21
Let S = safety

NP     .091    .009     .10
NF     .429    .471     .90
       .520    .480    1.00
Let A = age

P(S) = .30
P(A) = .39
P(A|S) = .87
P(O) = .29
P(C|O) = .13
      15    12     8     35
      11    17    19     47
      21    32    27     80
      18    13    12     43
      65    74    66    205
      .36    .44     .80
      .11    .09     .20
      .47    .53    1.00
              Computer
             Yes     No
   Yes       46       3     49
   No        11      15     26
             57      18     75
4.26
Let C = construction
Let S = South Atlantic
83,384 total failures
10,867 failures in construction
8,010 failures in South Atlantic
1,258 failures in construction and South Atlantic
a) P(S) = 8,010/83,384 = .09606

b) P(C ∪ S) = P(C) + P(S) - P(C ∩ S)
   = 10,867/83,384 + 8,010/83,384 - 1,258/83,384 = 17,619/83,384 = .2113

c) P(C|S) = P(C ∩ S)/P(S) = (1,258/83,384)/(8,010/83,384) = 1,258/8,010 = .15705

d) P(S|C) = P(C ∩ S)/P(C) = (1,258/83,384)/(10,867/83,384) = 1,258/10,867 = .11576

e) P(NS|NC) = P(NS ∩ NC)/P(NC) = [1 - P(C ∪ S)]/[1 - P(C)]
   = (1 - .2113)/(1 - .13033) = .7887/.86967 = .9069

f) P(NS|C) = P(NS ∩ C)/P(C) = [P(C) - P(C ∩ S)]/P(C)
   = (10,867 - 1,258)/10,867 = 9,609/10,867 = .8842
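Parts b) through d) reduce to ratios of the raw counts; a quick sketch of that check:

```python
# 4.26 with raw counts: union and conditional probabilities are ratios of counts.
total = 83_384
c = 10_867        # construction failures
s = 8_010         # South Atlantic failures
c_and_s = 1_258   # failures in both

p_union = (c + s - c_and_s) / total   # part b), general addition law
p_c_given_s = c_and_s / s             # part c)
p_s_given_c = c_and_s / c             # part d)

print(round(p_union, 4))       # 0.2113
print(round(p_c_given_s, 5))   # 0.15705
print(round(p_s_given_c, 5))   # 0.11576
```

Note that the grand total cancels out of the conditional probabilities, which is why c) and d) simplify to 1,258/8,010 and 1,258/10,867.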
Let E = Economy
Let Q = Qualified

P(E) = .46
P(Q) = .37
P(E ∩ Q) = .15
P(T|A) = .81
Let H = hardware
Let S = software

P(H) = .37
P(S) = .54
P(S|H) = .97
Event   Prior    Conditional   Joint         Revised
        P(Ei)    P(D|Ei)       P(D ∩ Ei)     P(Ei|D)
A       .10      .05           .005          .005/.093 = .0538
B       .40      .12           .048          .048/.093 = .5161
C       .50      .08           .040          .040/.093 = .4301
                               P(D) = .093

4.31

Let
P(A) = .30    P(B) = .45    P(C) = .25
P(I|A) = .20  P(I|B) = .12  P(I|C) = .05
P(K|A) = .80  P(K|B) = .88  P(K|C) = .95

a) P(B) = .45

b) P(K|C) = 1 - P(I|C) = 1 - .05 = .95
c)
Event   Prior    Conditional   Joint          Revised
        P(Ei)    P(I|Ei)       P(I ∩ Ei)      P(Ei|I)
A       .30      .20           .0600          .0600/.1265 = .4743
B       .45      .12           .0540          .0540/.1265 = .4269
C       .25      .05           .0125          .0125/.1265 = .0988
                               P(I) = .1265
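Every revision table in this section follows the same three steps: joint = prior × conditional, total the joints, then divide each joint by the total. A sketch for part c):

```python
# Bayes revision, as tabulated in 4.31 c): joint = prior * conditional,
# sum the joints for P(I), then divide each joint by P(I).
priors = {"A": 0.30, "B": 0.45, "C": 0.25}
cond_i = {"A": 0.20, "B": 0.12, "C": 0.05}

joint = {e: priors[e] * cond_i[e] for e in priors}
p_i = sum(joint.values())
revised = {e: joint[e] / p_i for e in priors}

print(round(p_i, 4))                          # 0.1265
print({e: round(r, 4) for e, r in revised.items()})
```

The same two dictionaries, with the P(K|Ei) values substituted for `cond_i`, reproduce the second table.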
Event   Prior    Conditional   Joint          Revised
        P(Ei)    P(K|Ei)       P(K ∩ Ei)      P(Ei|K)
A       .30      .80           .2400          .2400/.8735 = .2748
B       .45      .88           .3960          .3960/.8735 = .4533
C       .25      .95           .2375          .2375/.8735 = .2719
                               P(K) = .8735

4.32

Let
P(T) = .72
P(V|G) = .20

Event   Prior    Conditional   Joint         Revised
        P(Ei)    P(V|Ei)       P(V ∩ Ei)     P(Ei|V)
A       .72      .30           .216          .216/.272 = .7941
B       .28      .20           .056          .056/.272 = .2059
                               P(V) = .272
4.33
Let T = training
Let S = small
P(T) = .65
P(S|T) = .18    P(NS|T) = .82
P(S|NT) = .75   P(NS|NT) = .25

Event   Prior    Conditional   Joint          Revised
        P(Ei)    P(NS|Ei)      P(NS ∩ Ei)     P(Ei|NS)
T       .65      .82           .5330          .5330/.6205 = .8590
NT      .35      .25           .0875          .0875/.6205 = .1410
                               P(NS) = .6205
4.34

                  Variable 2
                   D      E
Variable 1   A    10     20     30
             B    45     20     65
                  55     40     95

Is P(A) = P(A|D)?
P(A) = 30/95 = .31579
P(A|D) = 10/55 = .18182
Since .31579 ≠ .18182, Variables 1 and 2 are not independent.
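The independence test compares the marginal with the conditional, using the counts from 4.34:

```python
# 4.34 independence test: A and D are independent only if P(A) = P(A|D).
p_a = 30 / 95           # marginal probability of A
p_a_given_d = 10 / 55   # conditional probability of A given D

print(round(p_a, 5), round(p_a_given_d, 5))   # 0.31579 0.18182
print(p_a == p_a_given_d)                     # False -> not independent
```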
4.35
   D
  12    31    22
  10    25    21
  18    16    23
  78
Gender     <35    35-44   45-54   55-64    >65
Male       .11     .20     .19     .12     .16     .78
Female     .07     .08     .04     .02     .01     .22
           .18     .28     .23     .14     .17    1.00
a) P(35-44) = .28

b) P(Woman ∩ 45-54) = .04

c) P(Man ∪ 35-44) = P(Man) + P(35-44) - P(Man ∩ 35-44) = .78 + .28 - .20 = .86

d) P(<35 ∪ 55-64) = P(<35) + P(55-64) = .18 + .14 = .32

e) P(Woman|45-54) = P(Woman ∩ 45-54)/P(45-54) = .04/.23 = .1739

f) P(not Woman ∩ not 55-64) = .11 + .20 + .19 + .16 = .66
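Parts c) and d) illustrate the general and special addition laws; as a quick check with the probabilities used above:

```python
# General vs. special addition law with the gender-by-age probabilities
# used in parts c) and d).
p_man, p_3544, p_man_and_3544 = 0.78, 0.28, 0.20

# General law: subtract the overlap so it is not counted twice.
p_man_or_3544 = p_man + p_3544 - p_man_and_3544
print(round(p_man_or_3544, 2))   # 0.86

# Special law: <35 and 55-64 are mutually exclusive, so no overlap term.
p_u35_or_5564 = 0.18 + 0.14
print(round(p_u35_or_5564, 2))   # 0.32
```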
4.37
Let T = thoroughness
Let K = knowledge
P(T) = .78
P(K) = .40
P(T ∩ K) = .27

P(L) = .61
P(R ∩ L) = .33

P(T|NE) = .17
Let M = Mastercard

P(M) = .30
P(A) = .20
P(M ∩ A) = .08
P(V) = .25
P(V ∩ M) = .12
P(A ∩ V) = .06
a)
b)
4.42

<45     .189    .381     .57
>45     .301    .129     .43
        .490    .510    1.00
No
              Save
            Yes       No
Yes        .3483    .0817     .43
No         .1017    .4683     .57
           .4500    .5500    1.00
4.43
Let R = read
Let B = checked in with the boss

P(R) = .40
P(B) = .34
P(B|R) = .78

            B       NB
R         .312    .088     .40
NR        .028    .572     .60
          .340    .660    1.00
4.44
Let: D = denial
I = inappropriate
C = customer
P = payment dispute
S = specialty
G = delays getting care
R = prescription drugs
P(D) = .17
P(S) = .10
P(I) = .14
P(G) = .08
P(C) = .14
P(R) = .07
P(P) = .11
P(R|P) = .90
Let M = mail
S = sales
P(M) = .38    P(M ∩ S) = .0000
4.48

Event   Prior    Conditional   Joint          Revised
        P(Ei)    P(B|Ei)       P(B ∩ Ei)      P(Ei|B)
S       .41      .05           .0205          .0205/.0623 = .329
T       .32      .08           .0256          .0256/.0623 = .411
J       .27      .06           .0162          .0162/.0623 = .260
                               P(B) = .0623
Let R = regulations
    T = tax burden

P(R) = .30
P(T) = .35
P(T|R) = .71
Event             Prior    Conditional   Joint           Revised
                  P(Ei)    P(NS|Ei)      P(NS ∩ Ei)      P(Ei|NS)
Soup              .60      .73           .4380           .4380/.5180 = .8456
Breakfast Meats   .35      .17           .0595           .0595/.5180 = .1149
Hot Dogs          .05      .41           .0205           .0205/.5180 = .0396
                                         P(NS) = .5180

4.50

P(G|H) = .29
P(H|M) = .21
P(F|G) = .40