
Bachelor in Civil Engineering
Theory of Probability
Earthquake Resistant Design of Structures

Contents
1. Introduction
2. Simple Definitions
3. Types of Probability
4. Theorems of Probability
5. Probabilities under conditions of statistically independent events
6. Probabilities under conditions of statistically dependent events
7. Bayes' Theorem
8. Glossary of Terms
Introduction
If an experiment is repeated under essentially homogeneous and similar conditions, we generally come across two types of situations:

Deterministic/Predictable: the result, usually known as the outcome, is unique or certain.
Example: the velocity v of a particle after time t is given by
v = u + at
The equation uniquely determines v if the right-hand quantities are known.
Unpredictable/Probabilistic: the result is not unique but may be one of several possible outcomes.

Examples:

(i) In tossing a coin, one is not sure whether a head or a tail will be obtained.

(ii) If a light tube has lasted for t hours, nothing can be said about its further life; it may fail to function at any moment.
Simple Definitions
Trial & Event

Example: consider an experiment which, though repeated under essentially identical conditions, does not give unique results but may result in any one of several possible outcomes.

The experiment is known as a Trial and the outcomes are known as Events or Cases.
Throwing a die is a trial, and getting 1 (or 2, 3, …, 6) is an event.
Tossing a coin is a trial, and getting a head (H) or tail (T) is an event.
Exhaustive Events: the total number of possible outcomes in any trial.
In tossing a coin there are 2 exhaustive cases: head and tail.
In throwing a die there are 6 exhaustive cases, since any one of the 6 faces 1, 2, …, 6 may come uppermost.

Experiment                                        | Collectively exhaustive events                  | Exhaustive no. of cases
Tossing an unbiased coin                          | Head / Tail                                     | 2
Throw of an unbiased cubic die                    | 1, 2, 3, 4, 5, 6                                | 6
Drawing a card from a well-shuffled standard pack | Any of the 52 cards (Ace to King of each suit)  | 52
Favorable Events/Cases: the number of outcomes which entail the happening of an event.
In a throw of 2 dice, the number of cases favorable to getting the sum 5 is 4:
(1,4), (4,1), (2,3), (3,2).
In drawing a card from a pack of cards, the number of cases favorable to drawing an ace is 4, to drawing a spade is 13, and to drawing a red card is 26.
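As a quick illustration (not part of the original slides), a few lines of Python can enumerate the cases favorable to a sum of 5 with two dice:

```python
from itertools import product

# All 36 equally likely outcomes when two dice are thrown
outcomes = list(product(range(1, 7), repeat=2))

# Outcomes favorable to the event "the sum is 5"
favorable = [o for o in outcomes if sum(o) == 5]

print(favorable)        # [(1, 4), (2, 3), (3, 2), (4, 1)]
print(len(favorable))   # 4 favorable cases out of 36
```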

Independent Events: events are independent if the happening (or non-happening) of one event is not affected by supplementary knowledge concerning the occurrence of any number of the remaining events.
In tossing an unbiased coin, the event of getting a head in the first toss is independent of getting a head in the second, third and subsequent tosses.
Mutually Exclusive Events: events are mutually exclusive if the happening of any one of them precludes the happening of all the others.
In tossing a coin, the events head and tail are mutually exclusive.
In throwing a die, all 6 faces numbered 1 to 6 are mutually exclusive, since if any one of these faces comes up, the possibility of the others in the same trial is ruled out.

Experiment                                        | Mutually exclusive events
Tossing an unbiased coin                          | Head / Tail
Throw of an unbiased cubic die                    | Occurrence of 1 or 2 or 3 or 4 or 5 or 6
Drawing a card from a well-shuffled standard pack | Card is a spade or a heart; card is a diamond or a club; card is a king or a queen
Equally Likely Events: outcomes of a trial are said to be equally likely if, taking into consideration all the relevant evidence, there is no reason to expect one in preference to the others.
In tossing an unbiased (uniform) coin, head and tail are equally likely events.
In throwing an unbiased die, all 6 faces are equally likely to come up.

Experiment                                        | Equally likely events
Tossing an unbiased coin                          | A head is as likely to come up as a tail
Throw of an unbiased cubic die                    | Any number out of 1, 2, 3, 4, 5, 6 is equally likely to come up
Drawing a card from a well-shuffled standard pack | Any card out of the 52 is equally likely to come up
Probability: the probability of a given event is an expression of the likelihood of occurrence of that event.
Probability is a number which ranges from 0 to 1: zero (0) for an event which cannot occur and 1 for an event which is certain to occur.

Importance of the concept of probability

Probability models can be used for making predictions.
Probability theory facilitates the construction of econometric models.
It facilitates managerial decisions on planning and control.
Types of Probability
There are 3 approaches to probability, namely:

1. The Classical or a priori probability
2. The Statistical or Empirical probability
3. The Axiomatic probability
Mathematical/Classical/A Priori Probability
The basic assumption of the classical approach is that the outcomes of a random experiment are equally likely.

According to Laplace, a French mathematician, probability is the ratio of the number of favorable cases to the total number of equally likely cases.

If the probability of occurrence of an event E is denoted by p = P(E), then by this definition:

                 Number of favorable cases            m
p = P(E) = ---------------------------------------- = ---
            Total number of equally likely cases       n
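As an illustrative sketch only (the function name and examples are my own, not from the slides), the classical definition can be written directly in Python:

```python
from fractions import Fraction

def classical_probability(favorable: int, total: int) -> Fraction:
    """Classical (a priori) probability: favorable cases over equally likely cases."""
    return Fraction(favorable, total)

print(classical_probability(4, 52))   # P(drawing an ace) = 1/13
print(classical_probability(1, 6))    # P(a given face of a fair die) = 1/6
```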
The probability p of the happening of an event is also known as the probability of success, and q, the probability of the non-happening of the event, as the probability of failure.

If P(E) = 1, E is called a certain event, and
if P(E) = 0, E is called an impossible event.

The probability of an event E is a number such that 0 ≤ P(E) ≤ 1, and the sum of the probability that an event will occur and the probability that it will not occur is equal to 1,
i.e., p + q = 1.
Classical probability is often called a priori probability because, as long as one keeps to orderly examples such as unbiased dice or fair coins, one can state the answer in advance (a priori) without rolling a die, tossing a coin, etc.

Limitations of the classical definition
The classical definition of probability is not very satisfactory for the following reasons:
It fails when the number of possible outcomes of the experiment is infinite.
It is based on cases which are equally likely, and as such cannot be applied to experiments where the outcomes are not equally likely.
It may not be possible in practice to enumerate all the possible outcomes of certain experiments, and in such cases the method fails.

Example: it is inadequate for answering questions such as "What is the probability that a man aged 45 will die within the next year?"

Here there are only 2 possible outcomes: the individual will die in the ensuing year or he will live.
The chance that he will die is of course much smaller than the chance that he will live.

How much smaller?
Relative/Statistical/Empirical Probability
The probability of an event is determined objectively by repetitive empirical observations/experiments. Probabilities are assigned a posteriori.

According to von Mises: if an experiment is performed repeatedly under essentially homogeneous and identical conditions, then the limiting value of the ratio of the number of times the event occurs to the number of trials, as the number of trials becomes indefinitely large, is called the probability of happening of the event, it being assumed that the limit is finite and unique.

Example: when a coin is tossed, what is the probability that the coin will turn up heads?
Suppose the coin is tossed 50 times and it falls heads 20 times; then the ratio 20/50 is used as an estimate of the probability of heads for this coin.
Symbolically, if in N trials an event E happens m times, then the probability p of the happening of E is given by

p = P(E) = lim (N → ∞) m/N

In this case, as the number of trials increases, the observed relative frequencies of outcomes move closer to the true probabilities, and tend to the true probabilities as the number of trials tends to infinity (a very large number).

The empirical probability approaches the classical probability as the number of trials becomes indefinitely large.
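A minimal simulation sketch (illustrative only; the function name and trial counts are assumptions, not from the slides) showing the relative frequency m/N settling toward the true probability:

```python
import random

def empirical_probability(trials: int, p_heads: float = 0.5) -> float:
    """Estimate P(heads) as the relative frequency m/N over simulated tosses."""
    heads = sum(random.random() < p_heads for _ in range(trials))
    return heads / trials

for n in (50, 1_000, 100_000):
    print(n, empirical_probability(n))   # estimates drift toward 0.5 as n grows
```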
Limitations of the Statistical/Empirical method
The empirical probability P(A) defined earlier can never be obtained in practice; we can only attempt a close estimate of P(A) by making N sufficiently large.

The experimental conditions may not remain essentially homogeneous and identical over a large number of repetitions of the experiment.

The relative frequency m/N may not attain a unique value, no matter how large N may be.
The Axiomatic Approach
The modern theory of probability is based on the axiomatic approach introduced by the Russian mathematician A. N. Kolmogorov in the 1930s.

The classical approach restricts the calculation of probability to essentially equally likely and mutually exclusive events.

The empirical approach requires that every question be examined experimentally under identical conditions, over a long period of time, considering repeated observations.

The axiomatic approach is largely free from the inadequacies of both the classical and empirical approaches.
Given the sample space S of a random experiment, the probability of the occurrence of any event A is defined as a set function P(A) satisfying the following axioms.
1. Axiom 1: P(A) is defined, real and non-negative, i.e., P(A) ≥ 0 (axiom of non-negativity).
2. Axiom 2: P(S) = 1 (axiom of certainty).
3. Axiom 3: If A1, A2, …, An is any finite or infinite sequence of disjoint events of S, then

   P(A1 ∪ A2 ∪ … ∪ An) = P(A1) + P(A2) + … + P(An)
The Objective and Subjective Approaches
The objective approach to probability is arrived at on an empirical basis, not on the basis of opinion.
It is given by the ratio of the frequency of an outcome to the total number of possible outcomes.

The subjective approach to probability is not concerned with the relative or expected frequency of an outcome.
It is concerned with the strength of a decision maker's belief that an outcome will occur.
It is particularly oriented towards decision-making situations.
Theorems of Probability
There are 2 important theorems of probability
which are as follows:

The Addition Theorem and
The Multiplication Theorem
Addition theorem when events are mutually exclusive
Definition: if 2 events A and B are mutually exclusive, then the probability of the occurrence of either A or B is the sum of the individual probabilities of A and B.
Symbolically:

P(A or B) = P(A ∪ B) = P(A) + P(B)

The theorem can be extended to three or more mutually exclusive events. Thus:

P(A or B or C) = P(A) + P(B) + P(C)
Addition theorem when events are not mutually exclusive (overlapping or intersecting events)
Definition: if 2 events A and B are not mutually exclusive, then the probability of the occurrence of either A or B is the sum of the individual probabilities of A and B minus the probability of the occurrence of both A and B.
Symbolically:

P(A or B) = P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
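A short illustrative check of the overlapping case (the card example here is my own choice, not from the slides):

```python
from fractions import Fraction

# P(ace or spade) from a 52-card pack; the events overlap in the ace of spades
p_ace, p_spade, p_ace_of_spades = Fraction(4, 52), Fraction(13, 52), Fraction(1, 52)

p_ace_or_spade = p_ace + p_spade - p_ace_of_spades   # addition theorem for non-exclusive events
print(p_ace_or_spade)   # 4/13, i.e. 16 favorable cards out of 52
```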
Multiplication theorem
Definition: if 2 events A and B are independent, then the probability of the occurrence of both of them (A and B) is the product of the individual probabilities of A and B.
Symbolically, the probability of the happening of both events is:

P(A and B) = P(A ∩ B) = P(A) × P(B)

The theorem can be extended to 3 or more independent events. Thus:

P(A, B and C) = P(A ∩ B ∩ C) = P(A) × P(B) × P(C)
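An illustrative sketch (the coin-and-die example is my own, not from the slides) of the multiplication theorem for independent events:

```python
p_head = 0.5        # fair coin
p_six = 1 / 6       # fair die

# Independent events: the joint probability is the product of the marginals
p_head_and_six = p_head * p_six
print(round(p_head_and_six, 4))   # 0.0833, i.e. 1/12
```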
How to calculate probability in the case of dependent events
(A' denotes "not A" and B' denotes "not B".)

Case                                                 | Formula
1. Probability of occurrence of at least A or B      |
   (a) when events are mutually exclusive            | P(A ∪ B) = P(A) + P(B)
   (b) when events are not mutually exclusive        | P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
2. Probability of occurrence of both A and B         | P(A ∩ B) = P(A) + P(B) − P(A ∪ B)
3. Probability of occurrence of A and not B          | P(A ∩ B') = P(A) − P(A ∩ B)
4. Probability of occurrence of B and not A          | P(A' ∩ B) = P(B) − P(A ∩ B)
5. Probability of non-occurrence of both A and B     | P(A' ∩ B') = 1 − P(A ∪ B)
6. Probability of non-occurrence of at least A or B  | P(A' ∪ B') = 1 − P(A ∩ B)
How to calculate probability in the case of independent events
(A' denotes "not A" and B' denotes "not B".)

Case                                                   | Formula
1. Probability of occurrence of both A and B           | P(A ∩ B) = P(A) × P(B)
2. Probability of non-occurrence of both A and B       | P(A' ∩ B') = P(A') × P(B')
3. Probability of occurrence of A and not B            | P(A ∩ B') = P(A) × P(B')
4. Probability of occurrence of B and not A            | P(A' ∩ B) = P(A') × P(B)
5. Probability of occurrence of at least one event     | P(A ∪ B) = 1 − P(A' ∩ B') = 1 − [P(A') × P(B')]
6. Probability of non-occurrence of at least one event | P(A' ∪ B') = 1 − P(A ∩ B) = 1 − [P(A) × P(B)]
7. Probability of occurrence of only one event         | P(A ∩ B') + P(A' ∩ B) = [P(A) × P(B')] + [P(A') × P(B)]
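A small numerical sanity check of the independent-event formulas (the marginal values 6/10 and 3/10 are arbitrary assumptions chosen here for illustration):

```python
from fractions import Fraction

p_a, p_b = Fraction(6, 10), Fraction(3, 10)   # assumed marginals of independent events A and B
p_na, p_nb = 1 - p_a, 1 - p_b                 # P(A'), P(B')

both       = p_a * p_b              # P(A ∩ B)            = 9/50  (0.18)
neither    = p_na * p_nb            # P(A' ∩ B')          = 7/25  (0.28)
a_not_b    = p_a * p_nb             # P(A ∩ B')           = 21/50 (0.42)
at_least_1 = 1 - neither            # P(A ∪ B)            = 18/25 (0.72)
only_one   = a_not_b + p_na * p_b   # exactly one occurs  = 27/50 (0.54)

print(both, neither, a_not_b, at_least_1, only_one)
```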
Problem
An inspector of the Alaska Pipeline has the task of comparing the reliability of 2 pumping stations. Each station is susceptible to 2 kinds of failure: pump failure and leakage. When either (or both) occur, the station must be shut down. The data at hand indicate that the following probabilities prevail:

Station | P(Pump failure) | P(Leakage) | P(Both)
   1    |      0.07       |    0.10    |   0
   2    |      0.09       |    0.12    |   0.06

Which station has the higher probability of being shut down?
Solution
P(Pump failure or Leakage)
 = P(Pump failure) + P(Leakage) − P(Pump failure ∩ Leakage)

Station 1: 0.07 + 0.10 − 0 = 0.17

Station 2: 0.09 + 0.12 − 0.06 = 0.15

Thus, station 1 has the higher probability of being shut down.
Probabilities under conditions of Statistical
Independence
Statistically Independent Events: - The occurrence
of one event has no effect on the probability of the
occurrence of any other event.

Most managers who use probabilities are
concerned with 2 conditions.
1. The case where one event or another will occur.
2. The situation where 2 or more events will both occur.
There are 3 types of probabilities under statistical
independence.
Marginal
Joint
Conditional

Marginal/ Unconditional Probability: - A single probability
where only one event can take place.

Joint probability: - Probability of 2 or more events occurring
together or in succession.

Conditional probability: - Probability that a second event
(B) will occur if a first event (A) has already happened.


Example: Marginal Probability under Statistical Independence
A marginal probability is a single probability where only one event can take place:

P(A) = P(A)

Example 1: on each individual toss of a biased or unfair coin, P(H) = 0.90 and P(T) = 0.10. The outcomes of several tosses of this coin are statistically independent events too, even though the coin is biased.

Example 2: 50 students of a school drew a lottery to see which student would get a free trip to the Carnival at Goa. Any one of the students can calculate his/her chance of winning as:
P(Winning) = 1/50 = 0.02
Example: Joint Probability under Statistical Independence
The probability of 2 or more independent events occurring together or in succession is the product of their marginal probabilities:

P(AB) = P(A) × P(B)

Example: what is the probability of heads on 2 successive tosses of a fair coin?
P(H1 H2) = P(H1) × P(H2)
         = 0.5 × 0.5 = 0.25
The probability of heads on 2 successive tosses is 0.25, since the probability of any outcome is not affected by any preceding outcome.
We can make the probabilities of events even more explicit using a probability tree.

[Probability tree for three tosses of a fair coin: after the first toss, the branches H1 and T1 each have probability 0.5; after the second toss, each of the 4 branches (H1H2, H1T2, T1H2, T1T2) has probability 0.25; after the third toss, each of the 8 branches (H1H2H3, …, T1T2T3) has probability 0.125.]
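The tree can be reproduced programmatically; this sketch (my own, assuming a fair coin as in the slide) enumerates the 8 equally likely three-toss paths:

```python
from itertools import product

p = {"H": 0.5, "T": 0.5}   # fair coin

for path in product("HT", repeat=3):
    prob = p[path[0]] * p[path[1]] * p[path[2]]   # independent tosses: probabilities multiply
    print("".join(path), prob)                    # each of the 8 paths has probability 0.125
```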
Example: Conditional Probability under Statistical Independence
For statistically independent events, the conditional probability of event B given that event A has occurred is simply the probability of event B:

P(B|A) = P(B)

Example: what is the probability that the second toss of a fair coin will result in heads, given that heads resulted on the first toss?
P(H2|H1) = 0.5
For 2 independent events, the result of the first toss has absolutely no effect on the result of the second toss.
Probabilities under conditions of Statistical
Dependence
Statistical Dependence exists when the probability of
some event is dependent on or affected by the
occurrence of some other event.

The types of probabilities under statistical dependence
are:
Marginal
Joint
Conditional
Example
Assume that a box contains 10 balls distributed as follows:

3 are colored & dotted
1 is colored & striped
2 are gray & dotted
4 are gray & striped

Each ball is equally likely to be drawn, so each of the 10 elementary events has probability 0.1:

Ball(s)     | Probability | Category
1, 2, 3     | 0.1 each    | Colored & dotted
4           | 0.1         | Colored & striped
5, 6        | 0.1 each    | Gray & dotted
7, 8, 9, 10 | 0.1 each    | Gray & striped
Example: Marginal Probability under Statistical Dependence
The marginal probability of a simple event can be computed by summing the probabilities of all the joint events in which the simple event occurs.

Compute the marginal probability of the event "colored".

It is obtained by summing the probabilities of the two joint events in which "colored" occurs:

P(C) = P(CD) + P(CS)
     = 0.3 + 0.1
     = 0.4
Example: Joint Probability under Statistical Dependence
The joint probability under conditions of statistical dependence is given by:

P(B ∩ A) = P(B|A) × P(A)

What is the probability that a ball drawn from the box is dotted and colored?
Probability of colored & dotted balls:
P(DC) = P(D|C) × P(C)
      = (0.3/0.4) × 0.4
      = 0.3
Example: Conditional Probability under Statistical Dependence
Given A and B to be 2 events, the conditional probability for statistically dependent events is:

            P(B ∩ A)
P(B|A) = ------------
              P(A)

What is the probability that a ball drawn from the box is dotted, given that it is colored?

The probability of drawing any one ball from this box is 0.1 (1/10), since the total number of balls in the box is 10.
We know that there are 4 colored balls, 3 of which are dotted and 1 of which is striped.

            P(DC)     0.3
P(D|C) = --------- = ----- = 0.75
            P(C)      0.4

P(DC) = probability of a ball being colored & dotted (3 out of 10 = 3/10)
P(C)  = probability of a ball being colored (4 out of 10 = 4/10)
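The whole marginal/joint/conditional chain for this box can be checked with a few lines of Python (an illustrative sketch using the slide's own counts):

```python
from fractions import Fraction

# The box of 10 balls from the example: (colour, pattern)
balls = ([("colored", "dotted")] * 3 + [("colored", "striped")] * 1
         + [("gray", "dotted")] * 2 + [("gray", "striped")] * 4)
n = len(balls)

p_colored = Fraction(sum(c == "colored" for c, _ in balls), n)           # marginal P(C) = 2/5
p_col_dot = Fraction(sum(b == ("colored", "dotted") for b in balls), n)  # joint P(DC) = 3/10
p_dot_given_col = p_col_dot / p_colored                                  # conditional P(D|C) = 3/4

print(p_colored, p_col_dot, p_dot_given_col)
```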
Revising Prior Estimates of Probabilities: Bayes' Theorem
A very important and useful application of conditional probability is the computation of unknown probabilities based on past data or information.

When an event occurs through one of various mutually disjoint events, the conditional probability that it has occurred due to a particular reason or event is termed the inverse probability or posterior probability.

Bayes' theorem has wide-ranging applications in business and its management.
Since it is a concept of revision of probability based on additional information, it shows the improvement towards the certainty level of the event.

Example: if the manager of a boutique finds that most of the purple and white jackets that she thought would sell so well are still hanging on the rack, she must revise her prior probabilities and order a different color combination or have a sale.

Probabilities that are altered after people get additional information are known as revised, or posterior, probabilities.
Bayes' Theorem
If an event A can occur only in conjunction with n mutually exclusive and exhaustive events B1, B2, …, Bn, and if A actually happens, then, provided the conditional probabilities P(A|B1), P(A|B2), …, P(A|Bn) and the marginal probabilities P(Bi) are known, the posterior probability of event Bi given that event A has occurred is:

                   P(A|Bi) · P(Bi)
P(Bi|A) = -----------------------------------
           Σ (i = 1 to n) P(A|Bi) · P(Bi)
Remarks:

The probabilities P(B1), P(B2), …, P(Bn) are termed the a priori probabilities because they exist before we gain any information from the experiment itself.

The probabilities P(A|Bi), i = 1, 2, …, n, are called likelihoods because they indicate how likely the event A under consideration is to occur, given each a priori probability.

The probabilities P(Bi|A), i = 1, 2, …, n, are called posterior probabilities because they are determined after the results of the experiment are known.
Problem
In a bolt factory, machines A, B and C manufacture respectively 25%, 35% and 40% of the total output. Of their output, 5%, 4% and 2% respectively are defective bolts. A bolt is drawn at random from the product and is found to be defective.

What are the probabilities that it was manufactured by machines A, B and C?
Solution
Let E1, E2, E3 denote the events that a bolt is manufactured by machines A, B and C respectively.
Let E denote the event of the bolt being defective.
P(E1) = 0.25;  P(E2) = 0.35;  P(E3) = 0.40
The probability of drawing a defective bolt manufactured by machine A is P(E|E1) = 0.05.
Similarly, P(E|E2) = 0.04 and P(E|E3) = 0.02.
The probability that a defective bolt selected at random was manufactured by machine A is given by:
                  P(E1) · P(E|E1)
P(E1|E) = ------------------------------------
           Σ (i = 1 to 3) P(Ei) · P(E|Ei)

                     0.25 × 0.05
        = ------------------------------------------
           0.25×0.05 + 0.35×0.04 + 0.40×0.02

        = 25/69

Similarly,
P(E2|E) = (0.35×0.04) / (0.25×0.05 + 0.35×0.04 + 0.40×0.02) = 28/69
P(E3|E) = (0.40×0.02) / (0.25×0.05 + 0.35×0.04 + 0.40×0.02) = 16/69
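As a sketch (illustrative only; the helper function is my own, not from the slides), the same posterior probabilities can be computed programmatically:

```python
from fractions import Fraction

def bayes(priors, likelihoods):
    """Posterior P(B_i | A) from priors P(B_i) and likelihoods P(A | B_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)                     # P(A), by the rule of total probability
    return [j / total for j in joint]

priors = [Fraction(25, 100), Fraction(35, 100), Fraction(40, 100)]    # machines A, B, C
likelihoods = [Fraction(5, 100), Fraction(4, 100), Fraction(2, 100)]  # their defect rates

print([str(p) for p in bayes(priors, likelihoods)])   # ['25/69', '28/69', '16/69']
```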
Glossary of terms
Classical Probability: It is based on the idea that certain
occurrences are equally likely.
Example: - Numbers 1, 2, 3, 4, 5, & 6 on a fair die are each
equally likely to occur.
Conditional Probability: The probability that an event
occurs given the outcome of some other event.
Independent Events: Events are independent if the
occurrence of one event does not affect the occurrence of
another event.
Joint Probability: Is the likelihood that 2 or more events will
happen at the same time.
Multiplication Formula: If there are m ways of doing one
thing and n ways of doing another thing, there are m x n ways
of doing both.


Mutually exclusive events: A property of a set of categories
such that an individual, object, or measurement is included in
only one category.
Objective Probability: It is based on symmetry of games of
chance or similar situations.
Outcome: Observation or measurement of an experiment.
Posterior Probability: A revised probability based on
additional information.
Prior Probability: The initial probability based on the present
level of information.
Probability: A value between 0 and 1, inclusive, describing the
relative possibility (chance or likelihood) an event will occur.
Subjective Probability: Synonym for personal probability.
Involves personal judgment, information, intuition, & other
subjective evaluation criteria.
Example: a physician assessing the probability of a patient's recovery is making a personal judgment based on what they know and feel about the situation.
