
EN 53001 Communication Theory

3. Random processes and noise

3.1 Probability

3.1 Probability - Review

Topics: Definitions, Axioms, Joint and Conditional Probability, Total Probability, Bayes' theorem, Independent Events, Combined Experiments, Bernoulli Trials

Probability Introduced through Sets and Relative Frequency

Probability is a measure of the expectation that an event will occur or that a statement is true. Probabilities are given a value between 0 (will not occur) and 1 (will certainly occur). The higher the probability of an event, the more certain we are that the event will occur.

Probability is defined in two ways:

  • 1. Set theory and fundamental axioms

  • 2. Relative frequency, based on common sense and engineering/scientific observations

From a physical experiment, a mathematical model is developed. A single experiment is called a trial for which there is an outcome.

Rather than a fully precise mathematical procedure, the simplified approach to defining an experiment is to use reasoning and examples.

Example 3.1-1: Tossing a die and observing the number that shows up.

If the die is unbiased, the likelihood of any number occurring is 1/6 (this is called the probability of the outcome).

The set of all possible outcomes in any experiment is called the sample space S.

There are three components in forming the mathematical model of random experiments:

  • 1. Sample space (discrete and continuous): Set of all possible outcomes of a trial

  • 2. Events: An event is defined as a subset of the sample space. A sample space (set) with N elements can have as many as 2^N events (subsets)

  • 3. Probability: Probability is a nonnegative number assigned to each event on a sample space. It is a function of the event: P(A) or P[A] denotes the probability of event A

Axiom 1: P(A) ≥ 0

Axiom 2: P(S) = 1; S is therefore a certain event, and the null set ∅, an event with no elements, is the impossible event, with probability 0.

Axiom 3: For N events A_n, n = 1, 2, …, N (where N could be infinite), defined on a sample space S, with

A_m ∩ A_n = ∅ for all m ≠ n,

P(⋃_{n=1}^{N} A_n) = Σ_{n=1}^{N} P(A_n)

The probability of an event that is the union of any number of mutually exclusive events is equal to the sum of the individual event probabilities.

Mathematical Model of Random Experiments

The mathematical model of experiments can be formed in the following three steps:

  • 1. Assignment of a sample space

  • 2. Definition of events of interest

  • 3. Making probability assignments to the events such as to satisfy the axioms

Example 3.1-2: An experiment consists of observing the sum of the numbers showing up when two dice are thrown. Develop a model for this experiment.


Figure 3.1-1: Sample space applicable to Example 3.1-2.
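The three modeling steps for Example 3.1-2 can be sketched numerically (a minimal illustration, not taken from the text): enumerate the 36 equally likely ordered pairs as the sample space, take the events "the sum equals s", and assign each outcome probability 1/36.

```python
from fractions import Fraction
from collections import Counter

# Step 1: sample space -- all 36 equally likely (d1, d2) pairs
sample_space = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

# Step 2: events of interest -- "the sum equals s" for s = 2..12
# Step 3: probability assignment -- each pair has probability 1/36,
# so P("sum = s") is (number of pairs with that sum) / 36
counts = Counter(d1 + d2 for d1, d2 in sample_space)
probs = {s: Fraction(counts[s], 36) for s in sorted(counts)}

for s, p in probs.items():
    print(s, p)

# The event probabilities over the partition sum to 1, as Axiom 2 requires
assert sum(probs.values()) == 1
```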


Probability as a Relative Frequency

The second way of defining probability uses the relative frequency of occurrence of events, based on common sense and engineering/scientific observations.

For example, in flipping a fair coin, if heads (H) shows up n_H times out of n flips, the limit

lim_{n→∞} n_H / n = P(H)

can be interpreted as the probability of the event H. If the coin is fair, this limit is 1/2, i.e., P(H) = 1/2.
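The relative-frequency interpretation can be checked by simulation (a sketch with an arbitrary seed and flip count): for a fair coin, n_H / n settles near 1/2 as n grows.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Flip a fair coin n times and track the relative frequency n_H / n
n = 100_000
n_heads = sum(random.random() < 0.5 for _ in range(n))
rel_freq = n_heads / n
print(rel_freq)  # close to 0.5 for large n
```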

Joint and Conditional Probability

In some experiments, some events are not mutually exclusive because they have common elements (see Example 3.0.1). For two events A and B, the common elements form the event A ∩ B. The probability P(A ∩ B) is called the joint probability of the two events A and B, which intersect in the sample space.

It can be shown using a Venn diagram that:

P(A ∪ B) = P(A) + P(B) − P(A ∩ B) ≤ P(A) + P(B)

The probability of the union of two events never exceeds the sum of the individual event probabilities. Equality holds only for mutually exclusive events, because then A ∩ B = ∅ and therefore P(A ∩ B) = P(∅) = 0.

For an event B which has nonzero probability, the conditional probability of an event A, given B, is:

P(A|B) = P(A ∩ B) / P(B)

For mutually exclusive events A and B, A ∩ B = ∅, and therefore P(A|B) = 0.
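The definition P(A|B) = P(A ∩ B)/P(B) can be made concrete on a single fair die (an illustrative example, not from the text): with A = "number greater than 3" and B = "number is even", conditioning on B shrinks the sample space to {2, 4, 6}.

```python
from fractions import Fraction

# Single fair-die sample space; each outcome has probability 1/6
S = {1, 2, 3, 4, 5, 6}
def P(event):
    return Fraction(len(event), len(S))

A = {4, 5, 6}          # "number greater than 3"
B = {2, 4, 6}          # "number is even"

# Joint probability P(A ∩ B) and conditional probability P(A | B)
P_joint = P(A & B)              # A ∩ B = {4, 6}
P_A_given_B = P_joint / P(B)
print(P_joint, P_A_given_B)     # 1/3 and 2/3
```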

Total Probability

Suppose we are given N mutually exclusive events B_n, n = 1, 2, …, N, whose union equals S:

B_m ∩ B_n = ∅,  m ≠ n,  m, n = 1, 2, …, N

⋃_{n=1}^{N} B_n = S

The total probability of an event A defined on S,

P(A) = Σ_{n=1}^{N} P(A|B_n) P(B_n)

is proven using Figure 3.1-2.


Figure 3.1-2 Venn diagram of N mutually exclusive events B n and another event A.
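The total probability formula can be verified on a small example (an illustrative partition of a fair-die sample space, not from the text): summing P(A|B_n) P(B_n) over the partition recovers P(A).

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
def P(event):
    return Fraction(len(event), len(S))

# A partition of S: the B_n are mutually exclusive and their union is S
B = [{1, 2}, {3, 4}, {5, 6}]
A = {1, 2, 3}

# Total probability: P(A) = sum over n of P(A | B_n) * P(B_n)
total = sum((P(A & Bn) / P(Bn)) * P(Bn) for Bn in B)
print(total, P(A))   # both 1/2
assert total == P(A)
```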

Bayes’ theorem

Bayes’ theorem is given in equations (1.4-15) and (1.4-16) in your text. See Examples 1.4-2, 1.4-3, and 1.4-4.

If P(A) ≠ 0:

P(B_n|A) = P(B_n ∩ A) / P(A)

If P(B_n) ≠ 0:

P(A|B_n) = P(A ∩ B_n) / P(B_n)

Equating the two expressions for the joint probability P(A ∩ B_n), we get one form of Bayes' theorem:

P(B_n|A) = P(A|B_n) P(B_n) / P(A)

Another form is obtained using the concept of total probability to expand P(A):

P(B_n|A) = P(A|B_n) P(B_n) / [Σ_{k=1}^{N} P(A|B_k) P(B_k)]

Example 3.1-3: An elementary binary communication system consists of a transmitter that sends one of two possible symbols (a 1 or a 0) over a channel to a receiver. The channel occasionally causes errors, so that a 1 shows up at the receiver as a 0 and vice versa.

We denote the events B_1 = "the symbol before the channel is 1" and B_2 = "the symbol before the channel is 0". The a priori probabilities P(B_1) and P(B_2) are given.

The events are A_1 = "the symbol after the channel is 1" and A_2 = "the symbol after the channel is 0". Conditional probabilities describing the effect of the channel are given.

Find P(A_1) and P(A_2), the total probabilities of receiving a 1 and a 0. Also find the a posteriori probabilities P(B_1|A_1), P(B_1|A_2), P(B_2|A_1), and P(B_2|A_2).

Figure 3.1-3: Binary symmetric communication system, diagrammatic model.

The last two numbers are probabilities of system error, while P(B_1|A_1) and P(B_2|A_2) are probabilities of correct system transmission of symbols.
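Since the slide's numerical priors and channel transition probabilities appear only in Figure 3.1-3 and are not reproduced in the text, the computation can be sketched with assumed illustrative values (P(B_1) = 0.6, P(B_2) = 0.4, crossover probability 0.1; these numbers are placeholders, not the slide's):

```python
# Assumed, illustrative numbers for a binary symmetric channel
P_B = {1: 0.6, 2: 0.4}                 # priors: P(B1) "1 sent", P(B2) "0 sent"
P_A_given_B = {                        # P(A_j | B_i) with (received j, sent i);
    (1, 1): 0.9, (1, 2): 0.1,          # assumed crossover probability 0.1
    (2, 1): 0.1, (2, 2): 0.9,
}

# Total probabilities of receiving a 1 (A1) and a 0 (A2)
P_A = {j: sum(P_A_given_B[(j, i)] * P_B[i] for i in (1, 2)) for j in (1, 2)}

# A posteriori probabilities P(B_i | A_j) via Bayes' theorem
P_B_given_A = {
    (i, j): P_A_given_B[(j, i)] * P_B[i] / P_A[j]
    for i in (1, 2) for j in (1, 2)
}
print(P_A)
print(P_B_given_A)
```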

Independent Events

Two events A and B are statistically independent if the probability of occurrence of one event is not affected by the occurrence of the other event. This statement can be written as:

P(A|B) = P(A) or P(B|A) = P(B)

This also means that the probability of the joint occurrence of the two events must equal the product of the two event probabilities:

P(A ∩ B) = P(A) P(B)

This condition is used as a test of independence. It is the necessary and sufficient condition for A and B to be independent.

For mutually exclusive events, the joint probability is P(A ∩ B) = 0; therefore two events with nonzero probabilities cannot be both mutually exclusive and statistically independent. For the two events to be independent, they must have an intersection: A ∩ B ≠ ∅.

When more than two events are involved, say three events A, B, and C, the following four equations must all be satisfied for the events to be independent:

  • 1. P(A ∩ B) = P(A) P(B)

  • 2. P(B ∩ C) = P(B) P(C)

  • 3. P(A ∩ C) = P(A) P(C)

  • 4. P(A ∩ B ∩ C) = P(A) P(B) P(C)
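The product test P(A ∩ B) = P(A) P(B) can be applied by brute force on a small sample space (a sketch; the events chosen here are illustrative, not from the text):

```python
from fractions import Fraction

# Two-dice sample space, all 36 outcomes equally likely
S = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
def P(event):
    return Fraction(sum(1 for w in S if event(w)), len(S))

def independent(A, B):
    # Necessary and sufficient test: P(A ∩ B) == P(A) P(B)
    return P(lambda w: A(w) and B(w)) == P(A) * P(B)

A = lambda w: w[0] % 2 == 0       # "first die is even"
B = lambda w: w[0] + w[1] == 7    # "sum is 7"
C = lambda w: w[0] + w[1] <= 3    # "sum is at most 3"

print(independent(A, B))   # True:  P(A ∩ B) = 3/36 = (1/2)(1/6)
print(independent(A, C))   # False: P(A ∩ C) = 1/36 ≠ (1/2)(1/12)
```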

Combined Experiments

A combined experiment consists of forming a single experiment by combining individual experiments, called subexperiments. In this regard, we will have a combined sample space, events on the combined space, and probabilities.

Figure 3.1-4: A combined sample space for two subexperiments.

Permutations and Combinations

Permutations arise when experiments involve multiple trials, the outcomes are elements of a finite sample space, and elements are not replaced after each trial. There are nPr = n! / (n − r)! permutations of r objects selected, with order, from a total of n objects.

If the order of elements in a sequence is not important, the possible sequences of r elements taken from n elements without replacement are called combinations. There are nCr = n! / [(n − r)! r!] combinations in selecting r objects from a total of n.
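Both counts are available directly in Python's standard library, which makes the factorial formulas easy to check (a small sketch with n = 5, r = 3):

```python
from math import perm, comb, factorial

n, r = 5, 3
# Ordered selections (permutations): nPr = n! / (n - r)!
print(perm(n, r))          # 60
# Unordered selections (combinations): nCr = n! / [(n - r)! r!]
print(comb(n, r))          # 10

# Cross-check against the factorial formulas from the text
assert perm(n, r) == factorial(n) // factorial(n - r)
assert comb(n, r) == factorial(n) // (factorial(n - r) * factorial(r))
```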

Bernoulli Trials

Any experiment for which there are only two possible outcomes in any trial (e.g., pass or fail, true or false, 1 or 0) and which is repeated is called a sequence of Bernoulli trials.


Example 3.1-4: (Pulse Code Modulation) PCM Repeater Error Probability

In PCM, regenerative repeaters are used to detect pulses corrupted by noise and retransmit new, clean pulses.

[Diagram: n links in tandem (1st link, 2nd link, ..., k-th link, ..., n-th link) between the channel input (In) and output (Out).]

If P_e is the probability of error in detecting a pulse over one link, show that P_E, the probability of error in detecting a pulse over the entire channel (n links in tandem), is P_E ≈ n P_e when n P_e << 1.

Solution:

Probability of detecting a pulse correctly over one link = (1 − P_e). Probability of detecting a pulse correctly over the entire channel (n links) = (1 − P_E).

A pulse can be detected correctly over the entire channel if either the pulse is detected correctly over every link or errors are made over an even number of links only.

PCM Repeater Error Probability (continued)

Probability of correctly detecting a pulse over the entire channel:

1 − P_E = P(correct detection over all links) + P(erroneous detection over two links only) + P(erroneous detection over four links only) + … + P(erroneous detection over 2⌊n/2⌋ links only)

where ⌊a⌋ denotes the largest integer less than or equal to a.

1 − P_E = (1 − P_e)^n + Σ_{k=2,4,6,…} [n! / (k! (n − k)!)] P_e^k (1 − P_e)^{n−k}

≈ (1 − P_e)^n,  since P_e << 1

(the k = 2 term, [n! / (2! (n − 2)!)] P_e^2 (1 − P_e)^{n−2} = [n(n − 1)/2] P_e^2 (1 − P_e)^{n−2}, and all higher even-order terms are negligible)

Also,

(1 − P_e)^n ≈ 1 − n P_e,  if n P_e << 1

and therefore

P_E ≈ n P_e

(Intuitive explanation ? )
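Intuitively, when P_e << 1 the dominant end-to-end error event is a single-link error, and there are n links where it can occur, which already suggests P_E ≈ n P_e. The approximation can be checked against the exact even-error-count expansion (a sketch with illustrative values of n and P_e):

```python
from math import comb

def exact_P_E(n, Pe):
    # Correct end-to-end detection happens when errors occur over an even
    # number (including zero) of the n links, so sum the binomial terms
    # for k = 0, 2, 4, ... and subtract from 1.
    correct = sum(comb(n, k) * Pe**k * (1 - Pe)**(n - k)
                  for k in range(0, n + 1, 2))
    return 1 - correct

n, Pe = 10, 1e-4
print(exact_P_E(n, Pe), n * Pe)   # nearly identical when n*Pe << 1
```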