
EE 422: LECTURE NOTE 1

Communication Systems II
Outline:

1. Syllabus

2. Topics covered

3. Review of Probability and random processes



EE 422: LECTURE NOTE 1

• Instructor: Dr. Yao Ma, Coover 3107, Phone: 515-294-8382, Email: mayao@iastate.edu.

• Course Homepage: http://clue.eng.iastate.edu/∼mayao/EE422/EE422.html

• Class Time and Location: TU/TR 4:10-5:30 pm, Location TBA.

• Instructor Office hours: TU/TR 2-3 pm

• Prerequisite:
− EE 322 Probabilistic Methods for Electrical Engineers, and
− Enrollment in EE423 Lab.

• Textbook: B. P. Lathi, Modern Digital and Analog Communication Systems, 3rd Edition, Oxford University Press, 1998.



• Additional reading:
1. Andrea Goldsmith, Wireless Communications, 1st edition, Cambridge
University Press, 2005.
2. R. E. Ziemer and W. H. Tranter, Principles of Communications: Systems, Modulation, and Noise, 5th Edition, John Wiley & Sons, 2002.
3. John G. Proakis, Digital Communications, 4th Edition, McGraw–Hill.
4. P. M. Shanker, Introduction to Wireless Systems, John Wiley & Sons,
2001.

• Grading: (tentative)
Homework + project 30%
Quizzes 20%
Midterm 20%
Final Exam 30%



TOPICS COVERED

• Introduction to probability and random processes;
• Performance of analog systems with noise;
• Performance of digital communication with noise: optimum receivers, transmission impairments, and error rates;
• Optimum signal detection;
• Introduction to information theory and coding: source coding, channel coding, capacity for wireless channels;
• Fading channel models;
• Diversity and adaptive modulation.



COURSE OUTLINE

1. Basic Concepts of Probability. Random variables and their statistics. Random processes, power spectral density, linear systems, and optimal filtering.
2. Behavior of analog systems in noise. Baseband systems. Analog
modulations: amplitude, angle, and pulse modulation.
3. Behavior of digital communication systems in noise. Modulation formats:
ASK, FSK, PSK, and M-ary modulation. Spread spectrum.
4. Signal detection. Signal space and signal sets. Performance criterion.
Optimal receiver. Colored noise.
5. Information theory: source encoding, channel capacity for discrete
memoryless channel and continuous channel.



6. Error correction coding: linear block codes, convolutional codes.
7. Statistical wireless channel model.
8. Capacity for wireless channels.
9. Diversity and multiple antennas.



COURSE OBJECTIVES

1. Grasp the key concepts of transmission and detection for digital and analog communication systems in noise, and the necessary mathematical tools to understand their behavior.
• Probability, random processes, and statistics.
• Error performance of analog and digital communication in noise.
• Optimal signal detection, elements of information theory and coding.

2. Model some communication and signal processing applications in mathematical language, and solve related problems.

3. Use Matlab to simulate the performance of a communication system, e.g., multiuser detection for CDMA.



INTRODUCTION TO PROBABILITY
OUTLINE

• Concept of Probability
− Conditional Probability and Independence
− Bernoulli Trials

• Random Variables
− Continuous and Discrete Random Variables
− PDF and CDF
− Some Classical Distributions

Reading: B. P. Lathi 10.1 - 10.2



CONCEPT OF PROBABILITY

• The theory of probability deals with random phenomena and processes. The probability of an event A can be approximated by the relative frequency P(A) ≈ nA/n.

• Observation, deduction and prediction.


Example:
Observation: a die is rolled 36 times; each number {1, 2, . . . , 6} appears approximately 6 times.
Deduction: the chance that number 1 appears is 1/6.
Prediction: if we roll the die 30 times, number 1 will appear about 5 times.



METHODS FOR DEFINING A PROBABILITY
Relative frequency definition

• P(A) is defined by

P(A) = lim_{n→∞} nA/n,        (1)

where nA is the number of occurrences of A and n is the number of trials.
Example: in a ten-year period, the number of rainy days in our town is 400. Thus the average chance of rain is 400/3652 ≈ 11%.

• Criticisms: this definition is hard to use for actual measurement, and it is not convenient for deductive analysis.
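The relative-frequency definition can be checked with a short simulation. The sketch below is illustrative only (Python is used here for convenience; the course itself uses Matlab): it rolls a fair die n times, counts how often the event A = {die shows 1} occurs, and compares nA/n with the true probability 1/6 as n grows.

```python
# Minimal sketch (illustrative Python; the course uses Matlab): estimate
# P(A) for A = {die shows 1} by the relative frequency nA/n of formula (1).
import random

random.seed(0)
for n in (36, 360, 3600, 36000):
    n_A = sum(1 for _ in range(n) if random.randint(1, 6) == 1)
    print(f"n = {n:6d}: nA/n = {n_A / n:.4f}   (true P(A) = {1 / 6:.4f})")
```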



AXIOMATIC DEFINITION.

• Union A + B is the event that A or B or both occur;

• Intersection AB is the event that both A and B occur;

• Events A and B are mutually exclusive if the occurrence of one of them


excludes the occurrence of the other.

• The probability P (A) ≥ 0 for any event A;

• P (C) = 1 for the certain event C.

• P (A + B) = P (A) + P (B) if events A and B are mutually exclusive.

⇒ The axiomatic definition is the most useful definition in practice.



CONDITIONAL PROBABILITY

Definition: P(A|B) = P(AB)/P(B), where P(B) ≠ 0.

Example: in a fair-die experiment, if an even number appears, what is the chance that it is number 2?
Solution: Let A = {f2}, and M = {even} = {f2, f4, f6},

⇒P (A) = 1/6 and P (M ) = 1/2.

The chance is given by P (A|M ) = P (AM )/P (M ) = P (A)/P (M ) = 1/3.
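As a quick sanity check, P(A|M) = 1/3 can also be estimated by counting relative frequencies in a simulation; a minimal, illustrative Python sketch:

```python
# Minimal sketch: Monte Carlo estimate of P(A|M) for A = {2} and
# M = {even number} in the fair-die experiment (expected value: 1/3).
import random

random.seed(1)
n, count_M, count_AM = 200_000, 0, 0
for _ in range(n):
    roll = random.randint(1, 6)
    if roll % 2 == 0:       # event M occurred
        count_M += 1
        if roll == 2:       # event AM: even AND equal to 2
            count_AM += 1
print("estimated P(A|M):", count_AM / count_M)
```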



INDEPENDENCE

• Two events A and B are called independent if P (AB) = P (A)P (B).


• If A and B are independent, then
⇒P (B|A) = P (B), P (A|B) = P (A).

Example: using P(AB) = P(A)P(B), we can show that P(ĀB) = P(Ā)P(B).

Proof: B = AB ∪ ĀB, and AB and ĀB are mutually exclusive, thus P(B) = P(AB) + P(ĀB). Hence P(ĀB) = P(B) − P(AB) = P(B) − P(A)P(B) = [1 − P(A)]P(B) = P(Ā)P(B).
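A small numeric check of this identity, assuming arbitrary illustrative values P(A) = 0.3 and P(B) = 0.6 for two independent events:

```python
# Minimal numeric check of P(A-bar B) = P(A-bar)P(B) for independent A and B.
# The probabilities 0.3 and 0.6 are arbitrary illustrative values.
pA, pB = 0.3, 0.6
pAB = pA * pB                 # independence: P(AB) = P(A)P(B)
p_notA_B = pB - pAB           # P(A-bar B) = P(B) - P(AB), as in the proof
print(p_notA_B, (1 - pA) * pB)   # both equal 0.42
```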



INDEPENDENCE (CONT’D)
Example: consider three parallel independent switches A1, A2, and A3. Denote the probability that Ai is working by P(Ai) = pi, i = 1, 2, 3.

Let R denote the event that the signal gets through to the output.

[Figure: input x(t) drives three parallel switches A1, A2, A3, whose outputs combine into y(t).]
• Q1) What is the probability P(R)?
• Q2) Given event R, what is the probability that A1 is working?



INDEPENDENCE (CONT’D)
Solution:

1. P(R) = 1 − P(Ā1Ā2Ā3) = 1 − P(Ā1)P(Ā2)P(Ā3)
= 1 − (1 − p1)(1 − p2)(1 − p3)
= p1 + p2 + p3 − p1p2 − p1p3 − p2p3 + p1p2p3.

2. P(A1|R) = P(RA1)/P(R) = P(R|A1)P(A1)/P(R).
Since P(R|A1) = 1, we get
P(A1|R) = P(A1)/P(R) = p1/[p1 + p2 + p3 − p1p2 − p1p3 − p2p3 + p1p2p3].
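A minimal sketch that evaluates these expressions and verifies them by Monte Carlo simulation; the values p1 = 0.9, p2 = 0.8, p3 = 0.7 are arbitrary illustrative choices:

```python
# Minimal sketch: closed-form P(R) and P(A1|R) for the three parallel
# switches, plus a Monte Carlo check. p1, p2, p3 are arbitrary values.
import random

p1, p2, p3 = 0.9, 0.8, 0.7
P_R = 1 - (1 - p1) * (1 - p2) * (1 - p3)
print("P(R) =", P_R, "  P(A1|R) =", p1 / P_R)

random.seed(2)
n, count_R, count_A1R = 200_000, 0, 0
for _ in range(n):
    a1 = random.random() < p1
    a2 = random.random() < p2
    a3 = random.random() < p3
    if a1 or a2 or a3:        # event R: at least one switch works
        count_R += 1
        count_A1R += a1       # event A1R: A1 works and R occurs
print("simulated:", count_R / n, count_A1R / count_R)
```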



BERNOULLI TRIALS
Consider a succession of independent trials in which event A either happens or does not happen, with P(A) = p and P(Ā) = 1 − p. Then
Pr{k successes in n trials} = C(n,k) p^k (1 − p)^(n−k),

where C(n,k) = n!/(k!(n−k)!) is the binomial coefficient, the number of combinations of choosing k items out of n; p^k (1 − p)^(n−k) is the probability of one particular outcome with k successes and n − k failures.
Example: for a bit string of n bits, the bit errors are independent and identically distributed, each occurring with probability Pe.
⇒Pr{k errors in n bits} = C(n,k) Pe^k (1 − Pe)^(n−k);
Pr{more than k errors in n bits} = Σ_{l=k+1}^{n} C(n,l) Pe^l (1 − Pe)^(n−l).
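These two binomial formulas translate directly into code. A minimal, illustrative Python sketch (the helper names are ad hoc, not from the text):

```python
# Minimal sketch of the two formulas above. The helper names are ad hoc.
from math import comb

def p_k_errors(n: int, k: int, pe: float) -> float:
    """Pr{exactly k errors in n independent bits}."""
    return comb(n, k) * pe**k * (1 - pe) ** (n - k)

def p_more_than_k_errors(n: int, k: int, pe: float) -> float:
    """Pr{more than k errors in n independent bits}."""
    return sum(p_k_errors(n, l, pe) for l in range(k + 1, n + 1))

print(p_k_errors(8, 2, 1e-5))            # about 2.8e-9 (cf. Example 10.6 below)
print(p_more_than_k_errors(8, 2, 1e-5))  # tail probability beyond 2 errors
```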



BERNOULLI TRIALS (CONT’D)
Example 10.6

A binary symmetric channel (BSC) has an error probability Pe.


P (0|1) = P (1|0) = Pe, and P (0|0) = P (1|1) = 1 − Pe, where P (y|x) is
the conditional probability of receiving y when x is transmitted.

Find the chance of two bit errors in 8 received bits, with Pe = 10^−5.


Solution:
Pr{2 errors in 8 bits} = C(8,2) Pe^2 (1 − Pe)^6 ≃ (8!/(2!6!)) · 10^−10 = 2.8 · 10^−9.



BERNOULLI TRIALS (CONT’D)
Example 10.8: In a binary communication channel, Pe is the error probability of one digit. Suppose we transmit three digits, 000 for message 0 or 111 for message 1 (length-3 repetition coding). What is the probability of a wrong decision, P(ε)?
Solution:
Message 0 is decoded wrongly if two or more digits of 000 are decoded wrongly.
P(ε) = Σ_{k=2}^{3} C(3,k) Pe^k (1 − Pe)^(3−k) = 3Pe^2(1 − Pe) + Pe^3 = 3Pe^2 − 2Pe^3.

If Pe ≪ 1, then P(ε) ≃ 3Pe^2.

If Pe = 10^−3, then P(ε) ≃ 3 × 10^−6.

This coding method has a code rate of only 1/3, so it is not very efficient.
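A short sketch that evaluates the exact P(ε) for the length-3 repetition code and compares it with the small-Pe approximation 3Pe^2; the Pe values are arbitrary illustrative choices:

```python
# Minimal sketch: exact P(epsilon) for the length-3 repetition code versus
# the small-Pe approximation 3*Pe^2. The Pe values are arbitrary.
from math import comb

def repetition3_error(pe: float) -> float:
    """Pr{2 or 3 of the 3 repeated digits are received in error}."""
    return sum(comb(3, k) * pe**k * (1 - pe) ** (3 - k) for k in (2, 3))

for pe in (1e-1, 1e-2, 1e-3):
    print(f"Pe = {pe:g}: exact = {repetition3_error(pe):.3e}, "
          f"approx = {3 * pe**2:.3e}")
```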



TOTAL PROBABILITY AND BAYES’ THEOREM

If U = {A1, . . . , An} is a partition of the sample space S (i.e., P(S) = 1) and B is an arbitrary event, then P(B) = P(B|A1)P(A1) + · · · + P(B|An)P(An).
Proof: P (B) = P (BS) = P [B(A1 ∪ A2 ∪ · · · ∪ An)],

⇒P (B) = P (BA1) + · · · + P (BAn)


= P (B|A1)P (A1) + · · · + P (B|An)P (An).

This result says that P(B), the probability of B, may be evaluated by conditioning on the events of a partition of S. In some cases this makes the evaluation of P(B) easier.



TOTAL PROBABILITY AND BAYES’ THEOREM (2)
Bayes’ Theorem

From the definition P (Ai|B) = P (AiB)/P (B) = P (B|Ai)P (Ai)/P (B),


we obtain

P(Ai|B) = P(BAi)/P(B) = P(B|Ai)P(Ai)/P(B) = P(B|Ai)P(Ai) / Σ_{j=1}^{n} P(B|Aj)P(Aj).

Note that the denominator of the last term, Σ_{j=1}^{n} P(B|Aj)P(Aj), is equal to P(B) by the total probability theorem.



EXAMPLE

In a binary channel, the conditional error probabilities are P(0|1) = Pe,1 and P(1|0) = Pe,2. The probabilities of transmitting digits 0 and 1 are P0 and P1, respectively.

1. Find the average error probability P̄e

2. Find the probability that "0" was sent given that "1" was detected.

Solution:

(1) Using the total probability theorem,

P̄e = P0 P(1|0) + P1 P(0|1) = P0 Pe,2 + P1 Pe,1.



EXAMPLE (CONT’D)

(2) Using Bayes’ theorem,

Pr{0 was sent | 1 was detected} = Pr{0 was sent and 1 was detected} / Pr{1 was detected}
= P(1|0)P(0) / [P(1|0)P(0) + P(1|1)P(1)]
= Pe,2 P0 / [Pe,2 P0 + (1 − Pe,1) P1].

Note that P (1|1) = 1 − P (0|1) and P (1|0) = 1 − P (0|0).
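Both results can be evaluated numerically. The sketch below assumes arbitrary illustrative values for P0, P1, Pe,1, and Pe,2:

```python
# Minimal sketch: average error probability (total probability theorem) and
# the posterior Pr{0 sent | 1 detected} (Bayes' theorem) for this channel.
# P0, P1, Pe1, Pe2 are arbitrary illustrative values.
P0, P1 = 0.6, 0.4            # probabilities of transmitting 0 and 1
Pe1, Pe2 = 1e-3, 2e-3        # P(0|1) = Pe1, P(1|0) = Pe2

avg_error = P0 * Pe2 + P1 * Pe1
posterior = (Pe2 * P0) / (Pe2 * P0 + (1 - Pe1) * P1)
print("average error probability:", avg_error)
print("Pr{0 sent | 1 detected}:", posterior)
```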



CONCEPT OF RANDOM VARIABLES

• Definition of a random variable (RV): an RV X is an assignment of a number X(ξ) to every outcome ξ.

Example:
For the die experiment, define an RV X(fi) = 10i, where i is the number that shows up. Thus X(f1) = 10, X(f2) = 20, . . . , X(f6) = 60.

The set {X ≤ 35} consists of elements f1, f2, f3 only. The set {15 ≤ X ≤
60} consists of f2, · · · , f6.



CUMULATIVE DISTRIBUTION FUNCTION

1. Definition: the cumulative distribution function (CDF) of the RV X is
FX(x) = P{X ≤ x}, where x ∈ (−∞, ∞).
Example: in the fair-die experiment, define an RV X such that X(fi) = i; the CDF of X is then the staircase function

P(X ≤ x) = 0 for x < 0;  ⌊x⌋/6 for 0 ≤ x < 6;  1 for x ≥ 6.

For example, F (4.5) = P (f1, f2, f3, f4) = 4/6, and F (10) =
P (f1, · · · , f6) = 1.
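The staircase CDF of this die RV can be written as a small function; a minimal, illustrative Python sketch:

```python
# Minimal sketch: the staircase CDF F(x) = P(X <= x) for the fair-die RV
# with X(fi) = i, i = 1..6.
from math import floor

def die_cdf(x: float) -> float:
    if x < 1:
        return 0.0            # no outcome is <= x (floor(x)/6 is also 0 on [0, 1))
    if x >= 6:
        return 1.0
    return floor(x) / 6

print(die_cdf(4.5), die_cdf(10), die_cdf(0.5))   # 0.666..., 1.0, 0.0
```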



PROPERTIES OF THE CDF

1. F(∞) = 1 and F(−∞) = 0.
Because F(∞) = P(X ≤ ∞) = P(S) = 1, and F(−∞) = P(X ≤ −∞) = 0.

2. If x1 < x2, then F (x1) ≤ F (x2).


Because P {X ≤ x2} = P {X ≤ x1} + P {x1 < X ≤ x2}.

3. P (x1 < X ≤ x2) = F (x2) − F (x1).


Because P(x1 < X ≤ x2) = P(X ≤ x2) − P(X ≤ x1).

4. CCDF: P (X > x) = 1 − F (x). Because P (X ≤ x) + P (X > x) = 1.



PROBABILITY DENSITY FUNCTION (PDF)
Definition: the derivative of the CDF is called the probability density
function (PDF), as shown by

f(x) = dF(x)/dx = lim_{∆x→0} P(x ≤ X ≤ x + ∆x)/∆x

For a discrete RV, the probabilities pi = P(X = xi) form the probability mass function (PMF), and the derivative of the CDF is a train of impulses weighted by the PMF:
f(x) = Σ_i pi δ(x − xi), where δ(x) is the Dirac delta (impulse) function.
Example: in the fair-die experiment let X(fi) = i, where i = 1, . . . , 6; then f(x) = (1/6)[δ(x − 1) + δ(x − 2) + · · · + δ(x − 6)].



SOME DISTRIBUTION FUNCTIONS
Normal (Gaussian) Distribution

• The normal distribution is important because circuit noise, radio channel noise, fading channel gains, and many other natural phenomena follow this distribution. The normal distribution is denoted N(η, σ^2).
• The PDF of a Gaussian RV X with mean η and variance σ^2 (denoted X ∼ N(η, σ^2)) is
fX(x) = (1/(σ√(2π))) exp(−(x − η)^2/(2σ^2)).
• To allow easy tabulation of values, we define the standardized variable Y = (X − η)/σ, so that Y ∼ N(0, 1).
The PDF of Y is then g(y) = (1/√(2π)) exp(−y^2/2), and fX(x) = (1/σ) g((x − η)/σ).



GAUSSIAN DISTRIBUTION (CONT’D)
The CDF is the integral of the PDF of X from −∞ to a specified value:
F(x1) = ∫_{−∞}^{x1} f(x) dx.
Gaussian CDF of X, X ∼ N(η, σ^2):
F(x) = G((x − η)/σ), where G(x) = ∫_{−∞}^{x} g(t) dt.
Complementary CDF (CCDF): F^c(x) = 1 − F(x) = ∫_{x}^{∞} f(t) dt.
The CCDF of the standard Gaussian variable, with PDF g(x), is the Gaussian Q-function:

Q(y) = ∫_{y}^{∞} g(x) dx = (1/√(2π)) ∫_{y}^{∞} exp(−x^2/2) dx.

The Q-function is widely used for error-performance evaluation of communication systems.
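For numerical work, the Q-function is commonly computed from the complementary error function as Q(y) = (1/2) erfc(y/√2); a minimal, illustrative sketch:

```python
# Minimal sketch: Q(y) computed via the complementary error function,
# Q(y) = 0.5 * erfc(y / sqrt(2)).
from math import erfc, sqrt

def Q(y: float) -> float:
    """Tail probability of a standard normal RV beyond y."""
    return 0.5 * erfc(y / sqrt(2))

print(Q(0.0), Q(1.0), Q(2.0))   # 0.5, ~0.1587, ~0.0228
```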



GAUSSIAN DISTRIBUTION (CONT’D)

Example: an RV X has a Gaussian distribution N(100, 100). What is the probability that X ∈ [90, 120]?
Solution: η = 100 and σ = 10 (since σ^2 = 100), thus

P(90 ≤ X ≤ 120) = F(120) − F(90) = G((120 − 100)/10) − G((90 − 100)/10)
= G(2) − G(−1) = G(2) + G(1) − 1 ≈ 0.819.
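A quick numerical check of this result, using the standard normal CDF G(x) = (1/2)[1 + erf(x/√2)] (illustrative Python sketch):

```python
# Minimal check of the example above using the standard normal CDF
# G(x) = 0.5 * (1 + erf(x / sqrt(2))).
from math import erf, sqrt

def G(x: float) -> float:
    return 0.5 * (1 + erf(x / sqrt(2)))

eta, sigma = 100.0, 10.0
p = G((120 - eta) / sigma) - G((90 - eta) / sigma)
print(p)   # about 0.8186, i.e. 0.819
```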

