
COMMUNICATION SYSTEMS
Lecture # 17
4th Apr 2007
Instructor: WASEEM KHAN
Centre for Advanced Studies in Engineering

AWGN channel

- A channel which adds Gaussian-distributed white noise to the signal is called an Additive White Gaussian Noise (AWGN) channel.
- The term additive means that the noise is simply superimposed or added to the signal.
- This channel affects each transmitted symbol independently.
- Such a channel is called a memoryless channel.
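As an illustration (not part of the original slide), here is a minimal NumPy sketch of an AWGN channel: independently drawn, zero-mean Gaussian noise samples of an assumed variance sigma2 are simply added to the transmitted samples, one per symbol. The function name awgn_channel and all numerical values are assumptions made for this example.

import numpy as np

rng = np.random.default_rng(0)

def awgn_channel(tx, sigma2):
    # Additive: the noise is simply superimposed on the signal.
    # Memoryless: a fresh, independent noise sample corrupts every symbol.
    noise = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=tx.shape)
    return tx + noise

# Example: a few +1/-1 symbols through the channel (values assumed)
tx = np.array([1.0, -1.0, 1.0, 1.0, -1.0])
rx = awgn_channel(tx, sigma2=0.1)
print(rx)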

Effect of Noise

- Noise, when added to the information signal, may cause errors in the received information bits.
- Bit-error rate (BER) is the basic criterion to check the performance of a communication system.
- Usually BER is plotted against Eb/N0, where Eb is the bit energy (Eb = S · Tb), S is the signal power and Tb is the bit duration.

Bit-energy Eb

- Let a symbol in a binary digital communication system be defined as
  s_0(t) = A \cos(\omega t), \quad 0 \le t \le T_b
- The bit energy can be calculated as
  E_b = \int_0^{T_b} [A \cos(\omega t)]^2 \, dt = \frac{A^2 T_b}{2} + \frac{A^2 \sin(2\omega T_b)}{4\omega}
- Hence
  E_b \approx \frac{A^2 T_b}{2} \quad \left(\text{since } T_b \gg \frac{\sin(2\omega T_b)}{2\omega}\right)
  or
  A = \sqrt{\frac{2 E_b}{T_b}}
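As a quick numerical check of the approximation above (illustrative only; the amplitude, bit duration and carrier frequency are assumed values, not from the slides), the integral can be evaluated with a simple Riemann sum and compared with A^2 Tb / 2:

import numpy as np

A = 1.0                # assumed amplitude
Tb = 1e-3              # assumed bit duration (s)
w = 2 * np.pi * 10e3   # assumed 10 kHz carrier, so that omega*Tb >> 1

dt = Tb / 100000
t = np.arange(0.0, Tb, dt)
Eb_numeric = np.sum((A * np.cos(w * t)) ** 2) * dt   # Riemann-sum integral
Eb_approx = A ** 2 * Tb / 2                          # A^2 * Tb / 2

print(Eb_numeric, Eb_approx)           # the two agree closely
print(np.sqrt(2 * Eb_approx / Tb))     # recovers A = sqrt(2*Eb/Tb) = 1.0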

Eb/N0

- N0 is the noise power spectral density, hence we can define N0 = N/W, where N is the total noise power and W is the bandwidth.
- Eb is the bit energy, which can be defined as E_b = S T_b or
  E_b = \int_0^{T_b} s^2(t) \, dt
- Now we can define Eb/N0 as
  \frac{E_b}{N_0} = \frac{S T_b}{N/W} = \frac{S/R_b}{N/W} = \frac{S}{N} \cdot \frac{W}{R_b}
- Eb/N0 is dimensionless and is usually expressed in decibels (dB):
  \frac{E_b}{N_0}(\mathrm{dB}) = 10 \log_{10} \frac{E_b}{N_0}

Demodulation and Detection of Digital Signals - AWGN

[Block diagram: transmitted waveform → channel → received waveform → demodulate & sample (frequency down-conversion for bandpass signals, receiving filter, equalizing filter for compensation of channel-induced ISI, sample at t = T) → detect (threshold comparison) → message symbol or channel symbol.]

The digital receiver performs two basic functions:
- Demodulation, to recover a waveform to be sampled at t = nT.
- Detection, the decision-making process of selecting the possible digital symbol.
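As a small worked example (all numbers assumed, not taken from the slides), the relation Eb/N0 = (S/N)(W/Rb) and its dB form can be evaluated directly:

import math

S_over_N = 4.0     # assumed in-band SNR (linear)
W = 1.0e6          # assumed bandwidth (Hz)
Rb = 250.0e3       # assumed bit rate (bits/s)

EbN0 = S_over_N * (W / Rb)          # Eb/N0 = (S/N) * (W/Rb), dimensionless
EbN0_dB = 10 * math.log10(EbN0)     # Eb/N0 expressed in decibels

print(EbN0, EbN0_dB)                # 16.0 and roughly 12.04 dB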

Detection of Binary Signal in Gaussian Noise

[Figure: example waveforms of the original binary signal, the additive Gaussian noise, and the resulting noisy signal, plotted over the same time axis.]

For any binary channel, the transmitted signal over a symbol interval (0, T) is

s_i(t) = \begin{cases} s_0(t), & 0 \le t \le T & \text{for a binary 0} \\ s_1(t), & 0 \le t \le T & \text{for a binary 1} \end{cases}

If the channel is AWGN, the received symbol will be

r(t) = s_i(t) + n(t), \quad i = 0, 1, \quad 0 \le t \le T
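To make the model concrete, here is a small illustrative sketch that forms r(t) = s_i(t) + n(t) over successive symbol intervals. The antipodal choice s1(t) = A cos(wt) = -s0(t), and every numerical value, are assumptions made for this example only.

import numpy as np

rng = np.random.default_rng(1)

A, T, fs = 1.0, 1e-3, 100e3          # assumed amplitude, symbol time, sample rate
t = np.arange(0.0, T, 1 / fs)
w = 2 * np.pi * 5e3                  # assumed carrier frequency

s = {0: -A * np.cos(w * t),          # assumed waveform sent for a binary 0
     1: +A * np.cos(w * t)}          # assumed waveform sent for a binary 1

bits = [1, 0, 0, 1]
sigma = 0.3                          # assumed noise standard deviation

# r(t) = s_i(t) + n(t): fresh AWGN is added over every interval (0, T)
r = np.concatenate([s[b] + rng.normal(0.0, sigma, t.size) for b in bits])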

Detection of Binary Signal in Gaussian Noise

The recovery of the signal at the receiver consists of two parts:
- Waveform-to-sample transformation
  - Demodulator followed by a sampler.
  - Each symbol is sampled at t = T to get a sample z(T):
    z(T) = a_i(T) + n_0(T), \quad i = 0, 1
    where a_i(T) is the desired signal component and n_0(T) is the noise component.
- Detection of symbol
  - Assume that the input noise is a Gaussian random process, i.e.
    p(n_0) = \frac{1}{\sigma_0 \sqrt{2\pi}} \exp\left[ -\frac{1}{2} \left( \frac{n_0}{\sigma_0} \right)^2 \right]

Detection of Binary Signal in Gaussian Noise

The recovery of the signal at the receiver consists of two parts:
- Filter
  - Reduces the received signal to a single variable z(T).
  - z(T) is called the test statistic.
- Detector (or decision circuit)
  - Compares z(T) to some threshold level \gamma_0, i.e.,
    z(T) \underset{H_0}{\overset{H_1}{\gtrless}} \gamma_0
    where H_1 and H_0 are the two possible binary hypotheses.
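The two-part recovery described above can be sketched as follows (a minimal illustration assuming antipodal signalling with a known reference waveform and a threshold gamma0 = 0; these choices are assumptions, not made on the slides themselves):

import numpy as np

rng = np.random.default_rng(2)

A, T, fs = 1.0, 1e-3, 100e3              # assumed amplitude, symbol time, sample rate
t = np.arange(0.0, T, 1 / fs)
dt = 1 / fs
ref = A * np.cos(2 * np.pi * 5e3 * t)    # assumed reference waveform, s1(t) = -s0(t)

def z_of_T(r_interval):
    # Waveform-to-sample transformation: correlator output sampled at t = T
    return np.sum(r_interval * ref) * dt

def detect(z, gamma0=0.0):
    # Detector (decision circuit): H1 if z(T) > gamma0, otherwise H0
    return 1 if z > gamma0 else 0

bits = [1, 0, 1, 1, 0]
sigma = 0.3                              # assumed noise standard deviation
decisions = []
for b in bits:
    s_i = ref if b == 1 else -ref        # antipodal mapping (assumption)
    r = s_i + rng.normal(0.0, sigma, t.size)   # AWGN over the interval (0, T)
    decisions.append(detect(z_of_T(r)))
print(bits, decisions)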

Detection of Binary Signal in Gaussian Noise

The sample z(T) will be another Gaussian random variable:

p(z \mid s_0) = \frac{1}{\sigma_0 \sqrt{2\pi}} \exp\left[ -\frac{1}{2} \left( \frac{z - a_0}{\sigma_0} \right)^2 \right]

p(z \mid s_1) = \frac{1}{\sigma_0 \sqrt{2\pi}} \exp\left[ -\frac{1}{2} \left( \frac{z - a_1}{\sigma_0} \right)^2 \right]

Probabilities Review

- P[s0], P[s1]: a priori probabilities; these probabilities are known before transmission.
- P[z]: probability of the received sample.
- p(z|s0), p(z|s1): conditional pdfs of the received sample z, conditioned on the transmitted symbol s_i.
- P[s0|z], P[s1|z]: a posteriori probabilities.
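For concreteness, a tiny sketch (with assumed values a0 = -1, a1 = +1, sigma0 = 0.5, which are not from the slides) that evaluates the two conditional pdfs for an observed sample z:

import math

def gaussian_pdf(x, mean, sigma):
    # Gaussian pdf with the given mean and standard deviation
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

a0, a1, sigma0 = -1.0, 1.0, 0.5     # assumed signal levels and noise std deviation
z = 0.2                             # example received sample

print(gaussian_pdf(z, a0, sigma0))  # p(z | s0)
print(gaussian_pdf(z, a1, sigma0))  # p(z | s1)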

Choosing the Threshold

Maximum a posteriori (MAP) criterion:
- If p(s_0 \mid z) > p(s_1 \mid z), decide H_0.
- If p(s_1 \mid z) > p(s_0 \mid z), decide H_1.

The problem is that the a posteriori probabilities are not known.
Solution: use Bayes' theorem,

p(s_i \mid z) = \frac{p(z \mid s_i) \, p(s_i)}{p(z)}

so the MAP rule becomes

\frac{p(z \mid s_1) P(s_1)}{p(z)} \underset{H_0}{\overset{H_1}{\gtrless}} \frac{p(z \mid s_0) P(s_0)}{p(z)}
\quad \Longrightarrow \quad
p(z \mid s_1) P(s_1) \underset{H_0}{\overset{H_1}{\gtrless}} p(z \mid s_0) P(s_0)

Choosing the Threshold

Rearranging gives the likelihood ratio test (LRT):

L(z) = \frac{p(z \mid s_1)}{p(z \mid s_0)} \underset{H_0}{\overset{H_1}{\gtrless}} \frac{P(s_0)}{P(s_1)}

When the two signals s_0(t) and s_1(t) are equally likely, i.e., P(s_0) = P(s_1) = 0.5, the decision rule becomes

L(z) = \frac{p(z \mid s_1)}{p(z \mid s_0)} \underset{H_0}{\overset{H_1}{\gtrless}} 1 \qquad \text{(maximum likelihood ratio test)}

This is known as the maximum likelihood ratio test because we are selecting the hypothesis that corresponds to the signal with the maximum likelihood. In terms of the Bayes criterion, it implies that the cost of both types of error is the same.
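A minimal sketch of the likelihood ratio test for the Gaussian likelihoods above, using assumed values a0 = -1, a1 = +1, sigma0 = 0.5: with P(s0) = P(s1) = 0.5 the LRT reduces to the maximum-likelihood rule, which for equal-variance Gaussian likelihoods is simply a comparison of z against the midpoint (a0 + a1)/2.

import math

def gaussian_pdf(x, mean, sigma):
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

a0, a1, sigma0 = -1.0, 1.0, 0.5          # assumed signal levels and noise std deviation
P_s0, P_s1 = 0.5, 0.5                    # equally likely symbols

def lrt_decision(z):
    # L(z) = p(z|s1) / p(z|s0), compared against P(s0) / P(s1)
    L = gaussian_pdf(z, a1, sigma0) / gaussian_pdf(z, a0, sigma0)
    return 1 if L > P_s0 / P_s1 else 0

def midpoint_decision(z):
    # Equivalent ML rule for equal priors: threshold at the midpoint of a0 and a1
    return 1 if z > (a0 + a1) / 2 else 0

for z in (-0.7, 0.1, 1.3):
    print(z, lrt_decision(z), midpoint_decision(z))   # both rules agree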

Announcements

- Class on 6th April (6:00~7:30 pm) in lieu of 7th April.
- Second sessional on 13th April (Friday), 6:00 pm to 7:30 pm.

