Information
Modeling information sources
Information content of a source
Fundamental limits on representation of information
Examples of Symbols
[Figure: (a) a 4-level symbol for logic 00 — voltage levels V1-V4 plotted against time over one symbol period, each level encoding one of the 2-bit values 00, 01, 10, 11]
Symbol:
Smallest unit of data transmitted at one time
Symbol rate or baud rate: symbols per second
[Figure: 2-bit data values 00, 01, 10, 11 mapped onto symbols and transmitted from a data source to a receiver]
Information Sources
An information source produces a time-varying, random output
The properties of this random output depend on the nature of the source (e.g. its bandwidth, amplitude, statistics, etc.)
A mathematical model is required to measure the information content of a source
The simplest model is the Discrete Memoryless Source (DMS): a discrete-time, discrete-amplitude random process in which all (discrete) random outputs are generated independently and with the same probability distribution
DMS Example
An information source is described by the alphabet A = {0,1} and the probabilities P(Xi = 1) = p and P(Xi = 0) = 1 - p.
This discrete memoryless source:
Is a binary source, as it generates sequences of 1s and 0s
Produces, for each regular time interval, a value of either 1 or 0, with probabilities p1 = p and p0 = 1 - p
In the special case in which p = 0.5, the source is called a binary symmetric source, or BSS
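A binary DMS can be simulated by drawing independent, identically distributed samples. A minimal sketch (the function name `binary_dms` is illustrative, not from the source):

```python
import random

def binary_dms(p, n, seed=None):
    """Simulate a binary discrete memoryless source.

    Each output is 1 with probability p and 0 with probability 1 - p,
    drawn independently of every other output (memorylessness).
    """
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

# The binary symmetric source (BSS) is the special case p = 0.5:
samples = binary_dms(0.5, 10_000, seed=42)
print(sum(samples) / len(samples))  # empirical frequency of 1s, close to 0.5
```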
Impact of probability
perturbations?
Information Content of an
Output
Modeling Information
Content
Binary Entropy
Consider a binary memoryless source:
Let p = probability of a 0 occurring
And 1 - p = probability of a 1 occurring
The binary entropy function is:
H(X) = -p log2 p - (1 - p) log2 (1 - p)
Observe that H(X) = 0 if p = 0 or p = 1
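The binary entropy function can be written directly from its definition. A minimal sketch (function name illustrative), using the convention 0 * log2(0) = 0 for the endpoints:

```python
import math

def binary_entropy(p):
    """Binary entropy H(X) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p == 0 or p == 1:
        return 0.0  # a deterministic source carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 -- maximum uncertainty
print(binary_entropy(0.0))  # 0.0 -- deterministic source
```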
[Figure: plot of the binary entropy function H(X) = -p log2 p - (1 - p) log2 (1 - p) versus p, with H(X) = 0 at p = 0 and p = 1 and a maximum of 1 bit at p = 0.5]
Entropy is maximized when all N outputs are equiprobable (p_i = 1/N):

H_max = -sum_{i=1}^{N} (1/N) log2 (1/N) = log2 N

H_max = log2 N bits/symbol
Example Equi-probable
Outputs
Consider a keypad comprising only A and B keys which are struck
with equal probability. The keypad generates messages of various
lengths:
keys/msg    Possible messages
2           AA, AB, BA, BB
3           AAA, AAB, ABA, ABB, BAA, BAB, BBA, BBB
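For N equally likely messages the entropy is log2 N bits, so k equiprobable A/B keystrokes give 2^k messages and k bits per message. A quick check of the keypad example (a sketch; names are illustrative):

```python
import math
from itertools import product

def equiprobable_entropy(n_messages):
    """Entropy in bits when all n_messages are equally likely: log2(N)."""
    return math.log2(n_messages)

# Messages of k equiprobable A/B keystrokes: N = 2**k possibilities.
for k in (2, 3):
    messages = [''.join(m) for m in product('AB', repeat=k)]
    print(k, len(messages), equiprobable_entropy(len(messages)))
    # k keys/msg -> 2**k messages -> k bits per message
```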
Outputs of Unequal
Probabilities
Beyond Binary
The previous example used only 2 symbols (0 and 1)
In general, there can be a maximum of N symbols
H(y) = -p(y1) log2 p(y1) - p(y2) log2 p(y2) - ... - p(yN) log2 p(yN)
Thus,
H(y) = -sum_{i=1}^{N} p_i log2 p_i bits
And
H_av(y) = -sum_{i=1}^{N} p_i log2 p_i bits/symbol
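The average information can be computed directly from the symbol probabilities. A minimal sketch (function name illustrative), skipping zero-probability terms by the convention 0 * log2(0) = 0:

```python
import math

def entropy(probs):
    """Average information H_av = -sum(p_i * log2(p_i)), in bits/symbol."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p_i = 0 contribute nothing (convention 0 * log2(0) = 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 -- equals log2(4), the maximum
```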
Redundancy
When the entropy of a source is less than its maximum, the message is said to contain redundancy (commonly quantified as R = 1 - H/H_max)
Example 1
Consider two possible events, one of which occurs with probability p1 = 0.8.
Determine the information content of
each individual event
What does this result imply?
Determine the average information
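A sketch of the computation for Example 1 (not the source's worked solution): with p1 = 0.8 the other event has p2 = 0.2, the self-information of each event is I_i = -log2 p_i, and the average information weights each I_i by its probability:

```python
import math

p1, p2 = 0.8, 0.2  # the two event probabilities (p2 = 1 - p1)

# Self-information of each event, in bits: the rarer event carries more information.
i1 = -math.log2(p1)  # about 0.322 bits
i2 = -math.log2(p2)  # about 2.322 bits

# Average information (entropy) is the probability-weighted self-information.
h = p1 * i1 + p2 * i2  # about 0.722 bits

print(round(i1, 3), round(i2, 3), round(h, 3))
```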
Example 2
Determine the average information of a message that is made up of two symbols, given that the probability of occurrence of each is: p = 1; q = 0
What does this result indicate?
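A sketch for Example 2 (not the source's worked solution): the q = 0 term is taken as 0 by the convention 0 * log2(0) = 0, so the entropy collapses to zero:

```python
import math

# Probabilities for Example 2: one symbol is certain, the other never occurs.
probs = [1.0, 0.0]

# Skip zero-probability terms (the limit of p*log2(p) as p -> 0 is 0).
h = sum(p * -math.log2(p) for p in probs if p > 0)
print(h)  # 0.0 -- a certain outcome carries no information
```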
Example 3
A system can send out a group of four pulses, each of width 1 ms and with equal probability of having an amplitude of 0, 1, 2 or 3 V. The four pulses are always followed by a pulse of amplitude -1 V to separate the groups.
Sketch a typical pulse sequence
What is the average rate of information
generated by this system?
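One way to work Example 3 (a sketch, not the source's solution): each information-bearing pulse has 4 equiprobable levels (log2 4 = 2 bits), the separator pulse is deterministic and so carries 0 bits, and a full group of 4 pulses plus separator occupies 5 ms:

```python
import math

bits_per_pulse = int(math.log2(4))   # 4 equiprobable levels -> 2 bits per pulse

# A group: 4 information pulses + 1 deterministic separator pulse (0 bits),
# each pulse 1 ms wide, so 8 bits are delivered every 5 ms.
bits_per_group = 4 * bits_per_pulse
group_ms = 5

rate_bps = bits_per_group * 1000 / group_ms
print(rate_bps)  # 1600.0 bits per second
```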
Example 4
The probabilities of Example 3 are
altered such that the 0V level occurs
one-half of the time on average, the
1V level occurs one-quarter of the time
on average and the remaining levels
occur one-eighth of the time each.
Find the average rate of generation of
information
Determine the redundancy
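A sketch for Example 4 (not the source's worked solution): with probabilities 1/2, 1/4, 1/8, 1/8 the per-pulse entropy drops below the 2-bit maximum, the group timing is unchanged from Example 3, and the redundancy can be expressed relative to that maximum:

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]  # levels 0 V, 1 V, 2 V, 3 V

h = -sum(p * math.log2(p) for p in probs)  # 1.75 bits per pulse
h_max = math.log2(len(probs))              # 2.0 bits per pulse

# Same framing as Example 3: 4 information pulses + separator in 5 ms.
rate_bps = 4 * h * 1000 / 5  # 1400.0 bits per second

redundancy = 1 - h / h_max   # 0.125, i.e. 12.5%
print(h, rate_bps, redundancy)
```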
Limits re Info Representation
An information source requires, on average, a minimum of H(X) bits per source output for error-free representation
[Figure: block diagram of an information source]
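As an illustration of this limit (a sketch, not from the source): for the dyadic distribution {1/2, 1/4, 1/8, 1/8}, the prefix-free code {0, 10, 110, 111} achieves an average codeword length exactly equal to the entropy H(X) = 1.75 bits, meeting the bound:

```python
import math

probs = [0.5, 0.25, 0.125, 0.125]
codewords = ["0", "10", "110", "111"]  # a prefix-free code for the 4 outputs

h = -sum(p * math.log2(p) for p in probs)                    # 1.75 bits
avg_len = sum(p * len(c) for p, c in zip(probs, codewords))  # 1.75 bits

print(h, avg_len)  # average length meets the H(X) lower bound exactly
```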
Summary
A DMS is a simple model for an information source
The info content or entropy of a source ...
Is a measure of its randomness (uncertainty)
Is the weighted sum of the self-information of all possible source outputs
Is expressed in bits/source symbol or bits/sample
Is maximized when symbol probabilities are equal
Is less than max when the message contains redundancy
Is ... for binary
Reading
Mandatory reading:
Proakis, J. and Masoud Salehi, Fundamentals of Communication Systems: Sections 1.2, 2.1.2 (pp. 25 and 26 only), Intro to Ch. 12, 12.1-12.1.2
Recommended reading:
Proakis, J. and Masoud Salehi, Fundamentals of Communication Systems: Sections 5.1-5.3