INTRODUCTION
Information theory was introduced in the late 1940s by Shannon as a mathematical theory.
Information theory is concerned with the two fundamental limits of communication.
What is the ultimate limit to data compression? For example, how many bits are required to represent a music source?
Answer: entropy.
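Entropy as a compression limit can be made concrete with a small sketch: the function below computes the Shannon entropy of an empirical symbol distribution, which lower-bounds the average number of bits per symbol any lossless code can achieve. The function name and the coin-flip example are illustrative choices, not from the slides.

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy H(X), in bits, of the empirical distribution of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin needs 1 bit per outcome on average; a biased source needs fewer.
print(entropy("HTHTHTHT"))   # 1.0
print(entropy("AAAAAAAB"))   # well below 1 bit per symbol
```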
The key concepts of information theory are entropy, H(X), and mutual information, I(X;Y), where X and Y are random variables.
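Mutual information can be computed from entropies via the identity I(X;Y) = H(X) + H(Y) - H(X,Y). The sketch below applies that identity to a hypothetical joint distribution over two binary variables; the distribution itself is an assumption for illustration, not from the slides.

```python
import math

def H(probs):
    """Entropy, in bits, of a list of probabilities (zero entries are skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary random variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y) by summing out the other variable.
px = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

# I(X;Y) = H(X) + H(Y) - H(X,Y): bits of information Y carries about X.
mi = H(px) + H(py) - H(list(joint.values()))
print(round(mi, 3))  # 0.278
```

Independent variables give I(X;Y) = 0; the correlation in this joint distribution is what makes the result positive.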
Up to the 1940s, it was common wisdom in telecommunications that the error rate increased with increasing data rate. Claude Shannon demonstrated that error-free transmission is possible under certain conditions, namely at rates below the channel capacity.
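The standard textbook illustration of Shannon's result is the binary symmetric channel, whose capacity is C = 1 - H(p) bits per use, where p is the crossover (bit-flip) probability and H is the binary entropy function. A minimal sketch, with function names chosen here for illustration:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))              # 1.0  (noiseless: every bit gets through)
print(bsc_capacity(0.5))              # 0.0  (pure noise: output independent of input)
print(round(bsc_capacity(0.1), 3))    # 0.531
```

Shannon's theorem says that at any rate below C, suitable coding drives the error probability as close to zero as desired, which is exactly the break with the pre-1940s intuition described above.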
Information Age
Why Information Theory?
Despite technological advances, bandwidth remains a precious resource in many situations.
The time-space trade-off makes sense in many situations.
Information security has become everyone's requirement due to ubiquitous communication technology.
Mathematical Background
Sets
Functions
Probability Theory
Information Theory
Complexity Theory
Number Theory
Abstract Algebra
Finite Fields
Sets
Set Cardinality
The number of elements in a set is called its cardinality.
For a set A, its cardinality is denoted |A|.
Finite sets have a finite number of elements.
Infinite sets have an infinite number of elements.
A set is countably infinite if its elements can be put in one-to-one correspondence with the natural numbers; the integers are a standard example.
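These notions can be illustrated briefly in code: finite cardinality is just the element count, and countability of the integers is shown by listing them in a fixed order 0, 1, -1, 2, -2, ... so that every integer eventually appears. The function name below is an illustrative choice.

```python
# Finite cardinality |A| with Python's built-in set type.
A = {1, 2, 3, 4}
B = set("hello")      # duplicates collapse: {'h', 'e', 'l', 'o'}
print(len(A))         # 4  -> |A|
print(len(B))         # 4  -> |B|

def integer_at(n):
    """n-th integer in the enumeration 0, 1, -1, 2, -2, ... (n = 0, 1, 2, ...).

    This explicit bijection with the natural numbers is what makes
    the integers countably infinite.
    """
    return (n + 1) // 2 if n % 2 else -(n // 2)

print([integer_at(n) for n in range(5)])  # [0, 1, -1, 2, -2]
```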
Basic Definitions
Model of a Digital Communication System
Examples of information sources:
Speech
Image
Video
Text file
Music
Examples of channels:
Examples of information receivers:
TV screen
Audio system and listener
Computer file
Image printer and viewer