
INFORMATION THEORY

Claude Elwood Shannon (1916–2001),
American electrical engineer and
mathematician, has been called the
father of information theory, and was
the founder of practical digital circuit
design theory.

"A Mathematical Theory of Communication" (1948)

INTRODUCTION

Information theory was introduced by
Shannon in the late 1940s as a
mathematical theory.

It is a branch of applied mathematics,
electrical engineering, and computer science.
Information theory is concerned with the
analysis of communication systems.

Applications of Information Theory

Efficiency and capacity of computers
and similar devices
Video abstraction extraction
Secure Data Transmission
Channel Capacity
Speech Coding
Waveform Coding
Lossless data compression
Cryptography

Contd.
Information theory is concerned with the
two fundamental limits of communication.
What is the ultimate limit to data
compression? e.g. how many bits are
required to represent a music source?
ANS. Entropy
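As a quick numerical illustration (not part of the slides), the entropy of a discrete source can be computed directly from its symbol probabilities; the 4-symbol distribution below is an invented example:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum of p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical 4-symbol source with skewed probabilities.
probs = [0.5, 0.25, 0.125, 0.125]
print(entropy(probs))  # 1.75 bits per symbol on average
```

No lossless code for this source can use fewer than 1.75 bits per symbol on average, which is the sense in which entropy is the compression limit.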

Contd.

What is the ultimate limit of reliable
communication over a noisy channel?
OR
What is the largest possible data
transfer rate for given resources and
conditions? e.g. how many bits can be
sent in one second over a telephone
line?
ANS. Channel capacity
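For a concrete case of this limit, the binary symmetric channel has the standard closed-form capacity C = 1 - H(p), where p is the bit-flip probability; a minimal sketch (the example values are illustrative, not from the slides):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0: a noiseless channel carries one full bit per use
print(bsc_capacity(0.11))  # ~0.5: only about half a bit per use survives the noise
```

Shannon's result says rates below this capacity are achievable with arbitrarily small error, and rates above it are not.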

Contd.
Key concepts of information theory are
entropy H(X) and mutual
information I(X;Y),
where X and Y are random variables of some kind.
Up to the 1940s, it was common wisdom
in telecommunications that the error
rate increased with increasing data rate.
Claude Shannon demonstrated that error-free
transmission may be possible under
certain conditions.
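Both key quantities can be computed from a joint distribution of X and Y; a minimal sketch (the joint probability tables below are invented for illustration):

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x)*p(y)) ), in bits."""
    px = [sum(row) for row in joint]          # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]    # marginal of Y (column sums)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Perfectly correlated binary X and Y: observing Y reveals X completely.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0 bit
# Independent X and Y: observing Y says nothing about X.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0 bits
```

I(X;Y) measures how much information the channel output Y carries about the input X, which is why it appears in both limits on the next slide.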

Contd.

Information theory provides strict
bounds for any communication system.
Maximum data compression:
minimum I(X;X)
Maximum data transfer rate:
maximum I(X;Y)

Contd.

Any given communication system works
between these limits.
The mathematics behind them is not always
constructive, but it provides guidelines for
designing algorithms that improve
communications given a set of available
resources.
The resources in this context are known
parameters such as available transmission
power, available bandwidth, signal-to-noise
ratio, and the like.
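The bandwidth and signal-to-noise parameters combine in the standard Shannon–Hartley formula C = B * log2(1 + S/N) for a Gaussian-noise channel; a quick sketch with illustrative, roughly telephone-line numbers (not taken from the slides):

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity of an AWGN channel, in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Roughly telephone-line conditions: 3 kHz bandwidth, 30 dB SNR.
snr = 10 ** (30 / 10)  # convert 30 dB to a linear power ratio of 1000
print(awgn_capacity(3000, snr))  # ~29,900 bits per second
```

The formula makes the trade-off among the listed resources explicit: capacity grows linearly with bandwidth but only logarithmically with transmission power.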

Information Age

The Internet is part of our life: packets carry
information.
Cellular phones: we can't think of life
without them.
DVD, Flash, Hard Disk: We can carry
tremendous amounts of information.
Email: changed the nature of business
and personal communication
Instant Messaging: presence and
collaboration

Contd.

Information warfare, psychological
operations.
Network-centric warfare, situational
awareness.
"Information" is a loaded word; it means
many different things in different
contexts.

Why Information Theory?
Despite technological advances,
bandwidth remains a precious resource
in many situations.
Time-space trade-offs make sense in
many situations.
Information security has become
everyone's requirement due to
ubiquitous communication technology.

Mathematical Background

Sets
Functions
Probability Theory
Information Theory
Complexity Theory
Number Theory
Abstract Algebra
Finite Fields

Sets

A set cannot be defined precisely in a
mathematical way. It is a concept which
we all intuitively understand and
agree upon.
A set is a well-defined collection of
distinct objects.
Some common sets include:
N, Z, Q, R, C

Contd.

Well-defined means that given a set S
and any particular a, either a ∈ S or
a ∉ S, but not both.
Uppercase letters denote sets. For
example A, B, C, ...
Lowercase letters denote elements of
sets. For example a, b, c, ...

Set Cardinality
The number of elements in a set is called its
cardinality.
For a set A, its cardinality is denoted |A|.
Finite sets have a finite number of elements.
Infinite sets have an infinite number of
elements.
Countably infinite sets have infinitely many
elements, but those elements can be listed
in a sequence, i.e. put in one-to-one
correspondence with the natural numbers.
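The finite-set notions above map directly onto Python's built-in set type; a small illustration (the elements are an invented example):

```python
# A set is a well-defined collection of distinct objects;
# duplicates are collapsed automatically.
A = {1, 2, 3, 2, 1}
print(A)        # {1, 2, 3}
print(len(A))   # cardinality |A| = 3

# Well-defined membership: either a is in A or a is not, never both.
print(2 in A)   # True
print(7 in A)   # False
```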

Discrete vs. Continuous

Discrete sets have the property of
countability.
The set of integers Z is discrete: we can
count the number of integers in any
bounded interval of Z.
The set of real numbers R is continuous:
in any sub-interval of R we cannot count
the number of real numbers; there are
uncountably many.

Basic Definitions

Symbol: Symbols are objects,
characters, or other concrete
representations of ideas, concepts, or
other abstractions.
For example, in the United States,
Canada, Australia, and Great Britain, a
red octagon is the symbol for the traffic
sign meaning "STOP".

Contd.

Alphabet: An alphabet is a set of
symbols.

Word: A word is a finite-sized
aggregation of symbols from the
alphabet.
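These two definitions can be made concrete in a few lines of Python; the binary alphabet and the strings below are invented examples:

```python
# An alphabet is a set of symbols; a word is a finite
# sequence of symbols drawn from that alphabet.
alphabet = {"0", "1"}

def is_word(w, alphabet):
    """True iff every symbol of the finite string w belongs to the alphabet."""
    return all(symbol in alphabet for symbol in w)

print(is_word("010110", alphabet))  # True
print(is_word("0120", alphabet))    # False: '2' is not in the alphabet
```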

Contd.

Model of a Digital
Communication System

Examples of information sources:

Speech
Image
Video
Text file
Music

Examples of channels

Airwaves (EM radiation)
Cable
Telephone line
Hard disk
CD
DVD
Flash memory device
Optical path
Internet

Examples of information receivers

TV screen
Audio system and listener
Computer file
Image printer and viewer
