Hamid Jafarkhani
Introduction
[Block diagram of a digital communication system:
Information Source → Source Encoder → Encryption → Channel Encoder → Modulator → Channel (+ noise) → Demodulator → Channel Decoder → Decryption → Source Decoder → Information Sink]
- For storage, modulation/demodulation is simply writing and reading.
- One may not use encryption and/or source coding.
- There is an increasing demand for efficient and reliable data transmission.
- The central question: how to recover the transmitted/stored data in a noisy environment?
- Shannon introduced a metric by which information can be quantified: the minimum possible number of symbols necessary for the error-free representation of a source. A longer message containing the same information is said to have redundant symbols.
- Source coding: to remove uncontrolled redundancy that naturally exists in a source.
- Encryption: to modify the information such that it cannot be understood by anyone except the intended recipient.
- Channel coding: to increase the immunity of the transmitted information to noise. Our focus is on channel coding.
- Memoryless channel: if the output of the channel in an interval depends only on the signal transmitted in that interval, and not on any previous transmission, the channel is memoryless.
[Simplified coding model: Digital Source → Encoder → u → Coding Channel (noise) → r → Decoder → Digital Sink]
Theorem (Shannon's noisy channel coding theorem): with every channel we can associate a channel capacity C. There exist error control codes such that information can be transmitted across the channel at rates less than C with arbitrarily low bit error rate.

- A simplified model for transmitting binary-phase-shift-keyed (BPSK) modulation on an additive white Gaussian noise (AWGN) channel. With $E_b$ the energy of the signal and $T$ the bit duration, the transmitted signal is

$$S(t) = \pm\sqrt{\frac{2E_b}{T}}, \qquad 0 \le t \le T.$$

The equivalent discrete-time model is

$$R = S + N, \qquad S = \pm\sqrt{E_b},$$

where $N$ is zero-mean Gaussian noise with variance $\sigma^2 = N_0/2$ ($N_0$ is the noise power spectral density).
- Hard-limiting the output of the discrete channel results in a binary symmetric channel (BSC) with crossover probability p:

$$P(0|0) = P(1|1) = 1 - p, \qquad P(1|0) = P(0|1) = p,$$

$$\mathrm{BER} = p = Q\!\left(\sqrt{\frac{2E_b}{N_0}}\right), \qquad Q(x) = \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-y^2/2}\, dy.$$
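As a quick numerical check, here is a minimal Python sketch (the names q_function, bsc_crossover, and simulate_crossover are illustrative, not from the slides) that evaluates $p = Q(\sqrt{2E_b/N_0})$ via the identity $Q(x) = \tfrac{1}{2}\mathrm{erfc}(x/\sqrt{2})$ and confirms it with a Monte Carlo run of the hard-limited channel:

```python
import math
import random

def q_function(x):
    """Q(x) = (1/sqrt(2*pi)) * integral from x to infinity of exp(-y^2/2) dy."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bsc_crossover(eb_over_n0):
    """Crossover probability p of the hard-limited BPSK/AWGN channel."""
    return q_function(math.sqrt(2 * eb_over_n0))

def simulate_crossover(eb_over_n0, num_bits=200_000, seed=1):
    """Monte Carlo: send S = +sqrt(Eb), add N ~ N(0, N0/2), hard-limit.
    By symmetry, sending a single symbol suffices to estimate p."""
    rng = random.Random(seed)
    eb = 1.0                    # normalize the signal energy
    sigma = math.sqrt((eb / eb_over_n0) / 2)   # noise std dev, sigma^2 = N0/2
    errors = sum(1 for _ in range(num_bits)
                 if math.sqrt(eb) + rng.gauss(0, sigma) < 0)
    return errors / num_bits

ebn0 = 1.0                      # Eb/N0 = 0 dB
print(bsc_crossover(ebn0))      # theory: about 0.0786
print(simulate_crossover(ebn0)) # simulation: close to the theoretical value
```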
Example: a simple repetition code.
- For 0 we transmit 0,0,0; for 1 we transmit 1,1,1.
- The two code words differ at all three positions: distance = 3.
- At the receiver there are eight possibilities, decoded by majority:

0,0,0 → 0    1,0,0 → 0
0,0,1 → 0    1,0,1 → 1
0,1,0 → 0    1,1,0 → 1
0,1,1 → 1    1,1,1 → 1

For example, 0,1,1 (1st bit in error) is closer to 1,1,1 than to 0,0,0, so it decodes to 1.
- We need to transmit 3 bits instead of one bit, so the rate of the code is R = 1/3. A sketch of the encoder and decoder follows.
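A minimal Python sketch of this scheme (illustrative helper names): each bit is repeated n times and decoded by majority logic:

```python
def repetition_encode(bits, n=3):
    """Repeat each information bit n times."""
    return [b for b in bits for _ in range(n)]

def repetition_decode(received, n=3):
    """Majority-logic decoding: each block of n bits decodes to its majority value."""
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]

print(repetition_encode([0, 1]))              # [0, 0, 0, 1, 1, 1]
print(repetition_decode([0, 1, 0, 1, 1, 1]))  # [0, 1]: the single error is corrected
# With n = 7 the code corrects up to (7 - 1) // 2 = 3 errors per bit:
print(repetition_decode([1, 0, 1, 0, 1, 0, 1], n=7))  # [1]
```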
- Every code word satisfies $\mathbf{v} \cdot H^T = \mathbf{0}$:

$$[v_1, v_2, v_3]\begin{bmatrix}1 & 0\\ 0 & 1\\ 1 & 1\end{bmatrix} = [0\ 0]$$
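A quick GF(2) check of this condition (illustrative Python; $H^T$ as above):

```python
# Parity-check matrix H^T of the rate-1/3 repetition code, rows indexed by v1..v3.
HT = [[1, 0],
      [0, 1],
      [1, 1]]

def syndrome(v):
    """s = v . H^T over GF(2); the all-zero syndrome means v is a code word."""
    return [sum(v[i] * HT[i][j] for i in range(3)) % 2 for j in range(2)]

print(syndrome([0, 0, 0]))  # [0, 0] -> code word
print(syndrome([1, 1, 1]))  # [0, 0] -> code word
print(syndrome([0, 1, 0]))  # [0, 1] -> not a code word (middle bit in error)
```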
- Majority decoding fails only when 2 or 3 of the 3 transmitted bits are in error, so for p = 0.1 the decoded bit error rate is $3p^2(1-p) + p^3 = 0.028$.
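A one-line check of this number (illustrative Python, assuming independent bit errors on the BSC):

```python
# Majority decoding of the 3-bit repetition code fails when 2 or 3 channel
# bits are flipped, so the decoded bit error rate on a BSC(p) is:
def repetition3_ber(p):
    return 3 * p**2 * (1 - p) + p**3

print(repetition3_ber(0.1))  # 0.028 -- down from the raw crossover p = 0.1
```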
- Repetition code of rate R = 1/n: transmit n zeros for 0 and n ones for 1. The two code words differ at n positions, so the code can correct up to $\lfloor (n-1)/2 \rfloor$ errors.
- Example, n = 7: for 0 transmit 0,0,0,0,0,0,0 and for 1 transmit 1,1,1,1,1,1,1.
- Using majority logic, this code can correct up to 3 errors.

Example: parity check code. For k = 6 input bits, the encoder adds one bit to ensure that there is an even number of 1s in each code word:

u = (1,0,1,1,0,0) → v = (1,0,1,1,0,0,1)
u = (1,0,0,0,0,1) → v = (1,0,0,0,0,1,0)

Every code word satisfies

$$(v_1, v_2, v_3, v_4, v_5, v_6, v_7)\begin{bmatrix}1\\1\\1\\1\\1\\1\\1\end{bmatrix} = 0$$
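A small Python sketch of the even-parity encoder and check (helper names are illustrative), reproducing the two code words above:

```python
def parity_encode(u):
    """Append one bit so the code word has an even number of 1s."""
    return u + [sum(u) % 2]

def parity_check(v):
    """True if the received word has even parity (no odd number of errors)."""
    return sum(v) % 2 == 0

print(parity_encode([1, 0, 1, 1, 0, 0]))    # [1, 0, 1, 1, 0, 0, 1]
print(parity_encode([1, 0, 0, 0, 0, 1]))    # [1, 0, 0, 0, 0, 1, 0]
print(parity_check([1, 0, 1, 1, 0, 0, 1]))  # True
print(parity_check([1, 1, 1, 1, 0, 0, 1]))  # False: a single error is detected
```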
Example: Hamming code with k = 4, n = 7, R = 4/7.

- Code word structure: v = (v0, v1, v2, u0, u1, u2, u3), where u0, u1, u2, u3 are the message bits and v0, v1, v2 are the parity bits.
[Venn diagram: three overlapping circles labeled with u0, u1, u2, u3 and v0, v1, v2; each parity bit makes the number of 1s in its circle even]
Every code word satisfies $\mathbf{v} \cdot H^T = [0\ 0\ 0]$ with

$$H^T = \begin{bmatrix}1&0&0\\0&1&0\\0&0&1\\1&1&0\\0&1&1\\1&1&1\\1&0&1\end{bmatrix}$$
- H is not unique: permuting its rows or replacing a row by a sum of rows yields another valid parity-check matrix that also satisfies $\mathbf{v} \cdot H^T = [0\ 0\ 0]$.
- In systematic form the generator matrix is $G = [P_{k \times (n-k)}\ \ I_k]$:

$$G = \begin{bmatrix}1&1&0&1&0&0&0\\0&1&1&0&1&0&0\\1&1&1&0&0&1&0\\1&0&1&0&0&0&1\end{bmatrix}$$
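The following Python sketch uses the G and H given above (reconstructed here in standard systematic form; the slides' exact matrices may order the rows differently). It shows encoding, the all-zero syndrome of a code word, and how a single error yields a syndrome equal to a column of H:

```python
# Assumed generator and parity-check matrices for the systematic (7,4) Hamming
# code with v = (v0, v1, v2, u0, u1, u2, u3); all arithmetic is modulo 2.
G = [[1,1,0,1,0,0,0],
     [0,1,1,0,1,0,0],
     [1,1,1,0,0,1,0],
     [1,0,1,0,0,0,1]]
H = [[1,0,0,1,0,1,1],
     [0,1,0,1,1,1,0],
     [0,0,1,0,1,1,1]]

def encode(u):
    """v = u . G over GF(2)."""
    return [sum(u[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def syndrome(v):
    """s = v . H^T over GF(2); the all-zero syndrome means v is a code word."""
    return [sum(v[j] * H[i][j] for j in range(7)) % 2 for i in range(3)]

v = encode([1, 0, 1, 1])
print(v, syndrome(v))               # syndrome [0, 0, 0]: v . H^T = 0

v[5] ^= 1                           # flip one bit
print(syndrome(v))                  # nonzero syndrome...
print([H[i][5] for i in range(3)])  # ...equal to column 5 of H
```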
- Code words differ in at least 3 positions. Therefore, if one bit changes due to an error, the resulting vector is still closer to the original code word.
- 3 is called the minimum distance.
- The distance between two n-tuples is the number of places where they differ, e.g., d((1,1,0,1,0,0,0),(0,1,1,0,1,0,0)) = 4 because they differ at the zeroth, second, third, and fourth places.
- The minimum distance of a code, $d_{min}$, is the minimum of the distances over all possible pairs of code words.
- A code with minimum distance $d_{min}$ can correct up to $t_c = \lfloor (d_{min}-1)/2 \rfloor$ errors.

[Figure: two code words c1 and c2 at distance $d_{min}$, each surrounded by a decoding sphere of radius $t_c$]
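A short Python sketch of these definitions (illustrative helper names), reproducing the distance-4 example and the repetition-code case:

```python
from itertools import combinations

def distance(a, b):
    """Hamming distance: the number of positions where two n-tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_distance(code):
    """Minimum of the distances over all pairs of code words."""
    return min(distance(c1, c2) for c1, c2 in combinations(code, 2))

print(distance((1,1,0,1,0,0,0), (0,1,1,0,1,0,0)))  # 4, as in the example above

# For the rate-1/3 repetition code, d_min = 3, so t_c = (3 - 1) // 2 = 1.
rep3 = [(0, 0, 0), (1, 1, 1)]
d = minimum_distance(rep3)
print(d, (d - 1) // 2)  # 3 1
```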