Introduction
Communication network
Elements of NIT
Brief History
First paper: Shannon (1961), "Two-way communication channels"
Significant research activity in the 70s and early 80s, with many new results and techniques, but several key problems remained open
More recently, some progress on old open problems and many new models and problems
Book Organization
Part I. Preliminaries (Chapters 2–3): review of basic information measures, typicality, and Shannon's theorems; introduction of key lemmas
Part II. Single-hop networks (Chapters 4 to 14): Networks with single-round,
one-way communication
Part III. Multihop networks (Chapters 15 to 20): Networks with relaying and
multiple communication rounds
Tutorial Objectives
Focus on an elementary and unified approach to coding schemes
Outline
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
10-minute break
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
10-minute break
9. Relay Channel
10. Multicast Network
El Gamal & Kim (Stanford & UCSD)
Typical Sequences
Empirical pmf (or type) of x^n ∈ X^n:
π(x | x^n) = |{i : x_i = x}| / n  for x ∈ X
Typical set: T_ε^(n)(X) = {x^n : |π(x | x^n) − p(x)| ≤ ε·p(x) for all x ∈ X}
Typical average lemma: if x^n ∈ T_ε^(n)(X), then (1/n) Σ_{i=1}^n g(x_i) ≤ (1 + ε) E(g(X)) for any nonnegative function g
Properties:
P{X^n ∈ T_ε^(n)} → 1 as n → ∞
For each typical x^n: p(x^n) ≐ 2^{−nH(X)}
|T_ε^(n)| ≐ 2^{nH(X)}
(≐ denotes equality to first order in the exponent)
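These definitions are easy to check numerically. A minimal sketch (the helper names `emp_pmf` and `is_typical` are ours, not the tutorial's):

```python
from collections import Counter
from math import log2

def emp_pmf(xs, alphabet):
    """Empirical pmf pi(x | x^n) = |{i : x_i = x}| / n."""
    counts = Counter(xs)
    n = len(xs)
    return {a: counts.get(a, 0) / n for a in alphabet}

def is_typical(xs, p, eps):
    """x^n is eps-typical if |pi(x | x^n) - p(x)| <= eps * p(x) for all x."""
    pi = emp_pmf(xs, p.keys())
    return all(abs(pi[a] - p[a]) <= eps * p[a] for a in p)

p = {0: 0.75, 1: 0.25}          # Bern(1/4) source
x = [0, 0, 0, 1, 0, 0, 1, 0]    # n = 8; empirical pmf is exactly (0.75, 0.25)
print(is_typical(x, p, 0.1))    # True

# For a typical sequence, -(1/n) log2 p(x^n) is close to H(X)
H = -sum(q * log2(q) for q in p.values())
log_px = sum(log2(p[a]) for a in x)      # log2 p(x^n)
print(round(-log_px / len(x), 4), round(H, 4))
```

Here the empirical pmf matches p exactly, so the per-symbol log-probability equals H(X) exactly; for long random sequences it is close with high probability.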
Joint empirical pmf: π(x, y | x^n, y^n) = |{i : (x_i, y_i) = (x, y)}| / n  for (x, y) ∈ X × Y
The jointly typical set T_ε^(n)(X, Y) is defined analogously
Conditional typicality: if x^n ∈ T_ε′^(n)(X) and ε > ε′, then for n sufficiently large,
|T_ε^(n)(Y | x^n)| ≤ 2^{n(H(Y|X)+δ(ε))}
Special case: when Y = g(X) is a deterministic function of X, y^n ∈ T_ε^(n)(Y | x^n) iff y_i = g(x_i), i ∈ [1 : n]
[Figure: nested typical sets — |T_ε^(n)(X, Y)| ≐ 2^{nH(X,Y)}, |T_ε^(n)(Y)| ≐ 2^{nH(Y)}, with conditionally typical sets T_ε^(n)(Y | x^n) and T_ε^(n)(X | y^n)]
[Figure: for each x^n ∈ T_ε^(n)(X), the conditionally typical set T_ε^(n)(Y | x^n) ⊆ T_ε^(n)(Y) has size ≐ 2^{nH(Y|X)}]
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
Point-to-Point Communication
[Figure: M → Encoder → X^n → p(y | x) → Y^n → Decoder → M̂]
Discrete: finite alphabets
Memoryless: when used over n transmissions with message M and input X^n,
p(y_i | x^i, y^{i−1}, m) = p_{Y|X}(y_i | x_i)
[Figure: M → Encoder → X^n → p(y | x) → Y^n → Decoder → M̂]
Cost constraint: Σ_{i=1}^n b(x_i(m)) ≤ nB for every message m
R achievable if there exist (2^{nR}, n) codes that satisfy the cost constraint with lim_{n→∞} P_e^(n) = 0
Capacity–cost function C(B) of the DMC p(y | x) with average cost constraint B on X is the supremum of all achievable rates:
C(B) = max_{p(x): E(b(X)) ≤ B} I(X; Y)
Tutorial, ISIT 2011
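With no cost constraint (b ≡ 0), the formula reduces to C = max_{p(x)} I(X; Y), which a brute-force search can verify on a small channel. A sketch (helper names ours) that recovers the BSC capacity 1 − H2(p):

```python
from math import log2

def mutual_info(px, W):
    """I(X;Y) in bits for input pmf px and channel matrix W[x][y] = p(y|x)."""
    py = [sum(px[x] * W[x][y] for x in range(len(px))) for y in range(len(W[0]))]
    I = 0.0
    for x in range(len(px)):
        for y in range(len(W[0])):
            if px[x] > 0 and W[x][y] > 0:
                I += px[x] * W[x][y] * log2(W[x][y] / py[y])
    return I

# BSC with crossover probability 0.1; its capacity is 1 - H2(0.1)
p = 0.1
W = [[1 - p, p], [p, 1 - p]]
C = max(mutual_info([a, 1 - a], W) for a in [i / 1000 for i in range(1001)])
H2 = -p * log2(p) - (1 - p) * log2(1 - p)
print(round(C, 4), round(1 - H2, 4))  # grid max attained at the uniform input
```

By symmetry the maximizing input is uniform, which the grid contains, so the search hits the capacity exactly.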
Proof of Achievability
We use random coding and joint typicality decoding
Codebook generation: randomly and independently generate 2^{nR} sequences x^n(m) ~ Π_{i=1}^n p_X(x_i), m ∈ [1 : 2^{nR}]
Encoding: to send message m, transmit x^n(m)
Decoding: find the unique m̂ such that (x^n(m̂), y^n) ∈ T_ε^(n)
Analysis of the probability of error (assume M = 1): the decoder errs only if
E1 = {(X^n(1), Y^n) ∉ T_ε^(n)} or E2 = {(X^n(m), Y^n) ∈ T_ε^(n) for some m ≠ 1}, so
P(E) ≤ P(E1) + P(E2)
Point-to-Point Communication
p X (xi )
x T() i=1
(x , y )T() i=1
Elements of NIT
22 / 118
[Figure: packing — the codewords X^n(2), …, X^n(m) are independent of Y^n, so each is jointly typical with Y^n with probability ≈ 2^{−nI(X;Y)}]
Packing Lemma
Let (U, X, Y) ~ p(u, x, y). Let (Ũ^n, Ỹ^n) be arbitrarily distributed, and let X^n(m), m ∈ A with |A| ≤ 2^{nR}, be random sequences, each distributed as Π_{i=1}^n p_{X|U}(x_i | ũ_i) and pairwise conditionally independent of Ỹ^n given Ũ^n
Then there exists δ(ε) → 0 as ε → 0 such that
lim_{n→∞} P{(Ũ^n, X^n(m), Ỹ^n) ∈ T_ε^(n) for some m ∈ A} = 0, if R < I(X; Y | U) − δ(ε)
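The role of the rate condition can be seen from a one-line union-of-events calculation (a sketch; δ(ε) collects the typicality slack):

```latex
\begin{align*}
\Pr\{(\tilde U^n, X^n(m), \tilde Y^n) \in \mathcal{T}_\epsilon^{(n)} \text{ for some } m \in \mathcal{A}\}
&\le \sum_{m \in \mathcal{A}} \Pr\{(\tilde U^n, X^n(m), \tilde Y^n) \in \mathcal{T}_\epsilon^{(n)}\} \\
&\le |\mathcal{A}|\, 2^{-n(I(X;Y|U) - \delta(\epsilon))}
\le 2^{nR}\, 2^{-n(I(X;Y|U) - \delta(\epsilon))},
\end{align*}
```

which tends to 0 as n → ∞ whenever R < I(X; Y | U) − δ(ε).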
Gaussian Channel
Discrete-time additive white Gaussian noise channel: Y = X + Z, where Z ~ N(0, N)
C = max_{F(x): E(X²) ≤ P} I(X; Y) = ½ log(1 + S), where S = P/N is the received SNR
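The formula is simple enough to tabulate directly (function name ours; base-2 logs give bits per transmission):

```python
from math import log2

def awgn_capacity_bits(snr):
    """C = (1/2) log2(1 + S) bits per transmission for the AWGN channel."""
    return 0.5 * log2(1 + snr)

print(awgn_capacity_bits(1))   # 0.5 bits at S = 1 (0 dB)
print(awgn_capacity_bits(15))  # 2.0 bits at S = 15
```

At high SNR the capacity grows by about one bit for every factor-of-4 (≈6 dB) increase in S.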
Proof of Achievability
We extend the proof for the DMC using a discretization procedure (McEliece 1977)
First note that the capacity is attained by X ~ N(0, P), i.e., I(X; Y) = C
Let [X]_j be a finite quantization of X such that E([X]_j²) ≤ E(X²) = P and [X]_j → X in distribution
[Figure: [X]_j → + Z → Y_j → quantizer → [Y_j]_k]
Summary
1. Typical Sequences
2. Point-to-Point Communication
   Random coding
   Packing lemma
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
Multiple Access Channel
[Figure: M1 → Encoder 1 → X1^n and M2 → Encoder 2 → X2^n; (X1^n, X2^n) → p(y | x1, x2) → Y^n → Decoder → (M̂1, M̂2)]
Assume (M1, M2) ~ Unif([1 : 2^{nR1}] × [1 : 2^{nR2}]): x1^n(M1) and x2^n(M2) independent
Average probability of error: P_e^(n) = P{(M̂1, M̂2) ≠ (M1, M2)}
Capacity region of DM-MAC p(y|x1 , x2 ) is the set of rate pairs (R1 , R2 ) such that
R1 ≤ I(X1; Y | X2, Q),
R2 ≤ I(X2; Y | X1, Q),
R1 + R2 ≤ I(X1, X2; Y | Q)
for some pmf p(q)p(x1 | q)p(x2 | q), where Q is an auxiliary (time-sharing) r.v.
Individual capacities:
C1 = max_{p(x1), x2} I(X1; Y | X2 = x2)
C2 = max_{p(x2), x1} I(X2; Y | X1 = x1)
Sum-capacity:
C12 = max_{p(x1)p(x2)} I(X1, X2; Y)
[Figure: pentagonal capacity region in the (R1, R2) plane, with corner points determined by C1, C2, and the sum-rate constraint C12]
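As a concrete instance (our example, not from the slides), consider the noiseless binary adder MAC, Y = X1 + X2 with the real sum, so Y ∈ {0, 1, 2}. Since the channel is deterministic, I(X1, X2; Y) = H(Y), and the sum-capacity C12 = 1.5 bits is attained by uniform inputs:

```python
from math import log2

def H(pmf):
    """Entropy in bits of a pmf given as a list of probabilities."""
    return -sum(p * log2(p) for p in pmf if p > 0)

def sum_rate(a, b):
    """I(X1,X2;Y) = H(Y) for Y = X1 + X2, with P{X1=0}=a, P{X2=0}=b."""
    py = [a * b, a * (1 - b) + (1 - a) * b, (1 - a) * (1 - b)]
    return H(py)

grid = [i / 100 for i in range(101)]
C12 = max(sum_rate(a, b) for a in grid for b in grid)
print(round(C12, 4))  # 1.5, attained at a = b = 0.5
```

Note the individual constraints give R1 ≤ I(X1; Y | X2) = H(X1) = 1 bit, so the sum-rate face R1 + R2 ≤ 1.5 is strictly below R1 + R2 = 2.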
Codebook generation: fix p(q)p(x1 | q)p(x2 | q). Generate a time-sharing sequence q^n ~ Π_{i=1}^n p_Q(q_i). Randomly and conditionally independently generate 2^{nR1} sequences x1^n(m1) ~ Π_{i=1}^n p_{X1|Q}(x1i | qi), m1 ∈ [1 : 2^{nR1}]
Similarly generate 2^{nR2} sequences x2^n(m2) ~ Π_{i=1}^n p_{X2|Q}(x2i | qi), m2 ∈ [1 : 2^{nR2}]
Encoding: to send (m1, m2), transmit x1^n(m1) and x2^n(m2)
Decoding: find the unique message pair (m̂1, m̂2) such that (q^n, x1^n(m̂1), x2^n(m̂2), y^n) ∈ T_ε^(n)
Analysis of the probability of error (assume (M1, M2) = (1, 1)); joint pmf of (Q^n, X1^n(m1), X2^n(m2), Y^n) for each class of message pairs:
m1 = 1, m2 = 1: p(q^n) p(x1^n | q^n) p(x2^n | q^n) p(y^n | x1^n, x2^n, q^n)
m1 ≠ 1, m2 = 1: p(q^n) p(x1^n | q^n) p(x2^n | q^n) p(y^n | x2^n, q^n)
m1 = 1, m2 ≠ 1: p(q^n) p(x1^n | q^n) p(x2^n | q^n) p(y^n | x1^n, q^n)
m1 ≠ 1, m2 ≠ 1: p(q^n) p(x1^n | q^n) p(x2^n | q^n) p(y^n | q^n)
Remark: the codeword pairs (X1^n(m1), X2^n(m2)), m1 ≠ 1, m2 ≠ 1, are not mutually independent, but each pair is pairwise independent of Y^n (given Q^n), so the packing lemma still applies
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
   Simultaneous decoding
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
Broadcast Channel
[Figure: (M1, M2) → Encoder → X^n → p(y1, y2 | x); Y1^n → Decoder 1 → M̂1; Y2^n → Decoder 2 → M̂2]
Superposition coding inner bound: (R1, R2) is achievable if
R2 < I(U; Y2),
R1 + R2 < I(X; Y1)
for some pmf p(u)p(x | u)
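For a binary symmetric broadcast channel these bounds are easy to evaluate (illustrative parameters ours): let X = U ⊕ V with U ~ Bern(1/2), V ~ Bern(β) independent, and Yk = X ⊕ Zk with Zk ~ Bern(pk). Then I(X; Y1) = 1 − H2(p1) and I(U; Y2) = 1 − H2(β ∗ p2), where β ∗ p2 = β(1 − p2) + (1 − β)p2 is binary convolution:

```python
from math import log2

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * log2(p) - (1 - p) * log2(1 - p)

def conv(a, b):
    """Binary convolution a * b = a(1-b) + (1-a)b."""
    return a * (1 - b) + (1 - a) * b

beta, p1, p2 = 0.1, 0.05, 0.2
R2_max = 1 - H2(conv(beta, p2))  # bound on R2 from I(U; Y2)
Rsum_max = 1 - H2(p1)            # bound on R1 + R2 from I(X; Y1)
print(round(R2_max, 3), round(Rsum_max, 3))
```

Sweeping β from 0 to 1/2 trades rate from the weak receiver (Y2) to the strong one (Y1).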
Proof of Achievability
We use superposition coding and simultaneous nonunique decoding
Codebook generation:
Fix p(u)p(x | u)
Randomly and independently generate 2^{nR2} sequences (cloud centers) u^n(m2) ~ Π_{i=1}^n p_U(ui), m2 ∈ [1 : 2^{nR2}]
For each m2, randomly and conditionally independently generate 2^{nR1} satellite codewords x^n(m1, m2) ~ Π_{i=1}^n p_{X|U}(xi | ui(m2)), m1 ∈ [1 : 2^{nR1}]
[Figure: cloud center u^n(m2) surrounded by its satellite codewords x^n(m1, m2)]
Encoding: to send (m1, m2), transmit x^n(m1, m2)
Decoding:
Decoder 2 finds the unique m̂2 such that (u^n(m̂2), y2^n) ∈ T_ε^(n)
Decoder 1 finds the unique m̂1 such that (u^n(m2), x^n(m̂1, m2), y1^n) ∈ T_ε^(n) for some m2 (the value of m2 is not decoded uniquely)
Analysis for decoder 1 (assume (M1, M2) = (1, 1)); joint pmf of (U^n(m2), X^n(m1, m2), Y1^n) for each class:
m1 = 1, m2 = 1: p(u^n, x^n) p(y1^n | x^n)
m1 ≠ 1, m2 = 1: p(u^n, x^n) p(y1^n | u^n)
m1 ≠ 1, m2 ≠ 1: p(u^n, x^n) p(y1^n)
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
   Superposition coding
   Simultaneous nonunique decoding
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
Lossy Source Coding
[Figure: X^n → Encoder → M ∈ [1 : 2^{nR}] → Decoder → X̂^n]
Average per-letter distortion: d(x^n, x̂^n) = (1/n) Σ_{i=1}^n d(xi, x̂i)
[Figure: X^n → Encoder → M → Decoder → X̂^n]
Rate–distortion function:
R(D) = min_{p(x̂|x): E(d(X, X̂)) ≤ D} I(X; X̂)
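For a Bern(1/2) source with Hamming distortion, carrying out this minimization gives the standard closed form R(D) = 1 − H2(D) for 0 ≤ D ≤ 1/2, stated here as a check (function name ours):

```python
from math import log2

def H2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * log2(p) - (1 - p) * log2(1 - p)

def R_bern_half(D):
    """Rate-distortion function of a Bern(1/2) source under Hamming distortion."""
    return max(0.0, 1 - H2(min(D, 0.5)))

print(R_bern_half(0.0))   # 1.0: lossless compression needs H(X) = 1 bit
print(R_bern_half(0.5))   # 0.0: a constant guess already achieves D = 1/2
print(round(R_bern_half(0.11), 3))
```

The curve interpolates smoothly between the lossless rate H(X) at D = 0 and zero rate at D = 1/2.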
Proof of Achievability
We use random coding and joint typicality encoding
Codebook generation: fix p(x̂ | x) that attains R(D). Randomly and independently generate 2^{nR} sequences x̂^n(m) ~ Π_{i=1}^n p_X̂(x̂i), m ∈ [1 : 2^{nR}]
Encoding: find an m such that (x^n, x̂^n(m)) ∈ T_ε^(n) and send m
Decoding: output the reconstruction sequence x̂^n(m)
[Figure: covering — the 2^{nR} reconstruction sequences X̂^n(m) must cover the typical set of X^n]
Covering Lemma
Let (U, X, X̂) ~ p(u, x, x̂) and ε′ < ε. Let (U^n, X^n) be a pair with P{(U^n, X^n) ∈ T_ε′^(n)} → 1, and let X̂^n(m), m ∈ A with |A| ≥ 2^{nR}, be random sequences, conditionally independent of each other and of X^n given U^n, each distributed as Π_{i=1}^n p_{X̂|U}(x̂i | ui)
Then there exists δ(ε) → 0 as ε → 0 such that
lim_{n→∞} P{(U^n, X^n, X̂^n(m)) ∉ T_ε^(n) for all m ∈ A} = 0, if R > I(X; X̂ | U) + δ(ε)
Now, by the law of total expectation and the typical average lemma,
E(d(X^n, X̂^n)) = P(E) E(d(X^n, X̂^n) | E) + P(E^c) E(d(X^n, X̂^n) | E^c) ≤ P(E) d_max + (1 + ε) E(d(X, X̂))
Since P(E) → 0, the expected distortion is asymptotically at most (1 + ε) D
Lossless compression as a special case (Hamming distortion, D = 0):
R ≥ R(0): (1/n) Σ_{i=1}^n P{X̂i ≠ Xi} ≤ P{X̂^n ≠ X^n}, so lim_{n→∞} P{X̂^n ≠ X^n} = 0 forces the symbol error rate, and hence the distortion, to zero
R ≤ R(0): take X̂ = X; then P(E) = P{(X^n, X̂^n) ∉ T_ε^(n)} → 0 as n → ∞ if R > I(X; X̂) + δ(ε) = R(0) + δ(ε)
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
   Covering lemma
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
Wyner–Ziv Coding
[Figure: X^n → Encoder → M → Decoder → X̂^n, with side information Y^n available at the decoder]
A (2^{nR}, n) lossy source code with side information available at the decoder:
Encoder: m(x^n)
Decoder: x̂^n(m, y^n)
Wyner–Ziv rate–distortion function: R_SI-D(D) = min (I(U; X) − I(U; Y)),
where the minimum is over all p(u | x) and functions x̂(u, y) such that E(d(X, X̂)) ≤ D
Proof of Achievability
We use binning in addition to joint typicality encoding and decoding
[Figure: 2^{nR̃} codewords u^n(l) partitioned into 2^{nR} bins B(1), …, B(2^{nR}); the decoder searches the received bin for the unique codeword jointly typical with y^n]
Codebook generation: fix p(u | x) and x̂(u, y) that attain R_SI-D(D). Randomly and independently generate 2^{nR̃} sequences u^n(l) ~ Π_{i=1}^n p_U(ui), l ∈ [1 : 2^{nR̃}], and partition them into 2^{nR} equal-size bins B(m)
Encoding: find an l such that (u^n(l), x^n) ∈ T_ε′^(n) and send the index m of the bin containing l
Decoding: upon receiving m, find the unique l̂ ∈ B(m) such that (u^n(l̂), y^n) ∈ T_ε^(n), where ε > ε′
Compute the reconstruction sequence as x̂i = x̂(ui(l̂), yi), i ∈ [1 : n]
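The binning step can be illustrated with a deliberately tiny toy (all numbers invented, no typicality involved): 8 "codewords" are hashed into 4 bins; the encoder sends only the 2-bit bin index, and the decoder uses its side information to pick the right codeword inside the bin:

```python
# Toy Wyner-Ziv-style binning: codewords are the integers 0..7,
# bin(l) = l mod 4, and side information y is a noisy version of x.
codewords = list(range(8))
n_bins = 4

def encode(x):
    """Pick the codeword closest to the source, send its bin index only."""
    l = min(codewords, key=lambda u: abs(u - x))
    return l % n_bins, l

def decode(bin_idx, y):
    """Within the bin, pick the codeword closest to the side information."""
    bin_members = [u for u in codewords if u % n_bins == bin_idx]
    return min(bin_members, key=lambda u: abs(u - y))

x, y = 5, 6           # correlated source and side information
m, l = encode(x)      # only 2 bits sent instead of 3
u_hat = decode(m, y)
print(m, l, u_hat)    # 1 5 5: the decoder recovers the chosen codeword
```

Bin 1 contains {1, 5}; since y = 6 is much closer to 5 than to 1, the decoder disambiguates correctly, exactly the role the jointly typical set plays in the real scheme.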
Analysis of the probability of error: let L denote the index chosen by the encoder and consider
E1 = {(U^n(l), X^n) ∉ T_ε′^(n) for all l ∈ [1 : 2^{nR̃}]}
E2 = {(U^n(L), X^n, Y^n) ∉ T_ε^(n)}
E3 = {(U^n(l̃), Y^n) ∈ T_ε^(n) for some l̃ ∈ B(M), l̃ ≠ L}
By the covering lemma, P(E1) → 0 as n → ∞ if R̃ > I(U; X) + δ(ε′)
By the conditional typicality lemma, P(E1^c ∩ E2) → 0 as n → ∞
By the packing lemma, P(E3) → 0 as n → ∞ if R̃ − R < I(U; Y) − δ(ε)
Combining the bounds, any R > I(U; X) − I(U; Y) + δ(ε) + δ(ε′) is achievable
Special case (lossless, Hamming distortion): R_SI-D(0) = H(X | Y)
R ≥ R_SI-D(0) since (1/n) Σ_{i=1}^n P{X̂i ≠ Xi} ≤ P{X̂^n ≠ X^n}
R ≤ R_SI-D(0) by Wyner–Ziv coding with X̂ = U = X
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
   Binning
   Application of conditional typicality lemma
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
Gelfand–Pinsker Coding
[Figure: state S^n ~ Π p(si) available at the encoder; M → Encoder → X^n → p(y | x, s) → Y^n → Decoder → M̂]
A (2^{nR}, n) code for the DMC with state information available at the encoder: encoder x^n(m, s^n) and decoder m̂(y^n)
[Figure: S^n ~ p(s) at the encoder; M → Encoder → X^n → p(y | x, s) → Y^n → Decoder → M̂]
Gelfand–Pinsker capacity:
C = max_{p(u|s), x(u,s): E(b(X)) ≤ B} (I(U; Y) − I(U; S))
[Figure: multicoding — 2^{nR̃} codewords u^n(l) partitioned into subcodebooks C(1), …, C(2^{nR}); the encoder searches C(m) for a codeword jointly typical with s^n]
Codebook generation: fix p(u | s) and x(u, s) that attain capacity. For each m ∈ [1 : 2^{nR}], generate a subcodebook C(m) consisting of 2^{n(R̃−R)} randomly and independently generated sequences u^n(l) ~ Π_{i=1}^n p_U(ui), l ∈ [(m − 1)2^{n(R̃−R)} + 1 : m2^{n(R̃−R)}]
Encoding: to send m ∈ [1 : 2^{nR}] given s^n, find u^n(l) ∈ C(m) such that (u^n(l), s^n) ∈ T_ε′^(n), then transmit xi = x(ui(l), si), i ∈ [1 : n]
Decoding: find the unique m̂ such that (u^n(l), y^n) ∈ T_ε^(n) for some u^n(l) ∈ C(m̂)
Analysis of the probability of error (assume M = 1 and let L denote the index chosen by the encoder):
E1 = {(U^n(l), S^n) ∉ T_ε′^(n) for all U^n(l) ∈ C(1)}
E2 = {(U^n(L), Y^n) ∉ T_ε^(n)}
E3 = {(U^n(l̃), Y^n) ∈ T_ε^(n) for some U^n(l̃) ∉ C(1)}
By the covering lemma, P(E1) → 0 as n → ∞ if R̃ − R > I(U; S) + δ(ε′)
By the conditional typicality lemma, P(E1^c ∩ E2) → 0 as n → ∞
By the packing lemma, P(E3) → 0 as n → ∞ if R̃ < I(U; Y) − δ(ε)
Binning vs. multicoding:
Binning (min): the rate is a covering rate minus a packing rate, as in Wyner–Ziv coding (R = I(U; X) − I(U; Y))
Multicoding (max): the rate is a packing rate minus a covering rate, as in Gelfand–Pinsker coding (R = I(U; Y) − I(U; S))
Writing on dirty paper (Costa 1983):
[Figure: M and S^n at the Encoder; Y^n = X^n + S^n + Z^n → Decoder → M̂]
Noise Z ~ N(0, N); state S ~ N(0, Q), independent of Z
Assume expected average power constraint: Σ_{i=1}^n E(x_i²(m, S^n)) ≤ nP for every m
Without state information: C = ½ log(1 + P/(N + Q))
With state information at the encoder: C_SI-ED = ½ log(1 + P/N) = C_SI-D, as if the state were absent
Proof of Achievability
Proof involves a clever choice of F(u | s), x(u, s) and a discretization procedure
Let X ~ N(0, P) independent of S, and U = X + αS, where α = P/(P + N). Then
I(U; Y) − I(U; S) = ½ log(1 + P/N)
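Costa's choice can be verified numerically using the Gaussian identity I(A; B) = −½ log2(1 − ρ²_{AB}); the concrete values P = Q = N = 1 and the helper name are ours:

```python
from math import log2

def gauss_mi(var_a, var_b, cov_ab):
    """I(A;B) in bits for jointly Gaussian (A,B)."""
    rho2 = cov_ab ** 2 / (var_a * var_b)
    return -0.5 * log2(1 - rho2)

P, Q, N = 1.0, 1.0, 1.0
alpha = P / (P + N)
# U = X + alpha*S, Y = X + S + Z, with X, S, Z independent Gaussians
var_U = P + alpha ** 2 * Q
var_Y = P + Q + N
cov_UY = P + alpha * Q
cov_US = alpha * Q
gap = gauss_mi(var_U, var_Y, cov_UY) - gauss_mi(var_U, Q, cov_US)
print(round(gap, 6), round(0.5 * log2(1 + P / N), 6))  # both equal 0.5
```

The state penalty I(U; S) is exactly cancelled by the extra correlation U carries with Y, leaving the state-free capacity ½ log2(1 + P/N).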
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
   Multicoding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
Wiretap Channel
[Figure: M → Encoder → X^n → p(y, z | x); Y^n → Decoder → M̂; Z^n → Eavesdropper]
Average probability of error: Pe^(n) = P{M̂ ≠ M}
(R, RL) achievable if there exist (2^{nR}, n) codes with lim_{n→∞} Pe^(n) = 0 and lim sup_{n→∞} RL^(n) ≤ RL, where RL^(n) = (1/n) I(M; Z^n) is the leakage rate
Rate–leakage region R: closure of the set of achievable (R, RL)
Secrecy capacity: CS = max{R : (R, 0) ∈ R}
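A standard concrete instance (parameters ours): a degraded binary symmetric wiretap channel in which the legitimate receiver sees a BSC(p1) and the eavesdropper a noisier BSC(p2), p2 > p1. With uniform X, the achievable secrecy rate I(X; Y) − I(X; Z) evaluates to H2(p2) − H2(p1):

```python
from math import log2

def H2(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p1, p2 = 0.1, 0.2   # eavesdropper's channel is strictly noisier (degraded)
Cs = (1 - H2(p1)) - (1 - H2(p2))   # I(X;Y) - I(X;Z) with uniform X
print(round(Cs, 4))
```

The rate is positive exactly because the eavesdropper's channel is worse; if p2 = p1 the difference, and the secrecy capacity, collapse to zero.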
Proof of Achievability
We use multicoding and two-step randomized encoding
Codebook generation:
Assume CS > 0 and fix p(u, x) that attains it (so I(U; Y) − I(U; Z) > 0)
For each m ∈ [1 : 2^{nR}], generate a subcodebook C(m) consisting of 2^{n(R̃−R)} randomly and independently generated sequences u^n(l) ~ Π_{i=1}^n p_U(ui), l ∈ [(m − 1)2^{n(R̃−R)} + 1 : m2^{n(R̃−R)}]
[Figure: subcodebooks C(1), C(2), …, C(2^{nR}), each containing 2^{n(R̃−R)} codewords]
Encoding: to send m, choose an index L uniformly at random from C(m) and transmit x^n generated according to u^n(L)
Decoding: find the unique m̂ such that (u^n(l), y^n) ∈ T_ε^(n) for some u^n(l) ∈ C(m̂)
[Figure: each subcodebook contains 2^{n(R̃−R)} codewords; the eavesdropper cannot distinguish between subcodebooks]
If R̃ − R > I(U; Z), the eavesdropper has roughly the same number of jointly typical sequences in each subcodebook, providing it with no information about the message
Let M be the message sent and L be the randomly selected index
Equivocation calculation (sketch): averaged over codebooks C,
H(L | C) = nR̃, so H(L | Z^n, C) ≥ nR̃ − I(U^n; Z^n) ≥ nR̃ − nI(U; Z),
where the last step follows by the memorylessness of the channel
Lemma (bound on equivocation): if R̃ − R ≥ I(U; Z), then (1/n) H(L | M, Z^n, C) ≤ R̃ − R − I(U; Z) + δ(ε) for n sufficiently large
Substituting (recall that R̃ < I(U; Y) − δ(ε) for decoding), we have shown that
lim sup_{n→∞} (1/n) I(M; Z^n | C) ≤ δ(ε)
Thus, there must exist a sequence of (2^{nR}, n) codes such that Pe^(n) → 0 and RL^(n) ≤ δ(ε) as n → ∞
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
   Randomized encoding
   Bound on equivocation (list size)
9. Relay Channel
10. Multicast Network
Relay Channel
[Figure: M → Encoder → X1^n → p(y2, y3 | x1, x2) → Y3^n → Decoder → M̂; the relay observes Y2^n and transmits X2^n through a Relay encoder]
Multihop
[Figure: M → X1 → relay (Y2 : X2) → Y3 → M̂]
Tight for a cascade of two DMCs, i.e., p(y2, y3 | x1, x2) = p(y2 | x1)p(y3 | x2):
C = min(max_{p(x2)} I(X2; Y3), max_{p(x1)} I(X1; Y2))
The scheme uses block Markov coding, where codewords in a block can depend on the message sent in the previous block
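The cascade result has a simple numerical instance (parameters ours): two BSCs in tandem, where each hop's capacity is 1 − H2(p) and the end-to-end rate is set by the bottleneck link:

```python
from math import log2

def H2(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p12, p23 = 0.05, 0.15   # sender->relay and relay->receiver crossover probs
C_hop1 = 1 - H2(p12)    # max I(X1; Y2)
C_hop2 = 1 - H2(p23)    # max I(X2; Y3)
C = min(C_hop1, C_hop2) # the bottleneck hop determines the multihop rate
print(round(C, 4))
```

Here the second hop is noisier, so it is the bottleneck; improving the first hop alone would not raise the rate.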
Proof of Achievability
Send b − 1 messages m1, …, m_{b−1} in b blocks, using independently generated codebooks (throughout, m̂ denotes the relay's estimate and m̌ the receiver's)
Codebook generation: for each block j ∈ [1 : b], randomly and independently generate 2^{nR} sequences x1^n(mj) ~ Π_{i=1}^n p_{X1}(x1i) and 2^{nR} sequences x2^n(m_{j−1}) ~ Π_{i=1}^n p_{X2}(x2i)
Encoding: in block j, the sender transmits x1^n(mj)
Relay encoding: at the end of block j, the relay decodes m̂j and transmits x2^n(m̂j) in block j + 1
Decoding: at the end of block j + 1, the receiver decodes m̌j from y3^n(j + 1)
Assume Mj = 1. Since {M̌j ≠ 1} ⊆ {M̂j ≠ 1} ∪ {M̌j ≠ M̂j}, the decoder makes an error only if one of the following events occurs:
E1(j) = {(X1^n(1), X2^n(M̂_{j−1}), Y2^n(j)) ∉ T_ε^(n)}
E2(j) = {(X1^n(mj), X2^n(M̂_{j−1}), Y2^n(j)) ∈ T_ε^(n) for some mj ≠ 1}
Ẽ1(j) = {(X2^n(M̂j), Y3^n(j + 1)) ∉ T_ε^(n)}
Ẽ2(j) = {(X2^n(mj), Y3^n(j + 1)) ∈ T_ε^(n) for some mj ≠ M̂j}
By the independence of the codebooks, M̂_{j−1}, which is a function of Y2^n(j − 1) and codebook C_{j−1}, is independent of the codewords X1^n(1), X2^n(M̂_{j−1}) in C_j
Thus, by the LLN, P(E1(j)) → 0 as n → ∞
By the packing lemma, P(E2(j)) → 0 as n → ∞ if R < I(X1; Y2 | X2) − δ(ε)
By the independence of the codebooks and the LLN, P(Ẽ1(j)) → 0 as n → ∞
By the packing lemma, P(Ẽ2(j)) → 0 as n → ∞ if R < I(X2; Y3) − δ(ε)
Coherent Multihop
[Figure: X1 → relay (Y2 : X2) → Y3]
Hence, the multihop coding scheme can be improved via coherent cooperation between the sender and the relay:
C ≥ max_{p(x1, x2)} min(I(X2; Y3), I(X1; Y2 | X2))
Proof of Achievability
We again use a block Markov coding scheme
Codebook generation: for each block j, randomly and independently generate 2^{nR} sequences x2^n(m_{j−1}) ~ Π_{i=1}^n p_{X2}(x2i), and for each m_{j−1}, conditionally generate 2^{nR} sequences x1^n(mj | m_{j−1}) ~ Π_{i=1}^n p_{X1|X2}(x1i | x2i(m_{j−1}))
[Block Markov coding table:
Block:  1             2              3              …  b − 1                     b
X1:     x1^n(m1 | 1)  x1^n(m2 | m1)  x1^n(m3 | m2)  …  x1^n(m_{b−1} | m_{b−2})  x1^n(1 | m_{b−1})
Y2:     m̂1           m̂2            m̂3            …  m̂_{b−1}                 —
X2:     x2^n(1)       x2^n(m̂1)      x2^n(m̂2)      …  x2^n(m̂_{b−2})           x2^n(m̂_{b−1})
Y3:     —             m̌1            m̌2            …  m̌_{b−2}                 m̌_{b−1}]
Encoding: in block j, the sender transmits x1^n(mj | m_{j−1})
Relay encoding: at the end of block j, find the unique m̂j such that (x1^n(m̂j | m̂_{j−1}), x2^n(m̂_{j−1}), y2^n(j)) ∈ T_ε^(n); in block j + 1, transmit x2^n(m̂j) from codebook C_{j+1}
Decoding: at the end of block j + 1, the receiver finds the unique m̌j such that (x2^n(m̌j), y3^n(j + 1)) ∈ T_ε^(n)
Error events at the receiver:
Ẽ1(j) = {(X2^n(M̂j), Y3^n(j + 1)) ∉ T_ε^(n)}
Ẽ2(j) = {(X2^n(mj), Y3^n(j + 1)) ∈ T_ε^(n) for some mj ≠ M̂j}
Error events at the relay: let Ê(j) = {M̂j ≠ 1} and consider
E1(j) = {(X1^n(1 | M̂_{j−1}), X2^n(M̂_{j−1}), Y2^n(j)) ∉ T_ε^(n)}
E2(j) = {(X1^n(mj | M̂_{j−1}), X2^n(M̂_{j−1}), Y2^n(j)) ∈ T_ε^(n) for some mj ≠ 1}
Then
P(Ê(j)) ≤ P(Ê(j − 1)) + P(E1(j) ∩ Êc(j − 1)) + P(E2(j))
P(E1(j) ∩ Êc(j − 1)) ≤ P{(X1^n(1 | 1), X2^n(1), Y2^n(j)) ∉ T_ε^(n) | M̂_{j−1} = 1}, which tends to 0 by the LLN
Decode–Forward
[Figure: X1 → relay (Y2 : X2) → Y3, now also using the direct link X1 → Y3]
Decode–forward lower bound (Cover–El Gamal 1979):
C ≥ max_{p(x1, x2)} min(I(X1, X2; Y3), I(X1; Y2 | X2))
[Block Markov coding table as in coherent multihop: the sender transmits x1^n(mj | m_{j−1}), and the relay decodes m̂j and transmits x2^n(m̂j) in block j + 1]
Decoding:
Decoding at the receiver is done backwards, after all b blocks are received
For j = b − 1, …, 1, the receiver finds the unique message m̌j such that (x1^n(m̌_{j+1} | m̌j), x2^n(m̌j), y3^n(j + 1)) ∈ T_ε^(n), successively, with the initial condition m̌b = 1
Analysis of the probability of error: let Ě(j) = {M̌j ≠ 1} and Ě(j + 1) = {M̌_{j+1} ≠ 1}, and consider
E1(j) = {(X1^n(M̌_{j+1} | M̂j), X2^n(M̂j), Y3^n(j + 1)) ∉ T_ε^(n)}
Then
P(E1(j) ∩ Ěc(j + 1) ∩ Êc(j)) = P({(X1^n(1 | 1), X2^n(1), Y3^n(j + 1)) ∉ T_ε^(n)}, M̌_{j+1} = 1, M̂j = 1)
≤ P({(X1^n(1 | 1), X2^n(1), Y3^n(j + 1)) ∉ T_ε^(n)} | M̌_{j+1} = 1, M̂j = 1) → 0 as n → ∞
by the independence of the codebooks and the LLN
Compress–Forward
[Figure: X1 → relay (Y2 : X2) → Y3]
In decode–forward, the relay must decode the entire message
If the channel from the sender to the relay is worse than the direct channel to the receiver, this requirement can reduce the rate below that of direct transmission (the relay is better left unused)
In the compress–forward coding scheme, the relay instead helps communication by sending a description of its received sequence to the receiver
C ≥ max min(I(X1; Ŷ2, Y3 | X2), I(X1, X2; Y3) − I(Y2; Ŷ2 | X1, X2, Y3)), where the maximum is over all p(x1)p(x2)p(ŷ2 | y2, x2)
Proof of Achievability
We use block Markov coding, joint typicality encoding, binning, and simultaneous nonunique decoding
[Figure: X1 → relay (Y2 : X2) → Y3, with description Ŷ2 of Y2]
At the end of block j, the relay chooses a reconstruction sequence ŷ2^n(j) of the received sequence y2^n(j)
Since the receiver has side information y3^n(j), we use binning to reduce the rate
The bin index is sent to the receiver in block j + 1 via x2^n(j + 1)
At the end of block j + 1, the receiver recovers the bin index, and then mj and the compression index simultaneously
Codebook generation:
Fix p(x1)p(x2)p(ŷ2 | y2, x2) that attains the lower bound
For j ∈ [1 : b], randomly and independently generate 2^{nR} sequences x1^n(mj) ~ Π_{i=1}^n p_{X1}(x1i), mj ∈ [1 : 2^{nR}], and 2^{nR2} sequences x2^n(l_{j−1}) ~ Π_{i=1}^n p_{X2}(x2i), l_{j−1} ∈ [1 : 2^{nR2}]
For each l_{j−1}, generate 2^{nR̃2} sequences ŷ2^n(kj | l_{j−1}) ~ Π_{i=1}^n p_{Ŷ2|X2}(ŷ2i | x2i(l_{j−1})), kj ∈ [1 : 2^{nR̃2}], and partition them into 2^{nR2} equal-size bins B(lj)
[Block coding table: in block j, the sender transmits x1^n(mj) and the relay transmits x2^n(l_{j−1}); the receiver recovers (l̂j, k̂j, m̌j) in each block]
Encoding: in block j, the sender transmits x1^n(mj)
Relay encoding: at the end of block j, find an index kj such that (y2^n(j), ŷ2^n(kj | l_{j−1}), x2^n(l_{j−1})) ∈ T_ε′^(n); in block j + 1, transmit x2^n(lj), where lj is the index of the bin containing kj
Decoding:
At the end of block j + 1, find the unique l̂j such that (x2^n(l̂j), y3^n(j + 1)) ∈ T_ε^(n)
Then find the unique m̌j such that (x1^n(m̌j), x2^n(l̂_{j−1}), ŷ2^n(k̂j | l̂_{j−1}), y3^n(j)) ∈ T_ε^(n) for some k̂j ∈ B(l̂j)
Analysis of the probability of error:
Ê(j) = {(X2^n(L_{j−1}), Ŷ2^n(kj | L_{j−1}), Y2^n(j)) ∉ T_ε′^(n) for all kj ∈ [1 : 2^{nR̃2}]}
E1(j − 1) = {L̂_{j−1} ≠ L_{j−1}},  E1(j) = {L̂j ≠ Lj}
By the covering lemma, P(Ê(j)) → 0 as n → ∞ if R̃2 > I(Ŷ2; Y2 | X2) + δ(ε′)
By the packing lemma, P(E1(j)) → 0 as n → ∞ if R2 < I(X2; Y3) − δ(ε)
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
   Coherent cooperation
   Decode–forward
   Backward decoding
   Compress–forward
10. Multicast Network
Multicast Network
[Figure: N-node network p(y1, …, yN | x1, …, xN); source node 1 sends M, and each destination node k ∈ D outputs an estimate M̂k]
Average probability of error: Pe^(n) = P{M̂k ≠ M for some k ∈ D}
Network Decode–Forward
Decode–forward for the relay channel can be extended to the multicast network
[Figure: node 1 → relays (Y2 : X2), (Y3 : X3) → Y4; the relays decode and forward the messages Mj, M_{j−1}, M_{j−2} in a pipeline]
Can be improved by removing some relay nodes and relabeling the nodes
Proof of Achievability
We use block Markov coding and sliding window decoding (Carleial 1982)
We illustrate this scheme for the DM-RC
Codebook generation, encoding, and relay encoding: same as before
[Block Markov coding table: the sender transmits x1^n(mj | m_{j−1}), the relay decodes m̂j and transmits x2^n(m̂j) in block j + 1, and the receiver recovers m̌j]
Decoding: at the end of block j + 1, find the unique m̌j such that (x1^n(m̌j | m̌_{j−1}), x2^n(m̌_{j−1}), y3^n(j)) ∈ T_ε^(n) and (x2^n(m̌j), y3^n(j + 1)) ∈ T_ε^(n) simultaneously
Analysis of the probability of error: let Ě(j − 1) = {M̌_{j−1} ≠ 1} and consider
E1(j) = {(X1^n(M̌j | M̌_{j−1}), X2^n(M̌_{j−1}), Y3^n(j)) ∉ T_ε^(n) or (X2^n(M̌j), Y3^n(j + 1)) ∉ T_ε^(n)}
E2(j) = {(X1^n(mj | M̌_{j−1}), X2^n(M̌_{j−1}), Y3^n(j)) ∈ T_ε^(n) and (X2^n(mj), Y3^n(j + 1)) ∈ T_ε^(n) for some mj ≠ Mj}
Then
P(Ě(j)) ≤ P(Ê(j)) + P(Ê(j − 1)) + P(Ě(j − 1)) + P(E1(j) ∩ Êc(j) ∩ Êc(j − 1) ∩ Ěc(j − 1)) + P(E2(j) ∩ Êc(j))
P(E2(j) ∩ Êc(j)) ≤ Σ_{mj ≠ 1} P({(X1^n(mj | M̌_{j−1}), X2^n(M̌_{j−1}), Y3^n(j)) ∈ T_ε^(n)}, {(X2^n(mj), Y3^n(j + 1)) ∈ T_ε^(n)}, M̂j = 1)
=(a) Σ_{mj ≠ 1} P((X1^n(mj | M̌_{j−1}), X2^n(M̌_{j−1}), Y3^n(j)) ∈ T_ε^(n) | M̂j = 1) · P((X2^n(mj), Y3^n(j + 1)) ∈ T_ε^(n) | M̂j = 1)
≤(b) 2^{nR} · 2^{−n(I(X1; Y3 | X2) − δ(ε))} · 2^{−n(I(X2; Y3) − δ(ε))}
(a) the two events are conditionally independent given M̂j = 1, for each mj ≠ 1
(b) independence of the codebooks and the joint typicality lemma
Hence P(E2(j) ∩ Êc(j)) → 0 as n → ∞ if R < I(X1; Y3 | X2) + I(X2; Y3) − 2δ(ε) = I(X1, X2; Y3) − 2δ(ε)
Noisy network coding lower bound:
C ≥ max min_{k ∈ D} min_{S: 1 ∈ S, k ∉ S} (I(X(S); Ŷ(S^c), Yk | X(S^c)) − I(Y(S); Ŷ(S) | X^N, Ŷ(S^c), Yk)),
where the maximum is over all Π_{k=1}^N p(xk)p(ŷk | yk, xk), Ŷ1 = ∅ by convention, X(S) denotes the inputs in S, and Ŷ(S^c) denotes the compressed outputs in S^c
Special cases include the network coding theorem for noiseless networks and compress–forward for the relay channel
Can be extended to Gaussian networks (giving the best known gap result) and to multiple messages (Lim–Kim–El Gamal–Chung 2011)
Proof of Achievability
We use several new ideas beyond compress–forward for the DM-RC:
The source node sends the same message m ∈ [1 : 2^{nbR}] over b blocks
Relay node j sends the index of the compressed version Ŷj^n of Yj^n, without binning
Each receiver node performs simultaneous nonunique decoding of the message and compression indices from all b blocks
Codebook generation (illustrated for the DM-RC): fix p(x1)p(x2)p(ŷ2 | y2, x2) that attains the lower bound
For each j ∈ [1 : b], randomly and independently generate 2^{nbR} sequences x1^n(j, m) ~ Π_{i=1}^n p_{X1}(x1i), m ∈ [1 : 2^{nbR}]
Randomly and independently generate 2^{nR2} sequences x2^n(l_{j−1}) ~ Π_{i=1}^n p_{X2}(x2i), l_{j−1} ∈ [1 : 2^{nR2}], and for each l_{j−1}, 2^{nR2} sequences ŷ2^n(lj | l_{j−1}) ~ Π_{i=1}^n p_{Ŷ2|X2}(ŷ2i | x2i(l_{j−1})), lj ∈ [1 : 2^{nR2}]
[Block coding table: X1 sends x1^n(j, m) in every block j; the relay sends x2^n(l_{j−1})]
Encoding: in block j, the sender transmits x1^n(j, m)
Relay encoding: at the end of block j, find an index lj such that (y2^n(j), ŷ2^n(lj | l_{j−1}), x2^n(l_{j−1})) ∈ T_ε′^(n); in block j + 1, transmit x2^n(lj)
Decoding: at the end of block b, find the unique m̌ such that (x1^n(j, m̌), x2^n(l_{j−1}), ŷ2^n(lj | l_{j−1}), y3^n(j)) ∈ T_ε^(n) for all j ∈ [1 : b], for some l1, l2, …, lb
The decoder makes an error only if one or more of the following events occur:
E1 = {(Y2^n(j), Ŷ2^n(lj | 1), X2^n(1)) ∉ T_ε′^(n) for all lj, for some j ∈ [1 : b]}
E2 = {(X1^n(j, 1), X2^n(1), Ŷ2^n(1 | 1), Y3^n(j)) ∉ T_ε^(n) for some j ∈ [1 : b]}
E3 = {(X1^n(j, m), X2^n(l_{j−1}), Ŷ2^n(lj | l_{j−1}), Y3^n(j)) ∈ T_ε^(n) for all j, for some l^b and m ≠ 1}
Thus, the probability of error is upper bounded as
P(E) ≤ P(E1) + P(E2 ∩ E1^c) + P(E3)
By the covering lemma and the union of events bound (over the b blocks), P(E1) → 0 as n → ∞ if R2 > I(Ŷ2; Y2 | X2) + δ(ε′)
By the conditional typicality lemma and the union of events bound, P(E2 ∩ E1^c) → 0 as n → ∞
For P(E3), write Ej(m, l_{j−1}, lj) = {(X1^n(j, m), X2^n(l_{j−1}), Ŷ2^n(lj | l_{j−1}), Y3^n(j)) ∈ T_ε^(n)}; then
P(E3) = P(⋃_{m ≠ 1} ⋃_{l^b} ⋂_{j=1}^b Ej(m, l_{j−1}, lj)) ≤ Σ_{m ≠ 1} Σ_{l^b} Π_{j=1}^b P(Ej(m, l_{j−1}, lj))
by the union bound and the independence of the codebooks across blocks
Each factor is bounded via the joint typicality lemma; in particular, factors with the correct compression indices satisfy P(Ej) ≤ 2^{−n(I(X1; Ŷ2, Y3 | X2) − δ(ε))}
Counting the terms and letting b → ∞ yields the constraints of the noisy network coding lower bound
Summary
1. Typical Sequences
2. Point-to-Point Communication
3. Multiple Access Channel
4. Broadcast Channel
5. Lossy Source Coding
6. Wyner–Ziv Coding
7. Gelfand–Pinsker Coding
8. Wiretap Channel
9. Relay Channel
10. Multicast Network
   Network decode–forward
   Sliding window decoding
   Noisy network coding
   Sending the same message multiple times using independent codebooks
Conclusion
Presented a unified approach to achievability proofs for DM networks: random coding, joint typicality encoding and decoding, the packing and covering lemmas, superposition coding, binning, multicoding, and block Markov coding
Although the theory is far from complete, we hope that our approach will prove useful in tackling new models and problems
References
Ahlswede, R. (1971). Multi-way communication channels. In Proc. 2nd Int. Symp. Inf. Theory, Tsahkadsor, Armenian SSR, pp. 23–52.
Ahlswede, R., Cai, N., Li, S.-Y. R., and Yeung, R. W. (2000). Network information flow. IEEE Trans. Inf. Theory, 46(4), 1204–1216.
Avestimehr, A. S., Diggavi, S. N., and Tse, D. N. C. (2011). Wireless network information flow: A deterministic approach. IEEE Trans. Inf. Theory, 57(4), 1872–1905.
Bergmans, P. P. (1973). Random coding theorem for broadcast channels with degraded components. IEEE Trans. Inf. Theory, 19(2), 197–207.
Carleial, A. B. (1982). Multiple-access channels with different generalized feedback signals. IEEE Trans. Inf. Theory, 28(6), 841–850.
Costa, M. H. M. (1983). Writing on dirty paper. IEEE Trans. Inf. Theory, 29(3), 439–441.
Cover, T. M. (1972). Broadcast channels. IEEE Trans. Inf. Theory, 18(1), 2–14.
Cover, T. M. and El Gamal, A. (1979). Capacity theorems for the relay channel. IEEE Trans. Inf. Theory, 25(5), 572–584.
Csiszár, I. and Körner, J. (1978). Broadcast channels with confidential messages. IEEE Trans. Inf. Theory, 24(3), 339–348.
Dana, A. F., Gowaikar, R., Palanki, R., Hassibi, B., and Effros, M. (2006). Capacity of wireless erasure networks. IEEE Trans. Inf. Theory, 52(3), 789–804.
References (cont.)
El Gamal, A., Mohseni, M., and Zahedi, S. (2006). Bounds on capacity and minimum energy-per-bit for AWGN relay channels. IEEE Trans. Inf. Theory, 52(4), 1545–1561.
Elias, P., Feinstein, A., and Shannon, C. E. (1956). A note on the maximum flow through a network. IRE Trans. Inf. Theory, 2(4), 117–119.
Ford, L. R., Jr. and Fulkerson, D. R. (1956). Maximal flow through a network. Canad. J. Math., 8(3), 399–404.
Gelfand, S. I. and Pinsker, M. S. (1980). Coding for channel with random parameters. Probl. Control Inf. Theory, 9(1), 19–31.
Han, T. S. and Kobayashi, K. (1981). A new achievable rate region for the interference channel. IEEE Trans. Inf. Theory, 27(1), 49–60.
Heegard, C. and El Gamal, A. (1983). On the capacity of computer memories with defects. IEEE Trans. Inf. Theory, 29(5), 731–739.
Kramer, G., Gastpar, M., and Gupta, P. (2005). Cooperative strategies and capacity theorems for relay networks. IEEE Trans. Inf. Theory, 51(9), 3037–3063.
Liao, H. H. J. (1972). Multiple access channels. Ph.D. thesis, University of Hawaii, Honolulu, HI.
Lim, S. H., Kim, Y.-H., El Gamal, A., and Chung, S.-Y. (2011). Noisy network coding. IEEE Trans. Inf. Theory, 57(5), 3132–3152.
McEliece, R. J. (1977). The Theory of Information and Coding. Addison-Wesley, Reading, MA.
References (cont.)
Orlitsky, A. and Roche, J. R. (2001). Coding for computing. IEEE Trans. Inf. Theory, 47(3), 903–917.
Ratnakar, N. and Kramer, G. (2006). The multicast capacity of deterministic relay networks with no interference. IEEE Trans. Inf. Theory, 52(6), 2425–2432.
Shannon, C. E. (1948). A mathematical theory of communication. Bell Syst. Tech. J., 27(3), 379–423, 27(4), 623–656.
Shannon, C. E. (1959). Coding theorems for a discrete source with a fidelity criterion. In IRE Int. Conv. Rec., vol. 7, part 4, pp. 142–163. Reprint with changes (1960). In R. E. Machol (ed.), Information and Decision Processes, pp. 93–126. McGraw-Hill, New York.
Shannon, C. E. (1961). Two-way communication channels. In Proc. 4th Berkeley Symp. Math. Statist. Probab., vol. I, pp. 611–644. University of California Press, Berkeley.
Slepian, D. and Wolf, J. K. (1973a). Noiseless coding of correlated information sources. IEEE Trans. Inf. Theory, 19(4), 471–480.
Slepian, D. and Wolf, J. K. (1973b). A coding theorem for multiple access channels with correlated sources. Bell Syst. Tech. J., 52(7), 1037–1076.
Willems, F. M. J. and van der Meulen, E. C. (1985). The discrete memoryless multiple-access channel with cribbing encoders. IEEE Trans. Inf. Theory, 31(3), 313–327.
Wyner, A. D. (1975). The wire-tap channel. Bell Syst. Tech. J., 54(8), 1355–1387.
References (cont.)
Wyner, A. D. and Ziv, J. (1976). The rate–distortion function for source coding with side information at the decoder. IEEE Trans. Inf. Theory, 22(1), 1–10.
Xie, L.-L. and Kumar, P. R. (2005). An achievable rate for the multiple-level relay channel. IEEE Trans. Inf. Theory, 51(4), 1348–1358.
Zeng, C.-M., Kuhlmann, F., and Buzo, A. (1989). Achievability proof of some multiuser channel coding theorems using backward decoding. IEEE Trans. Inf. Theory, 35(6), 1160–1165.