Lecture 20: Shannon’s two-way channel
Eirik Rosnes
Department of Informatics, University of Bergen, N-5020 Bergen, Norway
Rosnes (INF 144)
March 19, 2015
Introduction
Shannon’s two-way channel was introduced in 1961.
The channel has two terminals.
The transmission in one direction interferes with the transmission in the other direction.
The sources are assumed to be independent.
Terminal 1 transmits X1, which is received as Y2 at terminal 2.
Terminal 2 transmits X2, which is received as Y1 at terminal 1.
The problem is to communicate as efficiently as possible in both directions.
Example
All input and output signals are binary, and Y1 = Y2 = X1 + X2 (mod 2).
Observe that if we invert Y2 every time X2 = 1, then Y2 = X1 (correct decoding).
Similarly, if we invert Y1 every time X1 = 1, then Y1 = X2 (correct decoding).
Thus, the rate pair (R12 , R21 ) = (1, 1) is achievable.
Since the rate pairs (1, 0) and (0, 1) are trivially achievable, we get the capacity region C = {(R12, R21) : 0 ≤ R12 ≤ 1, 0 ≤ R21 ≤ 1}, i.e., the unit square.
A channel with a rectangular capacity region is called compatible (the information flows do
not interfere with each other).
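The decoding rule above can be checked with a short simulation. This is a minimal sketch (the function and variable names are my own), assuming Y1 = Y2 = X1 + X2 (mod 2) and that each terminal knows its own transmitted bit:

```python
import random

def mod2_two_way(x1, x2):
    # Both terminals receive the same channel output X1 + X2 (mod 2).
    y = (x1 + x2) % 2
    return y, y  # (y1, y2)

random.seed(0)
for _ in range(1000):
    x1, x2 = random.randint(0, 1), random.randint(0, 1)
    y1, y2 = mod2_two_way(x1, x2)
    # Each terminal cancels its own input: invert the received bit
    # whenever its own transmitted bit was 1 (XOR with the own bit).
    decoded_at_2 = y2 ^ x2   # terminal 2 recovers X1
    decoded_at_1 = y1 ^ x1   # terminal 1 recovers X2
    assert decoded_at_2 == x1 and decoded_at_1 == x2
print("rate pair (1, 1) achieved: both directions decode error-free")
```

Since each terminal decodes perfectly at one bit per channel use, the rate pair (1, 1) is achieved exactly as claimed on the slide.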
Example (Shannon’s “push-to-talk” channel)
Both input alphabets are ternary ({0, 1, 2}) and both output alphabets are binary ({a, b}).
The channel transition pmf is given by
If x1 = 0 and we only use x2 = 1 or 2, one bit can be transmitted without errors in the direction 2 → 1, so (R12, R21) = (0, 1) is achievable.
Similarly, if x2 = 0 and we only use x1 = 1 or 2, one bit can be transmitted without errors in the direction 1 → 2, so (R12, R21) = (1, 0) is achievable.
In all other cases, all four output pairs (y1 , y2 ) are equally likely.
The capacity region is the time-sharing triangle {(R12, R21) : R12 + R21 ≤ 1, R12, R21 ≥ 0}.
This is an incompatible channel.
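The bullets above can be illustrated in code. The transition table itself is not reproduced in these notes, so the sketch below uses a hypothetical rule consistent with the bullets: when exactly one terminal is silent (sends 0), both outputs deterministically reveal the other terminal's symbol; in all other cases the outputs are uniformly random. The specific mapping {1 → a, 2 → b} is my assumption, not from the lecture:

```python
import random

def push_to_talk(x1, x2, rng):
    # Hypothetical transition rule consistent with the slide:
    # one active terminal -> deterministic outputs; otherwise noise.
    if x1 == 0 and x2 in (1, 2):
        y = "a" if x2 == 1 else "b"   # assumed mapping 1 -> a, 2 -> b
        return y, y
    if x2 == 0 and x1 in (1, 2):
        y = "a" if x1 == 1 else "b"
        return y, y
    return rng.choice("ab"), rng.choice("ab")  # all four pairs equally likely

rng = random.Random(0)
# Direction 2 -> 1: terminal 1 stays silent, terminal 2 maps its bit to {1, 2}.
for bit in (0, 1):
    y1, _ = push_to_talk(0, 1 + bit, rng)
    assert ("a", "b")[bit] == y1  # one noiseless bit per use: R21 = 1
```

As soon as both terminals transmit simultaneously, the outputs carry no information, which is why only time-sharing between (1, 0) and (0, 1) is possible.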
Shannon’s two-way channel
In general, Shannon’s two-way channel is neither compatible nor incompatible, but
something in between these two extremes.
The capacity region of the general channel, denoted by K22 , is still unknown.
The limited two-way channel, denoted by K22^0, is depicted in the figure below.
The capacity region of this channel is known and constitutes an inner bound for C(K22 ).
There is also an outer bound SO (K22 ) that is simple to compute.
Both SO (K22 ) and C(K22 ) are convex.
Shannon’s limited two-way channel
Let
SO(K22) = {(I(X1; Y2|X2), I(X2; Y1|X1)) : fX1X2(x1, x2) arbitrary}
SI(K22) = {(I(X1; Y2|X2), I(X2; Y1|X1)) : fX1X2(x1, x2) = fX1(x1) fX2(x2)}
For the general discrete memoryless two-way channel, it holds that
conv(SI(K22)) ⊆ C(K22) ⊆ SO(K22)
For the limited discrete memoryless two-way channel, it holds that
C(K22^0) = conv(SI(K22))
Shannon has shown that for a large class of channels with a sufficient degree of symmetry, the outer and inner bounds coincide, from which it follows that C(K22) = C(K22^0).
This is the case for the channels of the two previous examples.
Blackwell’s binary multiplication channel (BMC)
In the BMC, all input and output signals are binary, and Y1 = Y2 = X1 · X2 (binary multiplication). The capacity region is still unknown.
Assuming X1 and X2 are independent, we get
I(X1; Y2|X2) = H(Y2|X2) − H(Y2|X1X2) = H(Y2|X2)
             = Σx2 fX2(x2) H(Y2|X2 = x2) = fX2(1) H(X1|X2 = 1) = fX2(1) H(X1)
(Here H(Y2|X1X2) = 0 since Y2 is a deterministic function of the inputs, and H(Y2|X2 = 0) = 0 since then Y2 = 0.)
By symmetry:
I(X2 ; Y1 |X1 ) = fX1 (1)H(X2 )
The inner bound is thus given by the envelope of the rate pairs (R12, R21) such that
R12 = p2 h(p1) and R21 = p1 h(p2)
where 0 ≤ p1, p2 ≤ 1, p1 = fX1(1), p2 = fX2(1), and h(·) is the binary entropy function.
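The envelope can be evaluated numerically. The sketch below (a grid search of my own, not from the lecture) finds the best symmetric point p1 = p2 = p of the inner bound, where R12 = R21 = p·h(p); the maximum lands just below 0.617, which is a useful reference for Schalkwijk's later rate of 0.61914:

```python
import math

def h(p):
    # Binary entropy function in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Symmetric inner-bound point for the BMC: p1 = p2 = p gives
# R12 = R21 = p * h(p); grid-search the maximum over (0, 1).
best_p, best_rate = max(
    ((p, p * h(p)) for p in (i / 100000 for i in range(1, 100000))),
    key=lambda t: t[1],
)
print(f"p = {best_p:.4f}, equal-rate inner bound = {best_rate:.5f}")
```

The search peaks near p ≈ 0.70 with an equal rate of about 0.617 in each direction.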
The BMC: Outer bound
The outer bound requires an optimization over a general joint pmf fX1 X2 (x1 , x2 ).
Note that we can assume fX1 X2 (0, 0) = 0.
Let p1 = fX1 X2 (1, 0) and p2 = fX1 X2 (0, 1).
We have
I(X1; Y2|X2) = H(Y2|X2) − H(Y2|X1X2) = H(Y2|X2)
             = Σx2 fX2(x2) H(Y2|X2 = x2) = fX2(1) H(X1|X2 = 1)
Now (since fX2(1) = 1 − p1),
H(X1|X2 = 1) = −Σx1 fX1|X2(x1|1) log2 fX1|X2(x1|1)
             = −Σx1 (fX1X2(x1, 1)/fX2(1)) log2 (fX1X2(x1, 1)/fX2(1))
             = −(p2/(1 − p1)) log2 (p2/(1 − p1)) − ((1 − p1 − p2)/(1 − p1)) log2 ((1 − p1 − p2)/(1 − p1))
             = h(p2/(1 − p1))
The BMC: Outer bound
It follows that
I(X1; Y2|X2) = (1 − p1) h(p2/(1 − p1))
By symmetry:
I(X2; Y1|X1) = (1 − p2) h(p1/(1 − p2))
Thus, the outer bound is given by the envelope of the rate pairs (R12, R21) such that
R12 = (1 − p1) h(p2/(1 − p1)) and R21 = (1 − p2) h(p1/(1 − p2))
where 0 ≤ p1, p2 ≤ 1 and 0 ≤ p1 + p2 ≤ 1.
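As with the inner bound, the symmetric point of this envelope can be found numerically. This sketch (my own grid search) sets p1 = p2 = p, so R12 = R21 = (1 − p) h(p/(1 − p)) with 0 ≤ p ≤ 1/2 from the constraint p1 + p2 ≤ 1, and shows that the outer bound's equal-rate point lies strictly above the inner bound's:

```python
import math

def h(p):
    # Binary entropy function in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Symmetric outer-bound point for the BMC: p1 = p2 = p requires
# p + p <= 1, i.e. p <= 1/2, and gives R12 = R21 = (1-p) * h(p/(1-p)).
best_p, best_rate = max(
    ((p, (1 - p) * h(p / (1 - p))) for p in (i / 100000 for i in range(1, 50000))),
    key=lambda t: t[1],
)
print(f"p = {best_p:.4f}, equal-rate outer bound = {best_rate:.5f}")
```

The maximum is roughly 0.694 near p ≈ 0.28, well above the inner bound's 0.617, which is exactly the gap the next slide discusses.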
Shannon’s limited two-way channel
In the example of the BMC, we saw that the inner and outer bounds are different.
For the limited two-way channel, X1 and X2 are independent, while for the general channel
they are not; they can be dependent through previous input and output signals, even if the
messages themselves are independent.
When communicating over the two-way channel, the symbol to be transmitted should therefore depend on the previously received symbols.
We need an efficient coding strategy that takes this dependence into account.
An efficient coding strategy can be hard to find.
Hagelbarger’s coding strategy for the BMC
Transmit the symbols 0 and 1 from each terminal independently and with equal probability.
The transmitted symbols may depend on the previously received symbol.
The average number of channel symbols per message bit is
(3/4) · 2 + (1/4) · 1 = 7/4
(with probability 1/4 both terminals transmit a 1, so the bit pair is resolved in a single channel use; otherwise a second use is needed).
The average rate is thus 4/7 ≈ 0.571 in both directions, which is higher than the 1/2 per direction achievable by time-sharing.
For more than 20 years it was unknown whether the inner bound exactly determines the capacity region for the BMC. In 1982, Schalkwijk developed a simple coding strategy achieving the rate pair (R12, R21) = (0.61914, 0.61914), which lies strictly outside the inner bound.
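One way to realize the 7/4 figure is a two-slot protocol: both terminals send their message bits; if the product is 1 both bits must have been 1 and the exchange ends after one use, otherwise both send the complements in a second use. The sketch below is my own rendering of that idea (not taken verbatim from Hagelbarger), and checks that it decodes correctly and uses 7/4 channel symbols per bit on average:

```python
from itertools import product
from fractions import Fraction

def bmc(x1, x2):
    # Blackwell's binary multiplication channel: both terminals
    # observe Y = X1 * X2.
    return x1 * x2

def exchange(b1, b2):
    """Two-slot protocol (a hypothetical rendering of Hagelbarger's idea):
    slot 1 sends the bits; if Y = 1 both bits were 1 and we stop.
    Otherwise slot 2 sends the complements, which resolves the rest."""
    y = bmc(b1, b2)
    if y == 1:                     # both bits were 1
        return (1, 1), 1           # (decoded pair, channel uses)
    y2 = bmc(1 - b1, 1 - b2)
    if y2 == 1:                    # both bits were 0
        return (0, 0), 2
    # Exactly one bit was 1; each terminal knows its own bit,
    # so each can deduce the other's. Reconstruct both here:
    return (b1, b2), 2

uses = 0
for b1, b2 in product((0, 1), repeat=2):
    decoded, n = exchange(b1, b2)
    assert decoded == (b1, b2)     # error-free decoding
    uses += n
avg = Fraction(uses, 4)            # bit pairs are equiprobable
print(f"average channel uses per message bit: {avg}")  # 7/4
```

Averaging over the four equiprobable bit pairs gives (1 + 2 + 2 + 2)/4 = 7/4 uses, i.e., a rate of 4/7 in each direction, matching the slide.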