Digital Communications III (ECE 154C)
Introduction to Coding and Information Theory
Tara Javidi
These lecture notes were originally developed by the late Prof. J. K. Wolf.
UC San Diego
Spring 2014
Source Coding
• Entropy
• A Useful Theorem
• More on Entropy
• Fundamental Limits
• Source Coding Theorem
Noiseless Source Coding
Fundamental Limits
Shannon Entropy
• Let an I.I.D. source S have M source letters that occur with probabilities $p_1, p_2, \ldots, p_M$, with $\sum_{i=1}^{M} p_i = 1$.
• The entropy of the source S is denoted H(S) and is defined as
$$H_a(S) = \sum_{i=1}^{M} p_i \log_a \frac{1}{p_i} = -\sum_{i=1}^{M} p_i \log_a p_i$$
$$H_a(S) = E\left[\log_a \frac{1}{p_i}\right]$$
• The base of the logarithms is usually taken to be equal to 2. In
that case, H2 (S) is simply written as H(S) and is measured in
units of “bits”.
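As a quick numerical companion to the definition, here is a minimal Python sketch (not part of the original notes); the function name `entropy` and the example distributions are purely illustrative.

```python
# Minimal sketch: H_a(S) computed directly from a probability vector.
from math import log

def entropy(probs, base=2.0):
    """Shannon entropy of a finite distribution, in units set by `base`."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * log(p, base) for p in probs if p > 0)

print(entropy([0.9, 0.1]))                  # ~0.469 bits (Example 1 below)
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits   (Example 3 below)
```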
Shannon Entropy
• Other bases can be used, and it is easy to convert between them. Note:
$$\log_a x = (\log_b x)(\log_a b) = \frac{\log_b x}{\log_b a}$$
Hence,
$$H_a(S) = H_b(S) \cdot \log_a b = \frac{H_b(S)}{\log_b a}$$
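A quick numerical check of this change-of-base relation (a sketch, using an arbitrary illustrative distribution):

```python
# Sketch: H_a(S) = H_b(S) / log_b(a), checked with a = e and b = 2.
from math import e, log, log2

probs = [0.5, 0.35, 0.15]                   # illustrative distribution
H_bits = -sum(p * log2(p) for p in probs)   # base-2 entropy (bits)
H_nats = -sum(p * log(p) for p in probs)    # base-e entropy (nats)
assert abs(H_nats - H_bits / log2(e)) < 1e-9   # i.e. H_e(S) = H_2(S) * ln 2
```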
A Useful Theorem
Let $p_1, p_2, \ldots, p_M$ be one set of probabilities and let $p'_1, p'_2, \ldots, p'_M$ be another (note $\sum_{i=1}^{M} p_i = 1$ and $\sum_{i=1}^{M} p'_i = 1$). Then
$$\sum_{i=1}^{M} p_i \log \frac{1}{p_i} \;\le\; \sum_{i=1}^{M} p_i \log \frac{1}{p'_i},$$
with equality iff $p_i = p'_i$ for $i = 1, 2, \ldots, M$.
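A quick numerical check of the theorem (this is a form of Gibbs' inequality), as a sketch with illustrative distributions:

```python
# Sketch: sum_i p_i log(1/p_i) <= sum_i p_i log(1/p'_i), equality iff p = p'.
from math import log2

p       = [0.5, 0.35, 0.15]   # illustrative "true" distribution
p_prime = [1/3, 1/3, 1/3]     # any other distribution on the same alphabet

lhs = sum(pi * log2(1 / pi) for pi in p)                    # = H2(S)
rhs = sum(pi * log2(1 / qi) for pi, qi in zip(p, p_prime))  # cross term
assert lhs <= rhs
print(lhs, rhs)   # ~1.44 vs ~1.58
```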
Shannon Entropy
Proof.
First note that ln x ≤ x − 1 with equality iff x = 1.
[Placeholder figure: plot of ln x and x − 1, showing ln x ≤ x − 1 with equality at x = 1.]
$$\sum_{i=1}^{M} p_i \log \frac{p'_i}{p_i} = \left(\sum_{i=1}^{M} p_i \ln \frac{p'_i}{p_i}\right) \log e$$
Shannon Entropy
Proof continued.
$$\left(\sum_{i=1}^{M} p_i \ln \frac{p'_i}{p_i}\right) \log e \;\le\; (\log e)\sum_{i=1}^{M} p_i\left(\frac{p'_i}{p_i} - 1\right) = (\log e)\left(\sum_{i=1}^{M} p'_i - \sum_{i=1}^{M} p_i\right) = 0$$
In other words,
$$\sum_{i=1}^{M} p_i \log \frac{1}{p_i} \;\le\; \sum_{i=1}^{M} p_i \log \frac{1}{p'_i}$$
Shannon Entropy
1. For an equally likely i.i.d. source with M source letters,
$$H_2(S) = \log_2 M \qquad (H_a(S) = \log_a M \text{ for any } a).$$

2. For any i.i.d. source with M source letters,
$$0 \le H_2(S) \le \log_2 M \qquad (0 \le H_a(S) \le \log_a M \text{ for all } a).$$
This follows from the previous theorem with $p'_i = \frac{1}{M}$ for all i.

3. Consider an i.i.d. source with source alphabet S. If we encode m source letters at a time, the result is again an i.i.d. source whose letters are the m-tuples in $S^m$. Call this the m-tuple extension of the source and denote it by $S^m$. Then
$$H_2(S^m) = m\,H_2(S)$$
(and similarly, $H_a(S^m) = m\,H_a(S)$ for all a); a numerical check appears in the sketch below.

The proofs are omitted but are easy.
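The third property can be checked by brute force for a small source; the sketch below (illustrative, not from the notes) enumerates all m-tuples of an i.i.d. source and verifies that the entropy of the extension is m times the per-letter entropy.

```python
# Sketch: numerical check that H2(S^m) = m * H2(S) for an i.i.d. source
# (tuple probabilities are products of per-letter probabilities).
from itertools import product
from math import log2, prod

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

probs = [0.5, 0.35, 0.15]          # illustrative source with M = 3 letters
m = 3                              # order of the extension
ext = [prod(t) for t in product(probs, repeat=m)]
assert abs(entropy(ext) - m * entropy(probs)) < 1e-9
```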
Computation of Entropy (base 2)
Example 1: Consider a source with M = 2 and $(p_1, p_2) = (0.9, 0.1)$.
$$H_2(S) = 0.9 \log_2 \frac{1}{0.9} + 0.1 \log_2 \frac{1}{0.1} = 0.469 \text{ bits}$$
From before we gave Huffman Codes for this source and extensions of this source for which
$$\bar{L}_1 = 1,\quad \frac{\bar{L}_2}{2} = 0.645,\quad \frac{\bar{L}_3}{3} = 0.533,\quad \frac{\bar{L}_4}{4} = 0.49$$
In other words, we note that $\bar{L}_m/m \ge H_2(S)$. Furthermore, as m gets larger, $\bar{L}_m/m$ gets closer to $H_2(S)$.
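As a sketch (not from the original notes), these per-letter lengths can be reproduced by building binary Huffman codes for the m-th extensions of the source; the helper below relies on the fact that the average codeword length equals the sum of the probabilities of the merged nodes.

```python
# Sketch: binary Huffman codes for the m-th extension of the (0.9, 0.1) source.
import heapq
from itertools import product
from math import prod

def huffman_avg_length(probs):
    """Average codeword length of a binary Huffman code for `probs`
    (the sum of the merged probabilities, one merge per internal node)."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        merged = heapq.heappop(heap) + heapq.heappop(heap)
        total += merged            # this merge adds one bit to every leaf below it
        heapq.heappush(heap, merged)
    return total

source = [0.9, 0.1]
for m in range(1, 5):
    ext = [prod(t) for t in product(source, repeat=m)]   # m-th extension
    print(m, round(huffman_avg_length(ext) / m, 3))      # ~1, 0.645, 0.533, 0.49
```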
Performance of Huffman Codes
→ One can prove that, in general, for a binary Huffman Code,
$$H_2(S) \le \frac{\bar{L}_m}{m} < H_2(S) + \frac{1}{m}$$
Example 2: Consider a source with M = 3 and $(p_1, p_2, p_3) = (0.5, 0.35, 0.15)$.
$$H_2(S) = 0.5 \log_2 \frac{1}{0.5} + 0.35 \log_2 \frac{1}{0.35} + 0.15 \log_2 \frac{1}{0.15} = 1.44 \text{ bits}$$
Again we have already given codes for this source such that
1 symbol at a time: $\bar{L}_1 = 1.5$
2 symbols at a time: $\bar{L}_2/2 = 1.46$
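A quick sketch (not from the notes) checking the bound above against these quoted numbers:

```python
# Sketch: the quoted per-letter lengths satisfy H2(S) <= Lbar_m/m < H2(S) + 1/m.
from math import log2

H2 = -sum(p * log2(p) for p in [0.5, 0.35, 0.15])   # ~1.44 bits
per_letter = {1: 1.5, 2: 1.46}                      # Lbar_m / m from the notes
for m, L in per_letter.items():
    assert H2 <= L < H2 + 1 / m
```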
Performance of Huffman Codes
Example 3: Consider M = 4 and $(p_1, p_2, p_3, p_4) = (\frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{8})$.
$$H_2(S) = \frac{1}{2}\log_2 \frac{1}{1/2} + \frac{1}{4}\log_2 \frac{1}{1/4} + \frac{1}{8}\log_2 \frac{1}{1/8} + \frac{1}{8}\log_2 \frac{1}{1/8} = 1.75 \text{ bits}$$
But from before we gave the code

Source Symbol    Codeword
A                0
B                10
C                110
D                111

for which $\bar{L}_1 = H_2(S)$.
This means that one cannot improve on the efficiency of this
Huffman code by encoding several source symbols at a time.
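A one-line numerical check of this claim (a sketch; the probabilities and codeword lengths are taken from the table above):

```python
# Sketch: for the dyadic source of Example 3, the given code already meets
# the entropy lower bound exactly, so extensions cannot improve it.
from math import log2

probs   = [1/2, 1/4, 1/8, 1/8]
lengths = [1, 2, 3, 3]                     # codewords 0, 10, 110, 111
avg_len = sum(p * l for p, l in zip(probs, lengths))
H2      = -sum(p * log2(p) for p in probs)
assert abs(avg_len - H2) < 1e-12           # both equal 1.75 bits
```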
Fundamental Source Coding Limit
Theorem [Lossless Source Coding Theorem]. For any U.D. binary code corresponding to the N-th extension of the I.I.D. source S, for every N = 1, 2, ...,
$$\frac{\bar{L}_N}{N} \ge H(S)$$
• For a binary Huffman Code corresponding to the N-th extension of the I.I.D. source S,
$$H_2(S) + \frac{1}{N} > \frac{\bar{L}_N}{N} \ge H_2(S)$$
• But this implies that Huffman coding is asymptotically optimal, i.e.,
$$\lim_{N \to \infty} \frac{\bar{L}_N}{N} = H_2(S)$$
NON-BINARY CODE WORDS
The code symbols that make up the codewords can be drawn from an alphabet of size greater than 2.
Example 4: I.I.D. Source {A,B,C,D,E} with U.D. codes (where each
concatenation of code words can be decoded in only one way):
SYMBOL    TERNARY    QUATERNARY    5-ary
A         0          0             0
B         1          1             1
C         20         2             2
D         21         30            3
E         22         31            4
• A lower bound on the average code length of any U.D. n-ary code (with n code letters) is $H_n(S)$.
• For example, for a ternary code the average length per source letter, $\bar{L}_M/M$, is no less than $H_3(S)$ (see the sketch below).
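As a sketch, this bound can be checked on Example 4 if we assume, purely for illustration (the notes give no letter probabilities), that the five source letters are equally likely:

```python
# Sketch: average length vs. the H_n(S) lower bound for the three codes of
# Example 4, under the *assumed* equiprobable distribution on {A,...,E}.
from math import log

def entropy(probs, base):
    return -sum(p * log(p, base) for p in probs if p > 0)

probs = [1/5] * 5                      # assumption: equally likely letters
lengths = {
    3: [1, 1, 2, 2, 2],                # ternary codewords 0, 1, 20, 21, 22
    4: [1, 1, 1, 2, 2],                # quaternary codewords 0, 1, 2, 30, 31
    5: [1, 1, 1, 1, 1],                # 5-ary codewords 0, 1, 2, 3, 4
}
for n, lens in lengths.items():
    avg = sum(p * l for p, l in zip(probs, lens))
    assert avg >= entropy(probs, n) - 1e-12   # e.g. 1.6 >= ~1.465 for ternary
```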