Unit I: Information Theory and Source Coding
Dr. Vandana M. Rohokale
Professor
SITS, Pune
Syllabus
• Introduction to information theory
• Entropy and its properties
• Discrete memoryless channels, mutual information
• Source coding theorem
• Huffman coding
• Shannon-Fano coding
• The Lempel-Ziv algorithm
• Run Length Encoding
• Examples of source coding: audio and video compression
Introduction to Information Theory
Claude Shannon founded the science of information theory in 1948.
• In his 1948 paper, "A Mathematical Theory of Communication," Claude E. Shannon formulated the theory of data compression and established that there is a fundamental limit to lossless data compression.
• This limit, called the entropy rate, is denoted by H. The exact value of H depends on the information source, more specifically, on the statistical nature of the source.
• It is possible to compress the source, in a lossless manner, with a compression rate arbitrarily close to H. It is mathematically impossible to do better than H.
Information theory is where probability theory goes to work for practical living.
Data vs. Information
Data: how information is represented
Information: what is represented in the data
-- tells us something that we did not already know and could not reliably predict
-- contains a certain element of surprise
• Let's consider these three sentences for developing a mathematical measure of information.
Self Information and Mutual Information
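Self-information quantifies the surprise of a single outcome: the less probable an outcome, the more information its occurrence carries. Stated here in its standard form (the slide body is not in the transcript), an outcome x with probability p(x) has self-information

I(x) = log2(1/p(x)) = -log2 p(x)   bits

so a certain event (p = 1) carries 0 bits, and a fair coin flip carries exactly 1 bit.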
Mutual Information
I(X;Y) = H(X) - H(X|Y)
       = H(Y) - H(Y|X)
       = H(X) + H(Y) - H(X,Y)
Entropy
• Shannon borrowed the ideas of randomness and entropy from thermodynamics to estimate the randomness (i.e., the information content, or entropy) of a process.
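For a discrete source X emitting symbol x with probability p(x), the entropy is

H(X) = - Σ p(x) log2 p(x)   (sum over all source symbols x), in bits/symbol

i.e., the average self-information per emitted symbol.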
Properties of Entropy
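The standard properties, for a discrete source with M symbols (stated here in their usual form, as the slide's own list is not in the transcript):

0 <= H(X) <= log2 M

with H(X) = 0 exactly when one symbol has probability 1 (no uncertainty), and H(X) = log2 M exactly when all M symbols are equally likely (maximum uncertainty).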
Joint Entropy: H(X,Y) = H(X) + H(Y|X)
Numerical Example
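The original worked numbers are not recoverable from the transcript; a representative example with assumed probabilities {1/2, 1/4, 1/4}:

H(X) = (1/2) log2 2 + (1/4) log2 4 + (1/4) log2 4
     = 0.5 + 0.5 + 0.5
     = 1.5 bits/symbol

compared with log2 3 ≈ 1.585 bits for three equally likely symbols.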
Channel Capacity
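Channel capacity is the maximum mutual information over all input distributions, C = max I(X;Y). Presumably the slide also gives Shannon's formula for a band-limited AWGN channel of bandwidth B and signal-to-noise ratio S/N:

C = B log2(1 + S/N)   bits/second

Reliable communication is possible at any rate below C and impossible above it.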
Source Coding Theorem
Shannon’s Vision
Example - Disk Storage
Example – VCD and DVD
Example – Cellular Phone
Shannon showed:
"To reliably store the information generated by some random source X, you need, on average, no more and no less than H(X) bits for each outcome."
Shannon’s Source Coding Theorem
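Stated more formally: N i.i.d. outcomes of a source X with entropy H(X) can be compressed into N(H(X) + ε) bits, for any ε > 0, with a probability of information loss that vanishes as N grows; conversely, compressing them into fewer than N·H(X) bits makes loss virtually certain.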
Source: Hsiao-feng Francis Lu, "Introduction to Information Theory", National Chung-Cheng University.
Source Coding Theorem
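In coding terms: for a uniquely decodable code assigning a codeword of length l(x) to symbol x, the average codeword length

L = Σ p(x) l(x)

can never be less than H(X), and coding efficiency is defined as η = H(X) / L. The prefix codes that follow (Shannon-Fano, Huffman) try to push L as close to H(X) as possible.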
Shannon-Fano Coding
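The slide's worked example is not recoverable, so here is a minimal Python sketch of the standard procedure: sort symbols by descending probability, split the list into two groups of nearly equal total probability, assign 0 to one group and 1 to the other, and recurse. The symbol set and probabilities below are made-up illustrations, not the lecture's.

def shannon_fano(symbols):
    """symbols: list of (symbol, probability); returns {symbol: codeword}."""
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    codes = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        # Choose the split point where the two halves' totals are closest.
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_i, best_diff = i, diff
        upper, lower = group[:best_i], group[best_i:]
        for s, _ in upper:
            codes[s] += "0"   # more probable half gets a 0
        for s, _ in lower:
            codes[s] += "1"   # less probable half gets a 1
        split(upper)
        split(lower)

    split(symbols)
    return codes

print(shannon_fano([("A", 0.4), ("B", 0.3), ("C", 0.2), ("D", 0.1)]))
# -> {'A': '0', 'B': '10', 'C': '110', 'D': '111'}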
Huffman Coding
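Again the slide's worked tree is not in the transcript; a minimal Python sketch of the standard greedy construction (repeatedly merge the two least probable subtrees), with made-up probabilities:

import heapq
import itertools

def huffman(symbols):
    """symbols: list of (symbol, probability); returns {symbol: codeword}."""
    tie = itertools.count()  # tie-breaker keeps heap tuples comparable
    heap = [(p, next(tie), {s: ""}) for s, p in symbols]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, codes0 = heapq.heappop(heap)  # two least probable subtrees
        p1, _, codes1 = heapq.heappop(heap)
        # Prefix 0 to one subtree's codewords and 1 to the other's, then merge.
        merged = {s: "0" + c for s, c in codes0.items()}
        merged.update({s: "1" + c for s, c in codes1.items()})
        heapq.heappush(heap, (p0 + p1, next(tie), merged))
    return heap[0][2]

print(huffman([("A", 0.4), ("B", 0.3), ("C", 0.2), ("D", 0.1)]))
# -> {'A': '0', 'B': '10', 'D': '110', 'C': '111'}: average length
#    0.4*1 + 0.3*2 + 0.2*3 + 0.1*3 = 1.9 bits, against H(X) ≈ 1.85 bits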
The Lempel-Ziv algorithm
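Lempel-Ziv coding builds its dictionary adaptively from the data itself, with no prior knowledge of symbol probabilities. A minimal Python sketch of the LZ78 variant (the lecture may instead work through LZ77 or a binary example); each output pair is (index of the longest previously seen phrase, next character):

def lz78_encode(text):
    """Parse text into (dictionary_index, next_char) pairs; index 0 = empty phrase."""
    dictionary = {"": 0}
    phrase, output = "", []
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch                      # keep extending the matched phrase
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)   # learn the new phrase
            phrase = ""
    if phrase:  # input ended mid-phrase: emit it as (prefix index, last char)
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

print(lz78_encode("ABAABABAABAB"))
# -> [(0, 'A'), (0, 'B'), (1, 'A'), (2, 'A'), (4, 'A'), (4, 'B')]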
Run Length Encoding
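Run-length encoding replaces each run of identical symbols with a (symbol, count) pair, which pays off when the data contains long runs, as in fax images. A minimal Python sketch with a made-up input string:

from itertools import groupby

def rle_encode(data):
    """Collapse each run of repeated symbols into a (symbol, run_length) pair."""
    return [(ch, len(list(run))) for ch, run in groupby(data)]

def rle_decode(pairs):
    """Expand (symbol, run_length) pairs back into the original string."""
    return "".join(ch * n for ch, n in pairs)

encoded = rle_encode("WWWWBBBWWBBBBB")
print(encoded)                                # [('W', 4), ('B', 3), ('W', 2), ('B', 5)]
assert rle_decode(encoded) == "WWWWBBBWWBBBBB"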
Thank You !!!