TRANSCRIPT
Prepared by: Engr. Jo-Ann C. Viñas 1
MODULE 2
ENTROPY
OBJECTIVES:
1. Define Information
2. Discuss the Characteristics of Information
3. Introduce Types of Information Sources and Types of Communication System
4. Explain Information Theory
INFORMATION
- knowledge or intelligence communicated or received
- a numerical quantity that measures the uncertainty in the outcome of an experiment to be performed
CHARACTERISTICS OF INFORMATION
1. Only the quantity of information and its integrity are important, not the meaning
2. No information is transmitted by a continuous symbol
3. Information requires change
TYPES OF INFORMATION SOURCES
1. Analog Information Source
- produces messages that are defined on a continuum
2. Digital Information Source
- produces a finite set of possible messages
UNITS OF INFORMATION
1. Bits
2. Dits
3. Nats
INFORMATION TRANSFER RATE
- the number of binary digits (bits) transmitted per unit of time
- expressed in bits per second
- often called the bit rate
SIGNALLING RATE
- is the rate at which transmission changes occur
- specifies how fast the signal states change in a communication channel
- often called baud rate
CHARACTERISTICS OF A SQUARE WAVE
1. Has a frequency spectrum that contains the fundamental frequency and all its odd harmonics
2. The magnitude of the harmonic components decreases as the order goes up
3. In practice, significant amplitude of the harmonics is limited to about the 9th harmonic
CHARACTERISTICS OF A PULSE TRAIN
1. Has a spectrum with many components
2. Frequencies of these components and the bandwidth required to convey the signal depend on the pattern of the bits in the pulse train
3. The pattern that requires the greatest bandwidth is that where alternate bits change state
THINGS TO REMEMBER
1. A practical communication channel has to be capable of conveying any pattern of data transmitted
2. The absolute minimum bandwidth of a channel is theoretically the fundamental frequency of the
square waveform
3. In practice, the bandwidth used is greater than this absolute minimum
DATA
- a form of information that is suitable for storage in, or processing by, a computer
INFORMATION THEORY
- concerns the ideal amount of data that should be transmitted so that the data can be conveyed efficiently, without transmitting redundant data
PROPERTIES OF A QUANTITATIVE MEASURE OF INFORMATION
1. If a particular message is known by the user prior to being transmitted, the message contains zero information.
2. If the potential messages from a source are all equally likely, then the information contained in each particular message should be equal to the number of "1"s and "0"s required to uniquely identify the message.
3. If two potential messages are not equally likely, the one with the lesser probability contains the greater amount of information.
I. INFORMATION MEASURE (Ii)
The information sent from a digital source when the ith message is transmitted is given by:
Ii = logb (1/Pi) = -logb Pi
where: Pi - probability of transmitting the ith message
(b = 2 gives bits, b = 10 gives dits, b = e gives nats)
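As a minimal sketch of this measure (assuming base-2 logarithms, so the result is in bits; the function name `info_bits` is illustrative):

```python
import math

def info_bits(p):
    """Information in bits conveyed by a message of probability p: Ii = log2(1/p)."""
    return math.log2(1.0 / p)

# A certain message (p = 1) carries zero information; rarer messages carry more.
print(info_bits(1.0))   # 0.0
print(info_bits(0.5))   # 1.0
print(info_bits(0.25))  # 2.0
```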
EXAMPLE 1
Suppose that equal numbers of letter grades A, B, C, D, and F are given in a certain course. How much information in bits have you received when the instructor tells you that your grade is:
a. not F?
b. either A or B?
c. repeat (a) and (b), solving for the amount of information in dits
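A possible worked sketch of parts (a)-(c), assuming the five grades are equally likely as stated (variable names are illustrative):

```python
import math

# Five equally likely grades, so P(any one grade) = 1/5
p_not_f = 4 / 5    # "not F" covers 4 of the 5 grades
p_a_or_b = 2 / 5   # "A or B" covers 2 of the 5 grades

I_a = math.log2(1 / p_not_f)         # ≈ 0.322 bits
I_b = math.log2(1 / p_a_or_b)        # ≈ 1.322 bits
I_a_dits = math.log10(1 / p_not_f)   # ≈ 0.097 dits
I_b_dits = math.log10(1 / p_a_or_b)  # ≈ 0.398 dits
print(I_a, I_b, I_a_dits, I_b_dits)
```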
EXAMPLE 2
A card is drawn at random from an ordinary deck of 52 playing cards. Find the information in bits that you receive when you are told that the card is:
a) a heart
b) a face card
c) a heart face card
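A possible worked sketch, assuming the usual deck counts (13 hearts, 12 face cards, 3 heart face cards):

```python
import math

p_heart = 13 / 52       # 13 hearts in the deck
p_face = 12 / 52        # 12 face cards (J, Q, K of each suit)
p_heart_face = 3 / 52   # J, Q, K of hearts

I_heart = math.log2(1 / p_heart)            # 2.0 bits
I_face = math.log2(1 / p_face)              # ≈ 2.115 bits
I_heart_face = math.log2(1 / p_heart_face)  # ≈ 4.115 bits
print(I_heart, I_face, I_heart_face)
```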
EXAMPLE 3
Find the information content of message that consists of a digital word 12 digits long in which each digit may take on one of four possible levels. The probability of sending any of the four levels is assumed to be equal, and the level in any digit does not depend on the values taken on by previous digits.
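A sketch of this calculation: with independent, equally likely digits, the word probability is (1/4)^12, so the information adds across digits.

```python
import math

digits = 12
levels = 4   # each digit takes one of four equally likely levels
# P(word) = (1/4)**12, so I = log2(4**12) = 12 * log2(4) = 24 bits
I = digits * math.log2(levels)
print(I)   # 24.0
```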
EXAMPLE 4
Consider a source flipping a coin. How much information is contained in the message “the coin landed heads up”?
EXAMPLE 5
Consider a fast-food restaurant in which a customer is nine times as likely to order a hamburger as a fish sandwich. How much information is contained in the message “the customer wants a hamburger?” How much information is contained in the message “the customer wants a fish sandwich?”
EXAMPLE 6
How much information is contained in the message “you are reading this example”?
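Sketches of Examples 4-6, assuming a fair coin, the stated 9:1 hamburger-to-fish ratio, and noting that in Example 6 the message is already certain (P = 1):

```python
import math

def info_bits(p):
    return math.log2(1 / p)

# Example 4: fair coin, P(heads) = 1/2 -> 1 bit
coin = info_bits(1 / 2)

# Example 5: P(hamburger) = 0.9, P(fish sandwich) = 0.1
burger = info_bits(0.9)   # ≈ 0.152 bits
fish = info_bits(0.1)     # ≈ 3.322 bits

# Example 6: the message is certain, so it carries zero information
certain = info_bits(1.0)

print(coin, burger, fish, certain)
```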
II. AVERAGE INFORMATION (ENTROPY, H)
- the average information content of a message from a particular source:
H = Σ Pi logb (1/Pi)
- expressed in bits per symbol (for b = 2)
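A minimal entropy sketch (assuming base-2 logarithms, giving bits per symbol; the function name is illustrative):

```python
import math

def entropy(probs, base=2):
    """H = sum over i of Pi * log_base(1/Pi); base 2 gives bits per symbol."""
    return sum(p * math.log(1 / p, base) for p in probs if p > 0)

# Four equally likely symbols -> H = log2(4) = 2 bits/symbol
print(entropy([0.25] * 4))
# A source that always emits the same symbol (P = 1) has zero entropy
print(entropy([1.0]))
```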
III. RELATIVE ENTROPY
The ratio of the entropy of a source to the maximum value the entropy could take for the same source symbols:
Hr = H / Hmax
where: Hmax = logb N
N = total number of symbols
IV. REDUNDANCY
R = 1 - H/Hmax = 1 - Hr
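Relative entropy and redundancy can be sketched together (assuming redundancy = 1 - H/Hmax; the source probabilities below are illustrative):

```python
import math

def entropy(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # illustrative 4-symbol source
H = entropy(probs)                  # 1.75 bits/symbol
H_max = math.log2(len(probs))       # log2(N) = 2 bits/symbol
relative_entropy = H / H_max        # 0.875
redundancy = 1 - relative_entropy   # 0.125
print(H, H_max, relative_entropy, redundancy)
```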
V. RATE OF INFORMATION
The average information transferred per unit time:
R = rH
where: r = symbol rate (symbols per second)
H = entropy (bits per symbol)
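A sketch, assuming the rate of information is R = rH (symbol rate times entropy), which is the form used in the examples that follow:

```python
import math

def information_rate(r, probs):
    """R = r * H: symbol rate (symbols/s) times entropy (bits/symbol) -> bits/s."""
    H = sum(p * math.log2(1 / p) for p in probs if p > 0)
    return r * H

# 1000 equally likely binary symbols per second -> 1000 bits/s
print(information_rate(1000, [0.5, 0.5]))
```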
EXAMPLE 1
A telephone touch-tone keypad has the digits 0 to 9, plus the * and # keys. Assume the probability of sending * or # is 0.005 each and the probability of sending each of 0 to 9 is 0.099. If the keys are pressed at a rate of 2 keys/sec, compute the entropy and data rate for this source.
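A possible worked sketch (taking the digit probability as 0.099 each, so the twelve key probabilities sum to 1):

```python
import math

# 12 keys: * and # at 0.005 each; digits 0-9 at 0.099 each
probs = [0.005] * 2 + [0.099] * 10
assert abs(sum(probs) - 1.0) < 1e-9

H = sum(p * math.log2(1 / p) for p in probs)   # entropy in bits/key
R = 2 * H                                      # data rate at 2 keys/s, in bits/s
print(round(H, 3), round(R, 3))   # ≈ 3.38 bits/key, ≈ 6.759 bits/s
```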
EXAMPLE 2
Determine the following:
a) Entropy
b) Relative Entropy
c) Rate of Information
EXAMPLE 3
Determine the ideal number of bits that should be allocated to each of the following characters with the probabilities given.
EXAMPLE 4
Consider a transmission that is to transmit the first 6 characters of the alphabet only. Each will be expressed as a digital signal. By convention, each letter would be allocated 3 bits.
Find the entropy to determine the most economical way for transmitting the data.
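A sketch of the comparison, assuming the six letters are equally likely (P = 1/6 each):

```python
import math

# Six equally likely letters, each P = 1/6
H = math.log2(6)                 # entropy ≈ 2.585 bits/letter
conventional = 3                 # bits/letter with the conventional 3-bit code
efficiency = H / conventional    # ≈ 0.862
print(round(H, 3), round(efficiency, 3))
```

The entropy (≈ 2.585 bits/letter) is the lower bound on the average bits per letter; a fixed 3-bit code therefore carries some redundancy.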
EXAMPLE 5
Suppose our fast-food restaurant serves an average of eight customers per minute. What is the information rate of the food orders?
EXAMPLE 6
Suppose that in the fast-food restaurant mentioned previously each customer orders either one hamburger or one fish sandwich. What is the average information content in a customer’s order?
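A sketch covering both restaurant examples, assuming the earlier 9:1 order probabilities and 8 customers per minute:

```python
import math

# P(hamburger) = 0.9, P(fish sandwich) = 0.1, from the earlier example
p = [0.9, 0.1]
H = sum(pi * math.log2(1 / pi) for pi in p)   # ≈ 0.469 bits/order (Example 6)
R = 8 * H                                     # 8 orders/min -> ≈ 3.752 bits/min (Example 5)
print(round(H, 3), round(R, 3))
```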
VI. OTHER PARAMETERS
1. Code Word Length: ni = logb (1/Pi)
2. Average Code Word Length: L = Σ Pi ni
3. Coding Efficiency: η = H / L
4. Coding Redundancy: 1 - η
EXAMPLE
Calculate the coding efficiency in representing the 26 letters of the alphabet using a binary and a decimal system.
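A possible worked sketch, assuming the 26 letters are equally likely and that a fixed-length code is used in each number system:

```python
import math

N = 26   # letters of the alphabet, assumed equally likely

# Binary: ceil(log2 26) = 5 bits are needed for each letter
eta_binary = math.log2(N) / math.ceil(math.log2(N))      # ≈ 0.940

# Decimal: ceil(log10 26) = 2 decimal digits per letter
eta_decimal = math.log10(N) / math.ceil(math.log10(N))   # ≈ 0.707

print(round(eta_binary, 3), round(eta_decimal, 3))
```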
SEATWORK
1. Calculate H(x) for a discrete memoryless source having six symbols with probabilities:
P(A) =1/2
P(B) =1/4
P(C) = 1/8
P(D) = P(E) = 1/20
P(F) = 1/40
Find the amount of information contained in the messages BAD, BED, BEEF, CAB, FACE, BABAE, ABACADA, BEAD, FADE
2. Determine the entropy for the word “YABBBADDDABBBADDDOOOOO”.
3. Suppose a source emits r = 2000 symbols/sec selected from an alphabet of size M = 4 with the symbol probabilities xi listed below. Find the information rate.
ASSIGNMENT
Answer Problem 9.1 and 9.7(a-b)
Communication Systems Analysis and Design
by Harold P.E. Stern & Samy A. Mahmoud