Module-3 - Visvesvaraya Technological University (source: nptel.vtu.ac.in/vtu-nmeict/itc/module3.pdf)
TRANSCRIPT
Module-3
Page 47 of 92
3.1) NPTEL Video Links: Module-3, Lectures 20 to 29
Sl. No. | Module No. | Lecture No. | Topic Covered (Duration) | Link
1 | Mod 03 | Lec-20 | Introduction to Information Channel (55:48) | http://nptel.ac.in/courses/117101053/20
2 | Mod 03 | Lec-21 | Equivocation and Mutual Information (51:36) | http://nptel.ac.in/courses/117101053/21
3 | Mod 03 | Lec-22 | Properties of Different Information Channels (54:12) | http://nptel.ac.in/courses/117101053/22
4 | Mod 03 | Lec-23 | Reduction of Information Channels (50:49) | http://nptel.ac.in/courses/117101053/23
5 | Mod 03 | Lec-24 | Properties of Mutual Information and Introduction to Channel Capacity (51:51) | http://nptel.ac.in/courses/117101053/24
6 | Mod 03 | Lec-25 | Calculation of Channel Capacity for Different Information Channels (47:12) | http://nptel.ac.in/courses/117101053/25
7 | Mod 03 | Lec-26 | Shannon's Second Theorem (50:22) | http://nptel.ac.in/courses/117101053/26
8 | Mod 03 | Lec-27 | Discussion on Error Free Communication Over Noisy Channel (53:24) | http://nptel.ac.in/courses/117101053/27
9 | Mod 03 | Lec-28 | Error Free Communication Over a Binary Symmetric Channel (50:03) | http://nptel.ac.in/courses/117101053/28
10 | Mod 03 | Lec-29 | Differential Entropy and Evaluation of Mutual Information (55:53) | http://nptel.ac.in/courses/117101053/29
3.2) Questions
Sl. No. | Question | Video No. | Time (min)
1 | State and discuss Shannon's second theorem. | 20 | –
2 | Define Information Channel. | 20 | 2
3 | What is a zero-memory information channel? | 20 | 5
4 | What is Binary symmetric Channel [BSC]? What is its role in modern digital communication? | 20 | 7
5 | Write the channel diagram of a Binary symmetric Channel [BSC]. | 20 | 9
6 | What is a stochastic channel model? | 20 | 14
7 | Write the channel matrix of a Binary symmetric Channel. | 20 | 15
8 | Define the nth extension of the channel. | 20 | 18
9 | Write the channel matrix of the nth extension of the channel. | 20 | 19
10 | What is the significance of channel extension? Write the channel matrix of the 2nd extension of the Binary symmetric Channel. | 20 | 23
11 | What is the nth Kronecker power of a channel matrix? | 20 | 25
12 | What is the function of an information channel? | 20 | 26
13 | What are backward and forward probabilities in a channel? | 20 | 35
14 | In a Binary Channel the channel matrix is given by [matrix not reproduced in transcript]. With p{a=0}=3/4 and p{a=1}=1/4: i) Write the noise diagram ii) Find the probabilities of the output symbols iii) Find the backward probabilities | 20 | 40
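The computation asked for in question 14 (output probabilities, then backward probabilities) can be sketched numerically. The channel-matrix image did not survive this transcript, so the matrix values below are assumed for illustration only:

```python
# Sketch for question 14: forward (output) and backward probabilities
# in a binary channel. The channel-matrix image is lost from this
# transcript, so P below is an assumed example: P[i][j] = p(b=j | a=i).
P = [[3/4, 1/4],
     [1/4, 3/4]]
p_a = [3/4, 1/4]  # a priori input probabilities, as given in the question

# Output probabilities: p(b_j) = sum_i p(a_i) p(b_j / a_i)
p_b = [sum(p_a[i] * P[i][j] for i in range(2)) for j in range(2)]

# Backward probabilities via Bayes' rule: p(a_i / b_j) = p(a_i) P[i][j] / p(b_j)
backward = [[p_a[i] * P[i][j] / p_b[j] for j in range(2)] for i in range(2)]

print(p_b)       # [0.625, 0.375]
print(backward)  # rows give p(a=0 / b_j) and p(a=1 / b_j)
```

The same two lines (a matrix-vector product, then Bayes' rule) answer every "find the output/backward probabilities" exercise in this module.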
15 | Define a priori and a posteriori probability of a symbol. | 20 | 48
16 | Define a priori and a posteriori entropies of the input source. | 20 | 49
17 | In a Binary Channel the channel matrix is given by [matrix not reproduced in transcript]. With p{a=0}=3/4 and p{a=1}=1/4, find the a priori and a posteriori entropies of the input source. | 20 | 53
18 | What is the most efficient method of coding from a source? | 21 | 5
19 | What is Equivocation? What is its significance in channel modeling? | 21 | 20
20 | Define the following entropies: i) H(A) ii) H(B) iii) H(A,B) iv) H(A/B) v) H(B/A) | 21 | 36
21 | Define Mutual Information and obtain an expression for the same. | 21 | 23
22 | Define the average codeword length of the channel and obtain expressions for the upper and lower bounds of the same. | 21 | 11
23 | Define the average codeword length for the nth extension of the channel and obtain expressions for the upper and lower bounds of the same. | 21 | 14
24 | Define Shannon's first theorem and derive an expression for the same as applicable to channels. | 21 | 17
25 | Prove that I(A;B) = H(A) - H(A/B) | 21 | 26
26 | Show that [expression not reproduced in transcript] | 21 | 30
27 | Show that I(A;B) [inequality not reproduced in transcript] | 21 | 34
28 | Show that mutual information I(A;B) = I(B;A) | 21 | 36
29 | Prove that H(A,B) = H(A) + H(B) - I(A;B) | 21 | 41
30 | Write the Venn diagram of channel entropies. | 21 | 42
31 | Prove the following identities: i) H(A,B) = H(A) + H(B/A) ii) H(A,B) = H(B) + H(A/B) | 21 | 44
32 | In a Binary symmetric Channel the channel matrix is given by [matrix not reproduced in transcript]. With p{a=0} = w and p{a=1} = 1-w, derive an expression for the mutual information. | 21 | 47
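The entropy identities in questions 25-31 are easy to verify numerically. A minimal sketch, assuming a BSC with crossover probability p = 0.1 and input distribution w = 0.3 (illustrative values only):

```python
from math import log2

# Numerical check of the identities in questions 25-31 for a BSC with
# crossover probability p = 0.1 and p(a=0) = w = 0.3 (assumed values).
p, w = 0.1, 0.3
# joint p(a,b) for pairs (0,0), (0,1), (1,0), (1,1)
joint = [w*(1-p), w*p, (1-w)*p, (1-w)*(1-p)]

def H(dist):
    """Entropy in bits of a list of probabilities."""
    return -sum(q * log2(q) for q in dist if q > 0)

pa = [w, 1 - w]
pb = [joint[0] + joint[2], joint[1] + joint[3]]
H_A, H_B, H_AB = H(pa), H(pb), H(joint)

I1 = H_A - (H_AB - H_B)   # I(A;B) = H(A) - H(A/B)
I2 = H_B - (H_AB - H_A)   # I(A;B) = H(B) - H(B/A)
I3 = H_A + H_B - H_AB     # I(A;B) = H(A) + H(B) - H(A,B)
print(I1, I2, I3)         # all three coincide, and I(A;B) >= 0
```

Any other valid joint distribution gives the same agreement, since the three expressions are algebraically identical.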
33 | Show that the mutual information of a Binary symmetric Channel is given by [expression not reproduced in transcript], where p is the probability of error in reception and w is the probability of transmission of symbol 0. Also plot the entropy function. | 22 & 21 | 2 & 45
34 | Show that in a Binary symmetric Channel [expression not reproduced in transcript], where p is the probability of error in reception and w is the probability of transmission of symbol 0. | 22 | –
35 | Define noiseless channel and give one example for the same. | 22 | 6
36 | The channel matrix is given by [matrix not reproduced in transcript]. Write the channel diagram. | 22 | 8
37 | Define deterministic channel and give one example for the same. | 22 | 11
38 | Obtain an expression for mutual information in case of a noiseless channel. | 22 | 16
39 | Show that in case of a noiseless channel I(A;B) = H(A) | 22 | 18
40 | Show that in case of a deterministic channel I(A;B) = H(B) | 22 | 19
41 | The channel matrix is given by [matrix not reproduced in transcript]. Write the channel diagram. | 22 | 19
42 | Obtain an expression for mutual information in case of a deterministic channel. | 22 | 18
43 | Show that in case of a deterministic channel I(A;B) = H(B) | 22 | 20
44 | Show that in a cascaded channel: i) p(ck/bj,ai) = p(ck/bj) for all i, j, k ii) p(ai/bj,ck) = p(ai/bj) | 22 | 24
45 | Show that in cascaded channels H(A/C) ≥ H(A/B) | 22 | 30
46 | Show that in cascaded channels I(A;B) ≥ I(A;C) | 22 | 34
47 | Show that in a noiseless channel P(a/b,c) = P(a/c) | 22 | 37
48 | Show that when two binary symmetric channels are connected in cascade [expression not reproduced in transcript], where p is the probability of error in reception. | 22 | 46
49 | Show that when three binary symmetric channels are connected in cascade [expression not reproduced in transcript], where p is the probability of error in reception. | 22 | 48
50 | Show that when three binary symmetric channels are connected in cascade I(A;B) ≥ I(A;C) ≥ I(A;D) | 22 | 50
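Questions 48-50 can be checked numerically: two identical BSCs in cascade behave as a single BSC with crossover probability 2p(1-p), and mutual information can only fall along the cascade. A sketch assuming p = 0.1 and equiprobable inputs:

```python
from math import log2

# Questions 48-50: cascading BSCs. Two identical BSCs in series act as
# one BSC with crossover 2p(1-p); I(A;.) decreases down the cascade.
def cascade(p1, p2):
    # net flip probability: the bit flips in exactly one of the two stages
    return p1*(1 - p2) + (1 - p1)*p2

def h(q):
    # binary entropy function
    return 0.0 if q in (0.0, 1.0) else -q*log2(q) - (1 - q)*log2(1 - q)

def bsc_mutual_info(p, w=0.5):
    # I(A;B) = H(B) - H(B/A) for a BSC with crossover p and p(a=0) = w
    return h(w*(1 - p) + (1 - w)*p) - h(p)

p = 0.1                # assumed crossover probability
p2 = cascade(p, p)     # two channels in cascade -> 2p(1-p) = 0.18
p3 = cascade(p2, p)    # three channels in cascade
I_AB = bsc_mutual_info(p)
I_AC = bsc_mutual_info(p2)
I_AD = bsc_mutual_info(p3)
print(p2, I_AB, I_AC, I_AD)   # I_AB >= I_AC >= I_AD
```

The monotone decrease is exactly the inequality chain of question 50, a special case of the data-processing inequality.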
51 | What is elementary reduction of a channel matrix? | 23 | 6
52 | What is reduction of a channel matrix? What is its significance? | 23 | 7
53 | In a Binary symmetric Channel the channel matrix is given by [matrix not reproduced in transcript]: i) Write the channel matrix for the 2nd extension ii) Write the reduced matrix for the 2nd extension | 23 | 8
54 | What is a reduced channel? How is a reduced channel modeled in terms of a deterministic channel and channel matrix P? | 23 | 12
55 | When is the mutual information of a reduced channel equal to that of the original channel? Briefly discuss. | 23 | 13
56 | Derive the condition such that the mutual information of a reduced channel is equal to that of the original channel. | 23 | 23
57 | What is sufficient reduction? Illustrate with an example. | 23 | 26
58 | The channel matrix is given by [matrix not reproduced in transcript]. Write the sufficient reduction matrix. | 23 | 26
59 | Briefly discuss the additive property of mutual information when channels are connected in cascade. | 23 | 35
60 | When channels are connected in cascade, show that I(A;B,C) = I(A;C) + I(A;B/C) | 23 | 42
61 | When channels are connected in cascade, prove that: i) I(A;B,C) = H(A) - H(A/B,C) ii) I(A;B,C) = H(B,C) - H(B,C/A) iii) I(A;B,C) = I(A;C) + I(A;B/C) | 23 | 40
62 | In a Binary symmetric Channel the channel matrix is given by [matrix not reproduced in transcript]. With p{a=0}=1/2 and p{a=1}=1/2, where p is the probability of error in reception, find: i) the probabilities of a repetitive BSC ii) I(A;B,C) | 24 | 2
63 | In a Binary symmetric Channel the channel matrix is given by [matrix not reproduced in transcript]. With p{a=0}=1/2 and p{a=1}=1/2, where p is the probability of error in reception, show that [expression not reproduced in transcript] when two channels are cascaded. | 24 | 4
64 | In a Binary symmetric Channel the channel matrix is given by [matrix not reproduced in transcript]. With p{a=0}=1/2 and p{a=1}=1/2, where p is the probability of error in reception, show that [expression not reproduced in transcript] when three channels are cascaded. | 24 | 15
65 | Plot the graph of mutual information of a BSC with n repetitions, for n = 0, 1, 2, and comment on the result. | 24 | 21
66 | Define mutual information of more than two alphabets. | 24 | 24
67 | For mutual information of more than two alphabets, show that I(A;B;C) = I(A;B) - I(A;B/C) | 24 | 25
68 | For mutual information of more than two alphabets, show that I(A;B;C) = H(A) + H(B) + H(C) - H(A,B) - H(A,C) - H(B,C) + H(A,B,C) | 24 | 27
69 | Can mutual information I(A;B;C) be negative? Illustrate with an example. | 24 | 35
70 | What is channel capacity? What is its significance in communication? | 24 | 37
71 | Define uniform channel. What is its significance in communication? | 24 | 43
72 | Write the r-ary channel matrix of the rSC channel. | 24 | 45
73 | Obtain an expression for the channel capacity of a uniform channel. | 24 | 48
74 | Derive an expression for the channel capacity of the rSC channel. | 24 | 49
75 | Define weakly symmetric channel; illustrate with an example. | 25 | 2
76 | Derive an expression for the channel capacity of a weakly symmetric channel. | 25 | 5
77 | Obtain an expression for the channel capacity of a noiseless channel. | 25 | 10
78 | What is Binary Erasure Channel? What is its significance in communication? | 25 | 11
79 | Write the channel matrix of the Binary Erasure Channel. | 25 | 12
80 | Obtain an expression for the channel capacity of the Binary Erasure Channel. | 25 | 13
81 | Show that in the Binary Erasure Channel the channel capacity is given by C = (1-p), where p is the probability of erasure. | 25 | 16
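The capacity claim of question 81 can be checked by maximizing I(A;B) over the input distribution numerically. A sketch assuming erasure probability p = 0.25:

```python
from math import log2

# Questions 78-81: the Binary Erasure Channel passes a bit with
# probability 1-p and erases it with probability p; its capacity is
# C = 1 - p. Here I(A;B) is maximized over the input distribution.
def H(dist):
    return -sum(q * log2(q) for q in dist if q > 0)

def bec_mutual_info(w, p):
    # inputs {0,1} with p(0) = w; outputs {0, erasure, 1}
    joint = [w*(1 - p), w*p, (1 - w)*p, (1 - w)*(1 - p)]
    pb = [joint[0], joint[1] + joint[2], joint[3]]
    return H([w, 1 - w]) + H(pb) - H(joint)

p = 0.25   # assumed erasure probability
C = max(bec_mutual_info(k/1000, p) for k in range(1, 1000))
print(C)   # 0.75, i.e. 1 - p, attained at w = 1/2
```

The grid search peaks at the equiprobable input, matching the closed form C = 1 - p.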
82 | Define Z-channel; illustrate with an example. | 25 | 20
83 | Write the channel matrix and channel diagram of the Z-channel. | 25 | 21
84 | Obtain an expression for the channel capacity of the Z-channel. | 25 | 22
85 | For a binary asymmetric channel, obtain an expression for the channel capacity. | 25 | 28
86 | The channel matrix is given by [matrix not reproduced in transcript]. With p(a1) = [value not reproduced] and p(a2) = 1 - p(a1), write the channel diagram and find the channel capacity. | 25 | 37
87 | What is a decision rule in a channel? Explain with a suitable example. | 26 | 7
88 | What are the different types of decision rules in a channel? | 26 | 8
89 | Define probability of error in a channel. How can this error be minimized? | 26 | 12
90 | What is the conditional maximum likelihood decision rule? | 26 | 15
91 | Write the maximum likelihood decision rule for the matrix given below [matrix not reproduced in transcript]. Also find the probability of error PE. | 26 | 19
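The matrix for question 91 is an image lost from this transcript, so the sketch below uses an assumed 3x3 channel matrix to show the maximum-likelihood rule and the resulting probability of error:

```python
# Question 91 (sketch): maximum-likelihood decision rule
# d(b_j) = argmax_i p(b_j / a_i), which presumes equiprobable inputs.
# The matrix values below are assumed for illustration only.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.3, 0.5],
     [0.3, 0.3, 0.4]]   # P[i][j] = p(b_j / a_i), assumed values
r = len(P)
p_a = [1 / r] * r        # equiprobable inputs

# For each output column, decide in favour of the most likely input row.
decision = [max(range(r), key=lambda i: P[i][j]) for j in range(r)]

# P(correct) = sum_j p(a = d(b_j)) p(b_j / a = d(b_j)); PE = 1 - P(correct)
p_correct = sum(p_a[decision[j]] * P[decision[j]][j] for j in range(r))
P_E = 1 - p_correct
print(decision, P_E)
```

With the assumed matrix the rule maps outputs b1, b2 to input a1 and b3 to input a2 (ties broken toward the lower index).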
92 | State and prove Fano's inequality as applicable to information theory. | 26 | 41
93 | With a suitable graph explain the inequality H(A/B) ≤ H(PE) + PE log(r-1) | 26 | 44
94 | Explain the working of a single parity check code with a suitable example. | 27 | 6
95 | What are redundant codes? How does increasing the redundant bits in a redundant code decrease the probability of error? Illustrate with suitable examples. | 27 | 10
96 | With a suitable plot explain the exchange of rate for reliability in a BSC. | 27 | 25
97 | Define the Hamming distance of a code. What is its significance in coding theory? | 27 | 27
98 | Explain the 3-dimensional Hamming cube and its role while decoding the code. | 27 | 28
99 | With a suitable figure explain the maximum-likelihood decision rule for the BSC. | 27 | 33
100 | Explain the decoding procedure of the 5-repetition code. | 27 | 35
101 | How are redundant bits or check bits added in a practical system? Explain with a suitable example. | 27 | 43
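The 5-repetition code of question 100 decodes by majority vote over a BSC, so a block fails only when 3 or more of the 5 copies flip. A sketch assuming a raw crossover probability of 0.01:

```python
from math import comb

# Question 100: block error probability of an n-repetition code with
# majority-vote decoding over a BSC with crossover probability p.
def repetition_error(p, n=5):
    """Probability that more than half of the n copies are flipped (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n//2 + 1, n + 1))

p = 0.01   # assumed raw crossover probability
print(repetition_error(p))   # roughly 1e-5, far below the raw error rate p
```

This is the reliability-for-rate trade of question 96: the error rate drops by orders of magnitude, but the transmission rate falls to 1/5.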
102 | Define occupancy factor. What is its significance? | 27 | 51
103 | What must the rate reduction ratio [symbols not reproduced in transcript] be in order to achieve error-free reception? | 28 | 2
104 | If a source emits [rate not reproduced in transcript] digits per second over T seconds, then how many super messages are possible? | 28 | 13
105 | Explain the decoding procedure of received data with a proper decision rule. | 28 | 17
106 | In the decoding procedure, obtain an expression for the probability of choosing the correct vertex and the probability of choosing a wrong vertex. | 28 | 26
107 | Show that in a Binary symmetric channel the ratio [symbols not reproduced in transcript] should be less than the channel capacity in order to achieve error-free reception. | 28 | 28
108 | State and explain the Hartley-Shannon Law. | 28 | 32
109 | Define the entropy of a continuous random variable. | 28 | 35
110 | What is reference entropy? How does it differ from absolute entropy? Explain with a suitable example. | 28 | 42
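The Hartley-Shannon law of question 108 gives the capacity of a band-limited Gaussian channel as C = B log2(1 + S/N). A quick numeric sketch (the channel figures are illustrative assumptions):

```python
from math import log2

# Question 108: Hartley-Shannon law for a band-limited Gaussian channel,
# C = B * log2(1 + S/N), with B in Hz and S/N a power ratio (not dB).
def hartley_shannon(bandwidth_hz, snr):
    return bandwidth_hz * log2(1 + snr)

# e.g. an assumed 3 kHz telephone-grade channel at 30 dB SNR (S/N = 1000):
C = hartley_shannon(3000, 1000)
print(C)   # about 29,900 bits per second
```

Doubling the bandwidth doubles C, while doubling the SNR adds only one bit per second per hertz, which is why the two resources trade off so unevenly.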
111 | What are the constraints used to maximize differential entropy? | 29 | 4
112 | Derive an expression for the maximum value of differential entropy of a continuous source with Gaussian distribution and mean square error σ². | 29 | 6
113 | Derive an expression for the maximum value of differential entropy of a continuous source with a uniform distribution function. | 29 | 22
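For question 112, the maximum differential entropy under a fixed variance σ² is attained by the Gaussian and equals H(X) = (1/2) log2(2πeσ²). The closed form can be checked against direct numerical integration:

```python
from math import exp, log2, pi, e, sqrt

# Question 112: differential entropy of a Gaussian source,
# H(X) = (1/2) log2(2*pi*e*sigma^2), checked by numerically
# integrating -p(x) log2 p(x) over a wide interval.
sigma = 2.0   # assumed standard deviation
closed_form = 0.5 * log2(2 * pi * e * sigma**2)

def p(x):
    # Gaussian pdf with zero mean and variance sigma^2
    return exp(-x*x / (2 * sigma**2)) / (sqrt(2 * pi) * sigma)

dx = 0.001
H = -sum(p(k * dx) * log2(p(k * dx)) * dx for k in range(-20000, 20001))
print(closed_form, H)   # the two values agree to several decimals
```

Unlike discrete entropy, this differential entropy grows with σ and can even be negative for small σ, which is the point of the reference-vs-absolute-entropy discussion in question 110.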
114 | For the given pdf P(x) = [expression not reproduced in transcript], derive an expression for the maximum value of differential entropy of the continuous source. | 29 | 24
115 | Derive an expression for the entropy of band-limited white Gaussian noise. | 29 | 33
116 | Derive an expression for the amount of information transmitted over the continuous channel. | 29 | 46
117 | In the case of a continuous channel, show that I(X;Y) = H(X) - H(X/Y) | 29 | 48
3.3) Quiz
Sl. No. | Question | Answer
1 | In a channel matrix, each row corresponds to ________ | Input of channel
2 | In a channel matrix, each column represents ________ | Output of channel
3 | The channel equivocation is given by _________ | H(A/B)
4 | In a channel matrix, adding all the probabilities in a row gives ___ | 1
5 | For the nth extension of the channel, mutual information is given by I(A^n;B^n) = ___ | n I(A;B)
6 | A channel matrix with one and only one nonzero element in each column is a ______________ channel | Noiseless
7 | A channel matrix with one and only one nonzero element in each row is a ______________ channel | Deterministic
8 | In case of a deterministic channel, I(A;B) = _________ | H(B)
9 | When channels are cascaded, the overall information coming out is always __________ than the information given out by each channel | Less
10 | The maximum value of I(A;B) gives _______ | Channel capacity
11 | Uniform channel is also known as ________ channel | Symmetric
12 | In a noiseless channel, the channel capacity is given by _____ | log r (r = number of symbols)
13 | A channel with r input symbols and s output symbols will have ______ different possible decision rules. | r^s
14 | By increasing the energy of the signal we can __________ the probability of error. | Reduce
15 | In coding theory, as T → ∞ the probability of error tends to ________ | Zero
16 | According to Shannon's theorem, for error-free transmission the ratio [symbols not reproduced in transcript] should be ______ than Cs [channel capacity] | less
17 | For a Binary symmetric channel, the channel capacity Cs should be _____ 1. | [answer not reproduced in transcript]
18 | The relative entropy of a continuous source is given by __________ | H(X) = -∫ p(x) log2 p(x) dx
19 | The entropy of a continuous source with Gaussian distribution and variance σ² is given by _____________ | H(X) = (1/2) log2(2πeσ²)
20 | The entropy of a continuous source with uniform distribution is given by _____________ | H(X) = log2 M
3.4) True or False
Sl. No. | Statement | Answer
1 | All information channels should be of zero memory. | F
2 | The channel matrix is a conditional probability matrix. | T
3 | In a noise matrix, the probabilities in each row add to 1. | T
4 | A Binary symmetric Channel should have only two input symbols. | T
5 | In a channel matrix, adding all the probabilities in a column gives 1. | F
6 | In a system, if the output symbol probabilities and the channel matrix are known, then we can find the input symbol probabilities. | F
7 | In a system, if the input symbol probabilities and the channel matrix are known, then we can find the output symbol probabilities. | T
8 | A compact code for one set of statistics will not in general be a compact code for another set of statistics. | T
9 | A sequence of code words from a known sequence of a uniquely decodable code is uniquely decodable. | F
10 | Mutual information can also be negative. | F
11 | If events A and B are statistically independent, then the mutual information I(A;B) = 0. | T
12 | A Binary symmetric Channel with probability of error p = 1 is a noiseless channel. | T
13 | In case of a noiseless channel, I(A;B) = H(B). | F
14 | When channels are cascaded, the overall information coming out is always greater than the information given out by each channel. | F
15 | The information will always leak in the channel. | T
16 | The mutual information of a reduced channel is equal to that of the original channel when p(a/bi) = p(a/bj) for all a. | T
17 | The minimum value of I(A;B) gives the channel capacity. | F
18 | Channel capacity is a function of the input symbol probabilities. | F
19 | In a uniform channel, all the elements in the first row repeat in the second and subsequent rows, but not in the same order. | T
20 | Uniform channel is also known as Asymmetric channel. | F
21 | The channel matrix given by [matrix not reproduced in transcript] is a weakly symmetric channel. | T
22 | In a weakly symmetric channel, every row is a permutation of the other rows and all column sums are the same. | T
23 | In case of a deterministic channel, I(A;B) = H(B). | T
24 | In the Binary Erasure Channel, the channel capacity is given by C = p. | F
25 | In the Z-channel, both input symbols are received without error. | F
26 | Shannon's second theorem deals with the amount of error-free information we can get through the channel. | T
27 | The conditional maximum likelihood decision rule depends upon the a priori probabilities. | T
28 | As long as channel noise exists in a channel, we can have error-free communication. | F
29 | For a given signal power, energy can be increased by reducing the rate of transmission. | T
30 | By increasing the rate of transmission, we can reduce the probability of error. | F
31 | We can have error-free communication by adding sufficient redundancy in the code. | T
32 | In practice, the probability of error can be made small as long as the rate of transmission is greater than the channel capacity. | F
33 | If redundancy increases, the bandwidth required for transmission also increases. | T
34 | In coding theory, as T → ∞ the occupancy factor tends to zero. | T
35 | In coding theory, as T → ∞ we can have error-free transmission. | T
36 | The ratio [symbols not reproduced in transcript] should be greater than the channel capacity in order to achieve error-free reception. | F
37 | In an Ergodic process, the time average and the ensemble average differ. | F
3.5) FAQ
Sl. No. | FAQ | Video No. | Time (min)
1 | Define Information Channel. | 20 | 2
2 | What is Binary symmetric Channel [BSC]? What is its role in modern digital communication? | 20 | 7
3 | What is the significance of channel extension? Write the channel matrix of the 2nd extension of the Binary symmetric Channel. | 20 | 23
4 | What is the function of the information channel in a communication system? | 20 | 26
5 | Define Mutual Information and obtain an expression for the same. | 21 | 23
6 | Define the average codeword length of the channel and obtain expressions for the upper and lower bounds of the same. | 21 | 11
7 | Define the average codeword length for the nth extension of the channel and obtain expressions for the upper and lower bounds of the same. | 21 | 14
8 | Define Shannon's first theorem and derive an expression for the same as applicable to channels. | 21 | 17
9 | Write the Venn diagram of channel entropies. | 21 | 42
10 | Define noiseless channel and give one example for the same. | 22 | 6
11 | Define deterministic channel and give one example for the same. | 22 | 11
12 | Obtain an expression for mutual information in case of a noiseless channel. | 22 | 16
13 | Show that in case of a noiseless channel I(A;B) = H(A). | 22 | 18
14 | Show that in case of a deterministic channel I(A;B) = H(B). | 22 | 19
15 | Show that when two binary symmetric channels are connected in cascade [expression not reproduced in transcript], where p is the probability of error in reception. | 22 | 46
16 | What is reduction of a channel matrix? What is its significance? | 23 | 7
17 | Derive the condition such that the mutual information of a reduced channel is equal to that of the original channel. | 23 | 23
18 | Briefly discuss the additive property of mutual information when channels are connected in cascade. | 23 | 35
19 | What is channel capacity? What is its significance in communication? | 24 | 37
20 | Define uniform channel. What is its significance in communication? | 24 | 43
21 | What is Binary Erasure Channel? What is its significance in communication? | 25 | 11
22 | Write the channel matrix of the Binary Erasure Channel. | 25 | 12
23 | Obtain an expression for the channel capacity of the Binary Erasure Channel. | 25 | 13
24 | Show that in the Binary Erasure Channel the channel capacity is given by C = (1-p), where p is the probability of erasure. | 25 | 16
25 | Define Z-channel; illustrate with an example. | 25 | 20
26 | What is the conditional maximum likelihood decision rule? | 26 | 15
27 | State and prove Fano's inequality as applicable to information theory. | 26 | 41
28 | With a suitable figure explain the maximum-likelihood decision rule for the BSC. | 27 | 33
29 | Define occupancy factor. What is its significance? | 27 | 51
30 | State and explain the Hartley-Shannon Law. | 28 | 32
31 | Define the entropy of a continuous random variable. | 28 | 35
3.6) Assignment Questions
Sl. No.
Questions
1 In a Binary Channel the channel matrix is given by [matrix not reproduced in transcript]
With p{a=0}=0.65 and p{a=1}=0.35:
i) Write the noise diagram ii) Find the probabilities of the output symbols iii) Find the backward probabilities
2 In a Binary Channel the channel matrix is given by [matrix not reproduced in transcript]
With p{a=0}=3/4 and p{a=1}=1/4, find the a priori and a posteriori entropies of the input source.
3 What is the most efficient method of coding from a source?
4 The channel matrix is given by [matrix not reproduced in transcript]
Write the channel diagram
5 The channel matrix is given by
Write the channel diagram
6 The channel matrix is given by
Write the sufficient reduction matrix.
7 In a Binary symmetric Channel the channel matrix is given by [matrix not reproduced in transcript]
With p{a=0}=0.6 and p{a=1}=0.4, where p is the probability of error in reception, find:
i) The probabilities of a repetitive BSC ii) I(A;B,C)
8 The channel matrix is given by [matrix not reproduced in transcript]
With p(a1) = [value not reproduced] and p(a2) = 1 - p(a1), write the channel diagram and find the channel capacity.
9 Write the maximum likelihood decision rule for the matrix given below
Also find the probability of error PE.
10 The channel matrix is given by
With p(a1) = 0.55 and p(a2) = 0.45, write the channel diagram and find the channel capacity.
3.7) Additional Links
Additional Links
http://www.youtube.com/watch?v=C-o2jcLFxyk&list=PLWMqMAYxtBM-IeOSmNkT-KEcgru8EkzCs
http://www.youtube.com/watch?v=R4OlXb9aTvQ
http://www.youtube.com/watch?v=JnJq3Py0dyM
http://www.yovisto.com/video/20224
http://www.youtube.com/watch?v=UrefKMSEuAI&list=PLE125425EC837021F
http://elearning.vtu.ac.in/EC63.html
http://www.cs.toronto.edu/~mackay/itprnn/book.pdf
http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf
http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf
http://www-public.it-sudparis.eu/~uro/cours-pdf/poly.pdf
Sl. No.
Topic Web Links
1 Introduction to information channel
http://www.exp-math.uni-essen.de/~vinck/information%20theory/lecture%202013%20info%20theory/chapter%204%20channel-coding%20BW.pdf
http://www.stanford.edu/~montanar/RESEARCH/BOOK/partA.pdf
http://poincare.matf.bg.ac.rs/nastavno/viktor/Channel_Capacity.pdf#page=1&zoom=auto,0,654
http://www-public.it-sudparis.eu/~uro/cours-pdf/poly.pdf
2 Equivocation and mutual information
http://www.ece.uvic.ca/~agullive%20/joint.pdf
http://skynet.ee.ic.ac.uk/notes/CS_2011_3_comm_channels.pdf
http://www2.tu-ilmenau.de/nt/de/teachings/vorlesungen/itsc_master/folien/script.pdf
http://www-public.it-sudparis.eu/~uro/cours-pdf/poly.pdf
3 Properties of different information channels
http://paginas.fe.up.pt/~vinhoza/itpa/lecture3.pdf
http://www2.maths.lth.se/media/thesis/2012/hampus-wessman-MATX01.pdf
https://www.ti.rwth-aachen.de/teaching/ti/data/save_dir/ti1/WS1011/chap2_handouts.pdf
http://people.csail.mit.edu/madhu/FT02/scribe/lect02.pdf
4 Reduction of information channel
http://www.cims.nyu.edu/~chou/notes/infotheory.pdf
http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf
http://www-public.it-sudparis.eu/~uro/cours-pdf/poly.pdf
5 Average codeword length
http://math.ntnu.edu.tw/~li/note/Code.pdf
http://nptel.ac.in/courses/Webcourse-contents/IIT%20Kharagpur/Multimedia%20Processing/pdf/ssg_m2l3.pdf
http://people.csail.mit.edu/madhu/FT02/scribe/lect02.pdf
https://www.ti.rwth-aachen.de/teaching/ti/data/save_dir/ti1/WS1011/chap2_handouts.pdf
6 nth extension of the channel
http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf
http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf
http://math.ntnu.edu.tw/~li/note/Code.pdf
https://www.ti.rwth-aachen.de/teaching/ti/data/save_dir/ti1/WS1011/chap2_handouts.pdf
7 Shannon first theorem
http://chamilo2.grenet.fr/inp/courses/PHELMAA3SIC5PMSCSF0/document/M2R_SIPT/Info_Th_ChI_II_III.pdf
http://www.cims.nyu.edu/~chou/notes/infotheory.pdf
http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf
http://www.math.uchicago.edu/~may/VIGRE/VIGRE2008/REUPapers/Biswas.pdf
8 Venn diagram of channel entropies
http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf
http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf
https://www.cs.uic.edu/pub/ECE534/WebHome/ch2.pdf
https://www.cs.princeton.edu/picasso/mats/intro-to-info_jp.pdf
9 Noiseless channel
http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf
http://chamilo2.grenet.fr/inp/courses/PHELMAA3SIC5PMSCSF0/document/M2R_SIPT/Info_Th_ChI_II_III.pdf
http://www-public.it-sudparis.eu/~uro/cours-pdf/poly.pdf
http://web.ntpu.edu.tw/~phwang/teaching/2012s/IT/slides/chap07.pdf
10 Channel capacity
http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf
http://chamilo2.grenet.fr/inp/courses/PHELMAA3SIC5PMSCSF0/document/M2R_SIPT/Info_Th_ChI_II_III.pdf
http://www.icg.isy.liu.se/courses/infotheory/lect5.pdf
http://poincare.matf.bg.ac.rs/nastavno/viktor/Channel_Capacity.pdf
11 Uniform channel http://poincare.matf.bg.ac.rs/nastavno/viktor/Channel_Capacity.pdf
http://chamilo2.grenet.fr/inp/courses/PHELMAA3SIC5PMSCSF0/document/M2R_SIPT/Info_Th_ChI_II_III.pdf
http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf
12 Binary Erasure Channel
https://www.ti.rwth-aachen.de/teaching/ti/data/save_dir/ti1/WS1011/chap3_handouts.pdf
http://poincare.matf.bg.ac.rs/nastavno/viktor/Channel_Capacity.pdf
http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf
http://www.inf.ed.ac.uk/teaching/courses/it/2012/week6.pdf
3.8) Test your skill
Sl. No.
Questions
1 Determine the rate of transmission of information through a channel whose noise characteristics are as shown in the figure [figure not reproduced in transcript]. Given P(x1) = P(x2) = 1/2. Assume rs = 20,000 symbols/sec.
2 In a Binary Channel the channel matrix is given by [matrix not reproduced in transcript]
With p{a=0}=0.55 and p{a=1}=0.45:
i) Write the noise diagram ii) Find the probabilities of the output symbols iii) Find the backward probabilities
3 For the given channel matrix, compute the mutual information I(X;Y) with P(x1) = 0.45 and P(x2) = 0.55

P(Y/X) =
         y1     y2     y3
  x1 [  2/3    1/3     0  ]
  x2 [   0     1/6    5/6 ]
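A numerical answer to skill question 3 can be sketched directly, assuming the channel matrix P(Y/X) = [[2/3, 1/3, 0], [0, 1/6, 5/6]] as best recovered from the garbled transcript:

```python
from math import log2

# Skill question 3: I(X;Y) for a 2-input, 3-output channel with
# P(x1) = 0.45 and P(x2) = 0.55. The matrix entries are taken from the
# garbled transcript and should be checked against the original source.
P = [[2/3, 1/3, 0.0],
     [0.0, 1/6, 5/6]]
px = [0.45, 0.55]

joint = [[px[i] * P[i][j] for j in range(3)] for i in range(2)]
py = [joint[0][j] + joint[1][j] for j in range(3)]

def H(vals):
    return -sum(v * log2(v) for v in vals if v > 0)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I_XY = H(px) + H(py) - H([v for row in joint for v in row])
print(I_XY)   # about 0.76 bits/symbol
```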
4 In a Binary symmetric Channel the channel matrix is given by
i) Write the channel matrix for 2nd and 3rd extension and ii) Write the reduced matrix for 2nd extension
5 A transmitter transmits five symbols with probabilities 0.2, 0.3, 0.2, 0.1 and 0.2. Given the channel matrix P(B/A), calculate
i) H(A), H(B), H(A/B), H(B/A) ii) H(A,B) and I(A;B)

P(B/A) =
  [  1     0     0     0  ]
  [ 1/4   3/4    0     0  ]
  [  0    1/3   2/3    0  ]
  [  0     0    1/3   2/3 ]
  [  0     0     1     0  ]
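Skill question 5 can likewise be computed numerically, assuming the P(B/A) matrix as best recovered from the garbled transcript (the entry order is uncertain there and should be checked against the original source):

```python
from math import log2

# Skill question 5: channel entropies for a 5-input, 4-output channel.
# The matrix values below are read from the garbled transcript and are
# therefore an assumption; each row sums to 1 as a sanity condition.
pa = [0.2, 0.3, 0.2, 0.1, 0.2]
P = [[1,   0,   0,   0  ],
     [1/4, 3/4, 0,   0  ],
     [0,   1/3, 2/3, 0  ],
     [0,   0,   1/3, 2/3],
     [0,   0,   1,   0  ]]

joint = [[pa[i] * P[i][j] for j in range(4)] for i in range(5)]
pb = [sum(row[j] for row in joint) for j in range(4)]

def H(vals):
    return -sum(v * log2(v) for v in vals if v > 0)

H_A, H_B = H(pa), H(pb)
H_AB = H([v for row in joint for v in row])
H_A_given_B = H_AB - H_B     # H(A/B)
H_B_given_A = H_AB - H_A     # H(B/A)
I_AB = H_A + H_B - H_AB      # I(A;B)
print(H_A, H_B, H_AB, H_A_given_B, H_B_given_A, I_AB)
```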
6 Determine the capacity of the channel shown in the figure [figure not reproduced in transcript].
7 In a communication system, a transmitter has 3 input symbols A = {a1, a2, a3} and the receiver has 3 output symbols B = {b1, b2, b3}. The matrix given below shows the joint probability matrix (JPM) with some marginal probabilities [matrix not reproduced in transcript]:
i) Find the missing probabilities (*) in the table. ii) Find P(b3/a1) and P(a1/b3). iii) Are the events a1 and b1 statistically independent? Why?