b. Self Information
c. Logarithmic measure of information
56. Write short notes on the following:
a. Entropy
b. Conditional entropy
c. Mutual information
d. Information
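The four terms in question 56 are tied together by a handful of standard definitions. A compact reference, for a discrete source X with symbol probabilities p(x_i) and a second variable Y:

```latex
\begin{align*}
  I(x_i)      &= \log_2 \frac{1}{p(x_i)}                          && \text{self-information of symbol } x_i \text{ (bits)} \\
  H(X)        &= -\sum_i p(x_i)\,\log_2 p(x_i)                    && \text{entropy: average information per symbol} \\
  H(X \mid Y) &= -\sum_{i,j} p(x_i, y_j)\,\log_2 p(x_i \mid y_j)  && \text{conditional entropy} \\
  I(X;Y)      &= H(X) - H(X \mid Y)                               && \text{mutual information}
\end{align*}
```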
57. A DMS has an alphabet of eight letters, xi, i = 1, 2, …, 8, with probabilities
0.36, 0.14, 0.13, 0.12, 0.10, 0.09, 0.04, 0.02.
i. Use the Huffman encoding procedure to determine a binary code for the source output.
ii. Determine the entropy of the source and find the efficiency of the code.
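A quick cross-check for question 57: the sketch below (Python, standard library only) builds a Huffman code for the given probabilities and reports the entropy, average codeword length, and code efficiency. The symbol names x1…x8 are placeholders.

```python
import heapq
from math import log2

# Probabilities from question 57
probs = {f"x{i+1}": p for i, p in enumerate(
    [0.36, 0.14, 0.13, 0.12, 0.10, 0.09, 0.04, 0.02])}

def huffman(probs):
    """Return a binary Huffman code {symbol: codeword}."""
    # Heap entries: (probability, tiebreaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # two least probable nodes
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

code = huffman(probs)
H = -sum(p * log2(p) for p in probs.values())          # source entropy (bits/symbol)
L = sum(probs[s] * len(w) for s, w in code.items())    # average codeword length
print(code)
print(f"H = {H:.4f} bits/symbol, L = {L:.4f}, efficiency = {H / L:.4f}")
```

Since these probabilities are not dyadic, the efficiency H/L will come out strictly below 1.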
58. A DMS has an alphabet of eight letters, xi, i = 1, 2, …, 8, with probabilities
{0.05, 0.1, 0.1, 0.15, 0.05, 0.25, 0.3}.
i. Use the Shannon–Fano coding procedure to determine a binary code for the source output.
ii. Determine the entropy of the source and find the efficiency of the code.
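As with the Huffman question, the Shannon–Fano construction in question 58 can be scripted for checking. The sketch below uses the seven probabilities exactly as printed (they already sum to 1) and labels the symbols x1, x2, … as placeholders.

```python
from math import log2

# Probabilities as printed in question 58
probs = [0.05, 0.1, 0.1, 0.15, 0.05, 0.25, 0.3]

def shannon_fano(items):
    """items: list of (symbol, probability) sorted in decreasing probability.
    Returns {symbol: codeword} built by recursively splitting the list into
    two groups of as nearly equal total probability as possible."""
    if len(items) == 1:
        return {items[0][0]: ""}
    total = sum(p for _, p in items)
    running, split, best = 0.0, 1, float("inf")
    for k in range(1, len(items)):
        running += items[k - 1][1]
        if abs(total - 2 * running) < best:
            best, split = abs(total - 2 * running), k
    upper = {s: "0" + w for s, w in shannon_fano(items[:split]).items()}
    lower = {s: "1" + w for s, w in shannon_fano(items[split:]).items()}
    return {**upper, **lower}

items = sorted(enumerate(probs), key=lambda t: -t[1])   # (index, p), high to low
code = shannon_fano([(f"x{i+1}", p) for i, p in items])
H = -sum(p * log2(p) for p in probs)
L = sum(p * len(code[f"x{i+1}"]) for i, p in enumerate(probs))
print(code)
print(f"H = {H:.4f} bits/symbol, L = {L:.4f}, efficiency = {H / L:.4f}")
```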
59. An analog signal band-limited to 10 kHz is quantized into 8 levels of a PCM system with probabilities 1/4, 1/5, 1/5, 1/10, 1/10, 1/20, 1/20, and 1/20 respectively. Find the entropy and the rate of information.
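A worked outline for question 59, under two assumptions: the eighth probability is taken as 1/10 (the only value that makes the eight probabilities sum to 1, since only seven were printed), and the signal is sampled at the Nyquist rate.

```latex
\begin{align*}
  H &= \sum_i p_i \log_2 \frac{1}{p_i}
     = \tfrac{1}{4}\log_2 4 + 2\cdot\tfrac{1}{5}\log_2 5
       + 2\cdot\tfrac{1}{10}\log_2 10 + 3\cdot\tfrac{1}{20}\log_2 20
     \approx 2.74 \ \text{bits/sample} \\
  r &= 2 f_m = 2 \times 10\,\text{kHz} = 20{,}000 \ \text{samples/s} \\
  R &= r\,H \approx 20{,}000 \times 2.74 \approx 54.8 \ \text{kbits/s}
\end{align*}
```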
60. Explain the various methods of describing convolutional codes.
61. Explain block codes, in which each block of k message bits is encoded into a block of n > k bits, with an example.
62. Consider a (6, 3) linear block code with the generator matrix
G = [ 1 0 0 0 1 1 ]
    [ 0 1 0 1 0 1 ]
    [ 0 0 1 1 1 0 ]
Find
a) All the code vectors of this code.
b) The parity-check matrix for this code.
c) The minimum weight of the code.
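Question 62 is straightforward to verify by enumeration. The sketch below (Python with NumPy) generates all 2³ codewords as c = mG (mod 2), derives the parity-check matrix from the systematic form G = [I | P] as H = [Pᵀ | I], and reports the minimum nonzero weight.

```python
import itertools

import numpy as np

# Generator matrix from question 62, in systematic form G = [I_3 | P]
G = np.array([[1, 0, 0, 0, 1, 1],
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 1, 1, 0]])
k, n = G.shape

# a) All 2^k code vectors: c = m.G (mod 2) for every message m
codewords = [(tuple(m), tuple(np.dot(m, G) % 2))
             for m in itertools.product([0, 1], repeat=k)]
for m, c in codewords:
    print(m, "->", c)

# b) Parity-check matrix: for G = [I_k | P], H = [P^T | I_(n-k)]
P = G[:, k:]
H = np.hstack([P.T, np.eye(n - k, dtype=int)])
print("H =\n", H)
# Sanity check: G.H^T = 0 (mod 2) for a valid parity-check matrix
assert not (np.dot(G, H.T) % 2).any()

# c) Minimum weight (= minimum Hamming distance for a linear code)
d_min = min(sum(c) for _, c in codewords if any(c))
print("minimum weight =", d_min)
```

For this generator matrix the minimum weight works out to 3, so the code can detect double errors and correct single errors.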
63. What are the hardware components required to implement a cyclic code encoder?
64. Explain syndrome calculation, error correction, and error detection in (n, k) cyclic codes.
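As a concrete illustration for question 64 (the specific code is an assumption, not part of the question): in a cyclic code the syndrome is the remainder of the received polynomial r(x) divided by the generator polynomial g(x) over GF(2). The sketch below computes it for an assumed (7, 4) code with g(x) = x³ + x + 1.

```python
def gf2_mod(dividend, divisor):
    """Remainder of polynomial division over GF(2); polynomials are stored as
    integers with bit i holding the coefficient of x^i."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        shift = dividend.bit_length() - dlen
        dividend ^= divisor << shift   # subtraction over GF(2) is XOR
    return dividend

# Assumed example: (7, 4) cyclic code with generator g(x) = x^3 + x + 1
g = 0b1011
# Received word r(x) = x^6 + x^5 + x^4 + x (leftmost bit = coefficient of x^6)
r = 0b1110010

s = gf2_mod(r, g)                      # syndrome s(x) = r(x) mod g(x)
print(f"syndrome = {s:03b}")
# s == 0 -> no detectable error: the received word is a valid codeword.
# s != 0 -> an error is detected; for correction the syndrome indexes a
#           table of correctable error patterns (e.g. single-bit errors).
```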
65. Briefly discuss the linear block code error control technique.