Code No: 07A5EC09 Set No. 3
10) A source transmitting ‘m’ messages is connected to a noise-free channel. The capacity of the channel is [ ]
a) m bits/symbol b) m² bits/symbol c) log m bits/symbol d) 2m bits/symbol
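A minimal sketch of the idea behind question 10 (the function and variable names are my own): for a noise-free channel carrying m equally likely messages, the capacity equals the source entropy, log₂ m bits/symbol.

```python
import math

def noiseless_capacity(m: int) -> float:
    """Capacity of a noise-free channel carrying m equally likely
    messages: C = log2(m) bits/symbol (the source entropy)."""
    return math.log2(m)

print(noiseless_capacity(8))  # 8 equiprobable messages -> 3.0 bits/symbol
```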
II Fill in the blanks:
11) Assuming all 26 characters are equally likely, the average information content of the English language in bits/character is ________________
12) The distance between two vectors c1 and c2, defined as the number of components in which they differ, is called ____________________
13) The minimum distance of a linear block code is equal to ____________________ of any non-zero codeword in the code.
14) A linear block code with a minimum distance d_min can detect up to ___________________
15) For a linear block code, code rate = _________________
16) The information rate of a source is also referred to as entropy measured in ______________
17) H(X,Y) = ______________ or __________________
18) Capacity of a noise-free channel is _________________
19) The Shannon limit is ______________
20) The channel capacity with infinite bandwidth is not infinite because ____________
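A minimal sketch of the formulas behind blanks 11, 19, and 20 (all names here are my own): the entropy of 26 equiprobable characters, the Shannon limit on Eb/N0, and the finite limit of C = B·log₂(1 + S/(N₀B)) as the bandwidth B grows without bound.

```python
import math

# Blank 11: entropy of 26 equally likely characters,
# log2(26) ~ 4.70 bits/character
H_english = math.log2(26)

# Blank 19: the Shannon limit on Eb/N0 for reliable communication
# is ln(2) ~ 0.693, i.e. about -1.6 dB
eb_n0_limit_db = 10 * math.log10(math.log(2))

# Blank 20: C = B*log2(1 + S/(N0*B)) stays finite as B -> infinity,
# approaching (S/N0)*log2(e) ~ 1.44*S/N0
def capacity(B: float, S: float, N0: float) -> float:
    return B * math.log2(1 + S / (N0 * B))

S, N0 = 1.0, 1.0
print(H_english, eb_n0_limit_db)
print(capacity(1e9, S, N0), (S / N0) * math.log2(math.e))  # both ~1.44
```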
-oOo-
Code No: 07A5EC09 Set No. 4
JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY HYDERABAD
III B.Tech. I Sem., II Mid-Term Examinations, November – 2010
DIGITAL COMMUNICATIONS
Objective Exam
Name: ______________________________ Hall Ticket No.
Answer All Questions. All Questions Carry Equal Marks. Time: 20 Min. Marks: 20.
I Choose the correct alternative:
1) For the data word 1110 in a (7, 4) non-systematic cyclic code with the generator polynomial 1 + x + x³, the code polynomial is [ ]
a) 1 + x + x³ + x⁵ b) 1 + x² + x³ + x⁵ c) 1 + x² + x³ + x⁴ d) 1 + x⁴ + x⁵
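In a non-systematic cyclic code the code polynomial is the GF(2) product c(x) = d(x)·g(x); a minimal sketch for question 1 (coefficient lists run lowest degree first, an ordering I am assuming):

```python
def gf2_poly_mul(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists,
    lowest degree first; coefficient addition is mod 2 (XOR)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

d = [1, 1, 1, 0]   # data word 1110 -> d(x) = 1 + x + x^2
g = [1, 1, 0, 1]   # generator g(x) = 1 + x + x^3
print(gf2_poly_mul(d, g))  # [1, 0, 0, 0, 1, 1, 0] -> c(x) = 1 + x^4 + x^5
```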
2) A source X with entropy 2 bits/message is connected to the receiver Y through a noise-free channel. The conditional entropy of the source is H(X/Y) and the joint entropy of the source and the receiver is H(X, Y). Then [ ]
a) H(X,Y) = 2 bits/message b) H(X/Y) = 2 bits/message
c) H(X,Y) = 0 bits/message d) H(X/Y) = 1 bit/message
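A small numerical check of the reasoning behind question 2 (the toy distribution is my own): over a noise-free channel the output determines the input, so H(X/Y) = 0 and H(X,Y) = H(X).

```python
import math

# Noise-free channel: 4 equiprobable inputs, each mapped to a distinct
# output, so p(x, y) is 1/4 on the diagonal and 0 elsewhere.
p_xy = [[0.25 if x == y else 0.0 for y in range(4)] for x in range(4)]

def H(probs):
    """Entropy in bits of a probability mass function."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_joint = H([p for row in p_xy for p in row])             # H(X,Y)
H_y = H([sum(row[y] for row in p_xy) for y in range(4)])  # H(Y)
print(H_joint, H_joint - H_y)  # 2.0 bits, and H(X/Y) = H(X,Y) - H(Y) = 0.0
```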
3) Which of the following is a p(Y/X) matrix for a binary erasure channel [ ]
a) [1−p  p; q  1−q]  b) [option matrix missing]  c) [option matrix missing]  d) None
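For reference on question 3 (the layout here is mine): the standard binary erasure channel with erasure probability p has inputs {0, 1} and outputs {0, erasure, 1}, so its p(Y/X) matrix is 2×3 with rows [1−p, p, 0] and [0, p, 1−p].

```python
import numpy as np

def bec_matrix(p: float) -> np.ndarray:
    """p(Y/X) for a binary erasure channel with erasure probability p;
    rows are inputs 0 and 1, columns are outputs 0, erasure, 1."""
    return np.array([[1 - p, p, 0.0],
                     [0.0,   p, 1 - p]])

M = bec_matrix(0.1)
print(M)
print(M.sum(axis=1))  # each row sums to 1, as any channel matrix must
```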
4) The channel matrix of a noiseless channel [ ]
a) consists of a single nonzero number in each column. b) consists of a single nonzero number in each row.
c) is an identity matrix. d) is a square matrix.
5) Information content of a message [ ]
a) increases with its certainty of occurrence. b) is independent of the certainty of occurrence.
c) increases with its uncertainty of occurrence. d) is the logarithm of its uncertainty of occurrence.
6) The channel capacity of a BSC with the transition probability ½ is [ ]
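A minimal sketch of the formula behind question 6 (the function name is my own): a BSC with crossover probability p has capacity C = 1 − H₂(p), which vanishes at p = 1/2.

```python
import math

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover
    probability p: C = 1 - H2(p) bits/symbol."""
    if p in (0.0, 1.0):
        return 1.0
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - h2

print(bsc_capacity(0.5))  # 0.0 -- a BSC with p = 1/2 conveys no information
```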