
a) increases with its certainty of occurrence b) is independent of the certainty of occurrence
c) increases with its uncertainty of occurrence d) is the logarithm of its uncertainty of occurrence

3) The channel capacity of a BSC with transition probability ½ is [ ]
a) 0 bits b) 1 bit c) 2 bits d) infinity
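
A quick check for question 3 (an added sketch, not part of the original paper), using the standard BSC capacity formula:

    C = 1 - H_b(p), \qquad H_b(p) = -p\log_2 p - (1-p)\log_2(1-p)

At p = 1/2, H_b(1/2) = 1, so C = 1 - 1 = 0 bits.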

4) For the data word 1110 in a (7, 4) non-systematic cyclic code with the generator polynomial 1 + x^2 + x^3, the code polynomial is [ ]
a) 1 + x + x^3 + x^5 b) 1 + x^2 + x^3 + x^5 c) 1 + x^2 + x^3 + x^4 d) 1 + x + x^5
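
A worked sketch for questions 4 and 8 (added, not part of the original paper): non-systematic cyclic encoding forms the code polynomial as c(x) = d(x) g(x) over GF(2). The bit-to-coefficient mapping below (leftmost data bit = constant term, so 1110 gives d(x) = 1 + x + x^2) is an assumption.

    def gf2_poly_mul(a, b):
        # Multiply two GF(2) polynomials given as coefficient lists,
        # with index i holding the coefficient of x^i.
        out = [0] * (len(a) + len(b) - 1)
        for i, ai in enumerate(a):
            for j, bj in enumerate(b):
                out[i + j] ^= ai & bj  # addition in GF(2) is XOR
        return out

    d  = [1, 1, 1, 0]  # assumed mapping: 1110 -> d(x) = 1 + x + x^2
    g4 = [1, 0, 1, 1]  # question 4: g(x) = 1 + x^2 + x^3
    g8 = [1, 1, 0, 1]  # question 8: g(x) = 1 + x + x^3
    print(gf2_poly_mul(d, g4))  # [1, 1, 0, 0, 0, 1, 0] -> 1 + x + x^5
    print(gf2_poly_mul(d, g8))  # [1, 0, 0, 0, 1, 1, 0] -> 1 + x^4 + x^5

Under this assumed bit ordering, the products correspond to option d) in both questions.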

5) A source transmitting 'm' messages is connected to a noise-free channel. The capacity of the channel is [ ]
a) m bits/symbol b) m^2 bits/symbol c) log m bits/symbol d) 2m bits/symbol
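
For question 5 (an added note): a noise-free channel carrying m equally likely messages achieves the maximum source entropy, so

    C = \max H(X) = \log_2 m \ \text{bits/symbol}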

6) Which of the following is a p(Y/X) matrix for a binary symmetric channel? [ ]
a) [matrix not reproduced] b) [matrix not reproduced] c) [matrix not reproduced] d) None
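
The option matrices for question 6 did not survive extraction. For reference (an added note, not necessarily one of the printed options), the standard BSC transition matrix with crossover probability p is

    P(Y\mid X) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}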

7) Exchange between channel bandwidth and (S/N) ratio can be adjusted based on [ ]
a) Shannon's limit b) Shannon's source coding
c) Shannon's channel coding d) Shannon–Hartley theorem
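
For question 7 (an added note), the Shannon–Hartley theorem makes the bandwidth–SNR exchange explicit:

    C = B \log_2\!\left(1 + \frac{S}{N}\right)

For a fixed capacity C, bandwidth B can be traded against the signal-to-noise ratio S/N.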

8) For the data word 1110 in a (7, 4) non-systematic cyclic code with the generator polynomial 1 + x + x^3, the code polynomial is [ ]
a) 1 + x + x^3 + x^5 b) 1 + x^2 + x^3 + x^5 c) 1 + x^2 + x^3 + x^4 d) 1 + x^4 + x^5

9) A source X with entropy 2 bits/message is connected to a receiver Y through a noise-free channel. The conditional entropy of the source given the receiver is H(X/Y), and the joint entropy of the source and the receiver is H(X, Y). Then [ ]
a) H(X, Y) = 2 bits/message b) H(X/Y) = 2 bits/message
c) H(X, Y) = 0 bits/message d) H(X/Y) = 1 bit/message
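
A short check for question 9 (an added sketch): over a noise-free channel the received symbol determines the transmitted one, so

    H(X\mid Y) = 0, \qquad H(X, Y) = H(Y) + H(X\mid Y) = H(X) = 2 \ \text{bits/message}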


10) Which of the following is a p(Y/X) matrix for a binary erasure channel? [ ]
a) [1-p  p; q  1-q] b) [matrix not reproduced] c) [matrix not reproduced] d) None
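
The remaining option matrices for question 10 did not survive extraction. For reference (an added note), the standard transition matrix of a binary erasure channel with erasure probability p, with the middle column corresponding to the erasure output, is

    P(Y\mid X) = \begin{bmatrix} 1-p & p & 0 \\ 0 & p & 1-p \end{bmatrix}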

II Fill in the blanks:<br />

11) The information rate of a source is also referred to as entropy measured in ______________

12) H(X, Y) = ______________ or __________________

13) Capacity of a noise-free channel is _________________

14) Shannon's limit is ______________

15) The channel capacity with infinite bandwidth is not infinite because ____________

16) Assuming all 26 characters are equally likely, the average information content of the English language in bits/character is ________________
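
A quick computation for question 16 (an added note):

    H = \log_2 26 \approx 4.7 \ \text{bits/character}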

17) The distance between two vectors c1 and c2, defined as the number of components in which they differ, is called ____________________

18) The minimum distance of a linear block code is equal to ____________________ of any non-zero code word in the code.

19) A linear block code with a minimum distance d_min can detect up to ___________________

20) For a linear block code, code rate = _________________
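
For reference on question 20 (an added note): an (n, k) linear block code carries k message bits in an n-bit code word, so

    R = \frac{k}{n}

e.g. the (7, 4) code in questions 4 and 8 has R = 4/7.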

-oOo-
