3. Under error-free reception, the syndrome vector computed for the received cyclic code word consists of

(a) alternate 1's and 0's starting with a 1

(b) all ones

(c) all zeros

(d) alternate 0's and 1's starting with a 0
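
A minimal sketch of the syndrome computation behind this question: a cyclic codeword is divisible by the generator polynomial g(x), so error-free reception leaves a zero remainder. The (7,4) generator g(x) = x^3 + x + 1 and the message used below are assumed for illustration only, not taken from the question.

```python
# Sketch of cyclic-code syndrome checking over GF(2); generator and message are assumed.
def gf2_remainder(dividend, divisor):
    """Polynomial remainder over GF(2); polynomials given as bit lists, MSB first."""
    rem = list(dividend)
    for i in range(len(dividend) - len(divisor) + 1):
        if rem[i]:
            for j, d in enumerate(divisor):
                rem[i + j] ^= d
    return rem[-(len(divisor) - 1):]

g = [1, 0, 1, 1]                      # g(x) = x^3 + x + 1
msg = [1, 0, 1, 0]                    # 4 message bits
parity = gf2_remainder(msg + [0, 0, 0], g)
codeword = msg + parity               # systematic (7,4) codeword
print(gf2_remainder(codeword, g))     # error-free reception -> syndrome [0, 0, 0]
```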

4. Source 1 is transmitting two messages with probabilities 0.2 and 0.8, and Source 2 is transmitting two messages with probabilities 0.5 and 0.5. Then

(a) Maximum uncertainty is associated with Source 1

(b) Both sources 1 and 2 have the maximum amount of uncertainty associated

(c) There is no uncertainty associated with either of the two sources

(d) Maximum uncertainty is associated with Source 2
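
A quick numerical check of the two source entropies, using the probabilities given in the question; the small entropy helper below is only a sketch:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits per message."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.2, 0.8]))   # ~0.722 bits (Source 1)
print(entropy([0.5, 0.5]))   # 1.0 bit (Source 2) -- the maximum for two messages
```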

5. The Hamming weight of the (6,3) linear block coded word 101011 is

(a) 5

(b) 4

(c) 2

(d) 3
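
The Hamming weight is simply the number of nonzero bits in the codeword, which can be checked directly:

```python
# Hamming weight = count of 1s in the codeword.
print("101011".count("1"))   # 4
```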

6. If X is the transmitter and Y is the receiver, and the channel is noise-free, then the mutual information I(X, Y) is equal to

(a) Conditional entropy of the receiver, given the source

(b) Conditional entropy of the source, given the receiver

(c) Entropy of the source

(d) Joint entropy of the source and receiver
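
A minimal sketch of why a noise-free channel gives I(X; Y) = H(X): with Y = X, the conditional entropy H(X|Y) is zero, so all of the source entropy is conveyed. The source distribution below is an assumed example:

```python
from math import log2

p_x = [0.25, 0.75]                    # assumed source distribution
h_x = -sum(p * log2(p) for p in p_x)

# Joint distribution for a noise-free (identity) channel: p(x, y) = p(x) if y == x else 0.
p_xy = [[p if i == j else 0.0 for j, _ in enumerate(p_x)] for i, p in enumerate(p_x)]
p_y = [sum(row[j] for row in p_xy) for j in range(len(p_x))]

i_xy = sum(p_xy[i][j] * log2(p_xy[i][j] / (p_x[i] * p_y[j]))
           for i in range(len(p_x)) for j in range(len(p_x)) if p_xy[i][j] > 0)
print(round(i_xy, 6), round(h_x, 6))  # both ~0.811278: I(X; Y) equals H(X)
```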

7. In a linear block code

(a) the received power varies linearly with the transmitted power

(b) the parity bits of the code word are linear combinations of the message bits

(c) the communication channel is a linear system

(d) the encoder satisfies the superposition principle
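
A minimal sketch of systematic linear block encoding, where each parity bit is a modulo-2 linear combination of the message bits. The (6,3) parity sub-matrix below is an assumed example, not one specified in the question paper:

```python
# Systematic (6,3) encoding sketch: codeword = [message | parity], parity = message x P over GF(2).
P = [[1, 1, 0],
     [0, 1, 1],
     [1, 0, 1]]                      # assumed parity sub-matrix

def encode(msg):
    parity = [sum(msg[i] * P[i][j] for i in range(3)) % 2 for j in range(3)]
    return msg + parity              # each parity bit is a mod-2 sum of message bits

print(encode([1, 0, 1]))             # -> [1, 0, 1, 0, 1, 1]
```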
