
6.1 CHANNEL CODING

Noise within a transmission channel inevitably causes discrepancies, or errors, between the channel input and the channel output. A channel is a specific portion of the information-carrying capacity of a network interface, specified by a certain transmission rate. The concepts of bit error rate and of channel capacity via Shannon's theorem were discussed in some detail in Chap. 3, Sec. 3.2.5, and Chap. 5, Sec. 5.2, respectively [2].

Coding enables the receiver to decode the incoming coded signals not only to detect errors but also to correct them. Effectively encoding signals from discrete sources and approaching a specified Shannon limit are achievable by combining symbols into large blocks and by applying special coding methods such as Huffman [3] codes and Fano^1 [4] codes. For clarification, the terms "data compression" and "effective encoding" are treated here as synonyms: redundancy is eliminated from the original signal, which is then encoded as economically as possible with a minimum number of binary symbols [5]. An encoder performs this redundancy elimination. Applying effective encoding methods substantially reduces the required channel transmission rate.

In general, if k binary digits enter the channel encoder and the channel encoder outputs n binary digits, then the code rate is defined as

    R_c = k / n                                                    (6.1)

The channel-coding theorem specifies the capacity C_c as a fundamental limit on the rate at which reliable, error-free messages can be transmitted over a discrete memoryless channel (DMC). The concept of a DMC has been discussed in Chap. 3.
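As a minimal illustration of Eq. (6.1), the short helper below (a sketch, not from the text; the function name and the (7, 4) Hamming example are assumptions for illustration) computes the code rate R_c from the number of message bits k and channel bits n:

```python
def code_rate(k: int, n: int) -> float:
    """Code rate R_c = k/n of Eq. (6.1): k message bits entering
    the channel encoder produce n output bits, with n >= k."""
    if not (0 < k <= n):
        raise ValueError("expected 0 < k <= n")
    return k / n

# Illustrative example (not from the text): the (7, 4) Hamming
# code carries k = 4 message bits in each block of n = 7 bits.
print(code_rate(4, 7))
```

A rate closer to 1 means less redundancy is added by the encoder; a smaller rate spends more channel bits per message bit on error protection.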
For example, let a discrete memoryless source with an alphabet X have entropy H(X) and produce one symbol every T_s seconds. Suppose that a DMC with capacity C_c (bits/symbol) is used once every T_c seconds; then if

    H(X) / T_s <= C_c / T_c                                        (6.2)

there exists a coding scheme for which the source output can be transmitted over the channel and be retrieved and reconstructed with an arbitrarily small probability of error. Conversely, it is impossible to transfer information over

^1 Also called Shannon–Fano codes.

Copyright © 2002 by Marcel Dekker, Inc. All Rights Reserved.
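The condition of Eq. (6.2) compares the source's information rate H(X)/T_s (bits/second) against the channel's capacity rate C_c/T_c. A minimal numerical sketch (the function name and the example numbers are assumptions, chosen only for illustration):

```python
def reliable_transmission_possible(h_x: float, t_s: float,
                                   c_c: float, t_c: float) -> bool:
    """Channel-coding theorem condition of Eq. (6.2):
    reliable (arbitrarily low error) transmission over a DMC is
    possible iff H(X)/T_s <= C_c/T_c, i.e. the source emits
    information no faster than the channel can carry it."""
    return h_x / t_s <= c_c / t_c

# Illustrative numbers: a source producing 2 bits of entropy per
# symbol, one symbol per millisecond, over a channel offering
# 3 bits/symbol, one channel use per millisecond.
print(reliable_transmission_possible(2.0, 1e-3, 3.0, 1e-3))
```

Swapping the rates (e.g. a 3 bit/ms source on a 2 bit/ms channel) violates Eq. (6.2), and no coding scheme can then achieve an arbitrarily small error probability.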
