
Chapter 37: Communicating without errors

In 1956, Claude Shannon, the founder of information theory, posed the following very interesting question:

    Suppose we want to transmit messages across a channel (where some symbols may be distorted) to a receiver. What is the maximum rate of transmission such that the receiver may recover the original message without errors?

Let us see what Shannon meant by “channel” and “rate of transmission.” We are given a set $V$ of symbols, and a message is just a string of symbols from $V$. We model the channel as a graph $G = (V, E)$, where $V$ is the set of symbols and $E$ is the set of edges between unreliable pairs of symbols, that is, symbols which may be confused during transmission. For example, when communicating over a phone in everyday language, we connect the symbols B and P by an edge, since the receiver may not be able to distinguish them. Let us call $G$ the confusion graph.
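As a small illustration (not from the text), a confusion graph can be represented directly as a set of symbols together with a set of unordered confusable pairs; the phone example might look as follows in Python. The particular symbols and edges chosen here are assumptions for the sake of the sketch.

```python
# A minimal sketch of one way to represent a confusion graph G = (V, E):
# symbols as vertices, unordered pairs of confusable symbols as edges.
# The symbols and confusable pairs below are illustrative assumptions.

phone_symbols = {"B", "P", "D", "T", "K"}
phone_edges = {frozenset({"B", "P"}), frozenset({"D", "T"})}

def confusable(u, v, edges):
    """Two distinct symbols may be confused iff they are joined by an edge."""
    return u != v and frozenset({u, v}) in edges

print(confusable("B", "P", phone_edges))  # True: the receiver may mix them up
print(confusable("B", "K", phone_edges))  # False: B and K are reliably distinguishable
```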

The 5-cycle $C_5$ will play a prominent role in our discussion. In this example, 1 and 2 may be confused, but not 1 and 3, etc. Ideally we would like to use all 5 symbols for transmission, but since we want to communicate error-free we can (if we only send single symbols) use only one letter from each pair that might be confused. Thus for the 5-cycle we can use only two different letters (any two that are not connected by an edge). In the language of information theory, this means that for the 5-cycle we achieve an information rate of $\log_2 2 = 1$ (instead of the maximal $\log_2 5 \approx 2.32$).

It is clear that in this model, for an arbitrary graph $G = (V, E)$, the best we can do is to transmit symbols from a largest independent set. Thus the information rate, when sending single symbols, is $\log_2 \alpha(G)$, where $\alpha(G)$ is the independence number of $G$.
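To make the single-symbol bound concrete, here is a brute-force sketch (an assumption of this illustration, only practical for tiny graphs such as $C_5$) that computes $\alpha(G)$ and the resulting rate $\log_2 \alpha(G)$.

```python
import math
from itertools import combinations

# Brute-force sketch of the independence number alpha(G): the largest set
# of pairwise non-confusable symbols. Feasible only for very small graphs.

def independence_number(vertices, edges):
    """Return alpha(G) by checking all vertex subsets, largest first."""
    vertices = list(vertices)
    for size in range(len(vertices), 0, -1):
        for subset in combinations(vertices, size):
            if all(frozenset({u, v}) not in edges
                   for u, v in combinations(subset, 2)):
                return size
    return 0

# The 5-cycle C_5, vertices 1..5, each confusable with its two cycle neighbours.
c5_vertices = range(1, 6)
c5_edges = {frozenset({i, i % 5 + 1}) for i in range(1, 6)}

alpha = independence_number(c5_vertices, c5_edges)
print(alpha, math.log2(alpha))  # 2 1.0 -> single-symbol rate log_2 alpha(C_5) = 1
```

Replacing the vertices and edges with any other confusion graph gives the corresponding single-symbol rate in the same way.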

Let us see whether we can increase the information rate by using larger strings in place of single symbols. Suppose we want to transmit strings of length 2. The strings $u_1u_2$ and $v_1v_2$ can only be confused if one of the following three cases holds:

• $u_1 = v_1$ and $u_2$ can be confused with $v_2$,
• $u_2 = v_2$ and $u_1$ can be confused with $v_1$, or
• $u_1 \neq v_1$ can be confused and $u_2 \neq v_2$ can be confused.

In graph-theoretic terms this amounts to considering the product $G_1 \times G_2$ of two graphs $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$. $G_1 \times G_2$ has the vertex set $V_1 \times V_2$, with two distinct vertices $(u_1, u_2)$ and $(v_1, v_2)$ joined by an edge whenever one of the three cases above holds.
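As a sketch of this product construction for $G_1 = G_2 = C_5$ (with the cycle's vertices relabelled 0 to 4 for arithmetic convenience, and with one particular choice of five length-2 strings; both are assumptions of this illustration), the following checks that those five strings are pairwise non-confusable, so length-2 strings already do better than the two single letters allowed by $\alpha(C_5) = 2$.

```python
from itertools import combinations

# Sketch of the product construction above for G_1 = G_2 = C_5
# (vertices relabelled 0..4 here for convenience).

c5_edges = {frozenset({i, (i + 1) % 5}) for i in range(5)}

def confusable(a, b, edges):
    """Single symbols a, b may be confused iff they are joined by an edge."""
    return a != b and frozenset({a, b}) in edges

def words_confusable(u, v, edges):
    """Length-2 strings u, v are confusable iff, in each coordinate, the
    symbols are equal or confusable (exactly the three cases listed above)."""
    return u != v and all(a == b or confusable(a, b, edges) for a, b in zip(u, v))

# Five length-2 strings, no two of which can be confused; hence the product
# graph contains an independent set of size 5, giving a per-symbol rate of
# (1/2) * log_2 5, roughly 1.16, better than the single-symbol rate 1.
words = [(i, (2 * i) % 5) for i in range(5)]
assert not any(words_confusable(u, v, c5_edges) for u, v in combinations(words, 2))
print(words)  # [(0, 0), (1, 2), (2, 4), (3, 1), (4, 3)]
```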

[Margin figures: a portrait of Claude Shannon, and the 5-cycle $C_5$ with vertices 1, 2, 3, 4, 5.]
