
…response (this is also valid for different lengths of x and i with a suitable definition of h). If h is of finite duration, M denotes the system memory length.
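As an illustration, here is a minimal sketch of a finite-memory system in Python: with M + 1 impulse-response taps h, the output at time l depends only on the current input and the M previous ones. The tap values and input sequence are arbitrary examples, not taken from the text.

```python
def fir_output(i_seq, h):
    # x_l = sum_k h[k] * i_{l-k}; with len(h) = M + 1 taps, the output at
    # time l depends only on the current input and the M previous ones.
    return [sum(h[k] * i_seq[l - k] for k in range(len(h)) if l - k >= 0)
            for l in range(len(i_seq))]

# Memory length M = 2 (three taps); the values are illustrative.
print(fir_output([1, 0, 0, 1, 1], [1, 1, 1]))   # [1, 1, 1, 1, 2]
```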

Let us now consider a feedforward FSM with memory length M. At any time instant (depth or level) l, the FSM output x_l depends on the current input i_l and the M previous inputs i_{l−1}, …, i_{l−M}. The overall functioning of the system can be mapped onto a trellis diagram, in which a node represents one of the q^M encoder states (q is the cardinality of the input alphabet, including the case when the input symbol is actually a subsequence), while a branch connecting two nodes represents the FSM output associated with the transition between the corresponding system states.

A trellis, which is a visualization of the state transition diagram with a time element incorporated, is characterized by q branches stemming from and entering each state, except in the first and last M branches (called the head and the tail of the trellis, respectively). The branches at the l-th time instant are labeled by sequences x_l ∈ X. A sequence of l information symbols, i_{[0,l)}, specifies a path from the root node to a node at the l-th level and, in turn, this path specifies the output sequence x_{[0,l)} = x_0 • x_1 • … • x_{l−1}, where • denotes concatenation of two sequences.
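A small sketch of how such a trellis can be enumerated, assuming a generic feedforward FSM whose state is simply the tuple of the M previous inputs; the output function in the example is an arbitrary illustrative choice, not a specific encoder from the text.

```python
from itertools import product

def build_trellis(q, M, output_fn):
    # Enumerate the q**M states of a feedforward FSM with memory M and, for
    # each state, the q branches leaving it.  output_fn(i, prev) returns the
    # FSM output x_l for current input i and previous inputs
    # prev = (i_{l-1}, ..., i_{l-M}).
    states = list(product(range(q), repeat=M))       # q**M states
    branches = {}                                    # (state, input) -> (next_state, output)
    for s in states:
        for i in range(q):
            next_s = (i,) + s[:-1]                   # shift the new input into the state
            branches[(s, i)] = (next_s, output_fn(i, s))
    return states, branches

# Example: q = 2, M = 2, output = parity of current and previous inputs
# (an illustrative mapping only).
states, branches = build_trellis(2, 2, lambda i, prev: (i + sum(prev)) % 2)
print(len(states))                 # 4 = q**M
print(branches[((0, 0), 1)])       # ((1, 0), 1)
```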

The input can, but need not, be separated into frames of some length. For framed data, where the length of each input frame equals L branches (thus L q-ary symbols), the length of the output frame is L + M branches (L + M output symbols); the M known symbols (usually all zeros) are added at the end of the sequence to force the system into the desired terminal state. Such systems are said to suffer a fractional rate loss, the rate being reduced by the factor L/(L + M). Clearly, this rate loss has no asymptotic significance.
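A toy illustration of frame termination and the resulting rate factor, assuming the all-zero symbol is used as the known tail; the frame length and memory below are arbitrary values.

```python
def terminate_frame(frame, M, tail_symbol=0):
    # Append M known symbols (here zeros) so the FSM is driven back to the
    # desired terminal state; L input symbols then yield L + M output branches.
    return list(frame) + [tail_symbol] * M

L, M = 100, 2
padded = terminate_frame(range(L), M)
print(len(padded))        # L + M = 102 output branches
print(L / (L + M))        # rate factor ~0.98; the loss vanishes as L grows
```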

In the sequel, the detection of the input sequence i_{[0,∞)} will be analyzed based on the corrupted output sequence y_{[0,∞)} = x_{[0,∞)} + u_{[0,∞)}. Suppose there is no feedback from the output to the input, so that

P[y_n | x_0, …, x_{n−1}, x_n, y_0, …, y_{n−1}] = P[y_n | x_n]

and

P[y_1, …, y_N | x_1, …, x_N] = ∏_{n=1}^{N} P[y_n | x_n]

Usually, u_{[0,∞)} is a sequence that represents additive white Gaussian noise, sampled and quantized to enable digital processing.
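A hedged sketch of this observation model, y = x + u with sampled and quantized additive white Gaussian noise; the noise level, quantizer resolution, and clipping range are illustrative assumptions, not parameters from the text.

```python
import numpy as np

def observe(x, sigma=0.5, n_bits=4, clip=2.0, seed=0):
    # y = x + u: corrupt the transmitted sequence with additive white Gaussian
    # noise, then quantize to n_bits levels so a digital detector can process it.
    rng = np.random.default_rng(seed)
    y = np.asarray(x, dtype=float) + rng.normal(0.0, sigma, size=len(x))
    step = 2 * clip / (2 ** n_bits - 1)
    return np.clip(np.round(y / step) * step, -clip, clip)

x = np.array([+1.0, -1.0, -1.0, +1.0, -1.0])   # e.g. antipodal channel symbols
print(observe(x))
```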

The task of the detector that minimizes the sequence error probability is to find a sequence that maximizes the joint probability of the input and output channel sequences

P[y_{[0,L+M)}, x_{[0,L+M)}] = P[y_{[0,L+M)} | x_{[0,L+M)}] P[x_{[0,L+M)}]

Since usually all probabilities P[x_{[0,L+M)}] are equal, it is sufficient to find a procedure that maximizes P[y_{[0,L+M)} | x_{[0,L+M)}], and a decoder that always chooses as its estimate one of the sequences that maximize it, or equivalently

µ(y_{[0,L+M)} | x_{[0,L+M)}) = A log₂ P[y_{[0,L+M)} | x_{[0,L+M)}] − f(y_{[0,L+M)}) = ∑_{l=0}^{L+M−1} ( A log₂ P[y_l | x_l] − f(y_l) )

(where A ≥ 0 is a suitably chosen constant and f(⋅) is any function), is called a maximum-likelihood decoder (MLD). This quantity is called a metric, µ.
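As a concrete instance, for a Gaussian channel the branch term A log₂ P[y_l | x_l] − f(y_l) reduces, with a suitable choice of A and f, to a negative squared Euclidean distance. The following sketch accumulates such branch metrics into a path metric and picks the candidate that maximizes it; the Gaussian channel model and the candidate sequences are assumptions for illustration only.

```python
import math

def branch_metric(y_l, x_l, sigma=1.0):
    # A*log2 P[y_l | x_l] for a Gaussian channel with A = 1; the term that does
    # not depend on x_l plays the role of f(y_l) and could be dropped, leaving
    # (up to scale) the negative squared Euclidean distance -(y_l - x_l)**2.
    return (-((y_l - x_l) ** 2) / (2 * sigma ** 2 * math.log(2))
            - 0.5 * math.log2(2 * math.pi * sigma ** 2))

def path_metric(y, x, sigma=1.0):
    # The metric is additive: the metric of a path is the sum of its branch metrics.
    return sum(branch_metric(yl, xl, sigma) for yl, xl in zip(y, x))

# The ML decoder chooses the candidate sequence with the largest metric.
y = [0.9, -1.2, -0.8]
candidates = [[1, -1, -1], [1, 1, -1], [-1, -1, -1]]
print(max(candidates, key=lambda c: path_metric(y, c)))   # [1, -1, -1]
```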

This type of metric has one significant disadvantage: it is suited only for comparison between paths of the same length. Some algorithms, however,


