Decoding Methods - Maximum Likelihood Decoding

Further information: Maximum likelihood

Given a received vector x \in \mathbb{F}_2^n, maximum likelihood decoding picks a codeword y \in C to maximize:

\mathbb{P}(x \mbox{ received} \mid y \mbox{ sent}),

i.e. it chooses the codeword y that maximizes the probability that x was received, given that y was sent. Note that if all codewords are equally likely to be sent, then this scheme is equivalent to ideal observer decoding. In fact, by Bayes' theorem we have


\begin{align}
\mathbb{P}(x \mbox{ received} \mid y \mbox{ sent}) & {} = \frac{ \mathbb{P}(x \mbox{ received}, y \mbox{ sent}) }{\mathbb{P}(y \mbox{ sent} )} \\
& {} = \mathbb{P}(y \mbox{ sent} \mid x \mbox{ received}) \cdot \frac{\mathbb{P}(x \mbox{ received})}{\mathbb{P}(y \mbox{ sent})}.
\end{align}

Upon fixing the received vector x, the probability \mathbb{P}(x \mbox{ received}) is fixed, and \mathbb{P}(y \mbox{ sent}) is constant because all codewords are equally likely to be sent. Therefore
\mathbb{P}(x \mbox{ received} \mid y \mbox{ sent})
is maximised as a function of the variable y precisely when
\mathbb{P}(y \mbox{ sent}\mid x \mbox{ received} )
is maximised, and the claim follows.

As with ideal observer decoding, a convention must be agreed to for non-unique decoding.
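On a binary symmetric channel with crossover probability p < 1/2, the likelihood of a codeword y given the received vector x is p^d (1-p)^(n-d), where d is the Hamming distance between x and y, so ML decoding reduces to choosing a nearest codeword. The following sketch illustrates this by brute force over a toy codebook (the repetition code here is an assumption chosen for illustration, not part of the original text):

```python
def hamming_distance(a, b):
    """Number of positions in which two equal-length binary tuples differ."""
    return sum(u != v for u, v in zip(a, b))

def ml_decode_bsc(received, codebook):
    """Brute-force ML decoding over a binary symmetric channel (p < 1/2).

    Since P(x received | y sent) = p^d * (1-p)^(n-d) with d the Hamming
    distance between x and y, maximising the likelihood is the same as
    minimising d.  Ties (non-unique decoding) are resolved here by taking
    the first nearest codeword in the codebook -- one possible convention.
    """
    return min(codebook, key=lambda cw: hamming_distance(received, cw))

# Toy example: the length-3 binary repetition code.
codebook = [(0, 0, 0), (1, 1, 1)]
print(ml_decode_bsc((1, 0, 1), codebook))  # -> (1, 1, 1)
```

The exhaustive search runs in time proportional to the codebook size, so it is only practical for small codes; for structured codes, the integer-programming and generalized-distributive-law approaches mentioned below scale better.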

The ML decoding problem can also be modeled as an integer programming problem.
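As a sketch of one such formulation (the parity-check matrix H and auxiliary variables z below are introduced for illustration): for a binary linear code with m \times n parity-check matrix H and received vector x, ML decoding on a binary symmetric channel with p < 1/2 minimizes the number of positions where the decoded codeword y differs from x. Since each x_i is fixed and y_i is binary, x_i \oplus y_i = x_i + (1 - 2x_i)y_i is linear in y, giving the integer program

\begin{align}
\mbox{minimize} \quad & \sum_{i=1}^{n} \left( x_i + (1 - 2x_i)\, y_i \right) \\
\mbox{subject to} \quad & H y = 2 z, \\
& y_i \in \{0, 1\}, \quad z \in \mathbb{Z}_{\ge 0}^{m},
\end{align}

where the auxiliary integer vector z enforces the parity checks H y \equiv 0 \pmod 2 using only linear equality constraints.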

The ML decoding algorithm has also been found to be an instance of the "marginalize a product function" (MPF) problem, which is solved by applying the generalized distributive law.
