When is a minimum distance decoder also a maximum likelihood decoder?


It is well known that for a binary symmetric channel with crossover probability $\epsilon<0.5$, if we send a word $x$ through it, the most likely transmitted word given the received word $y$ is the one at minimum Hamming distance from $y$, i.e. $y$ itself. If instead we send a codeword chosen uniformly at random from a codebook $C$, our best guess is the codeword closest to $y$ in Hamming distance:

$\hat x =\arg \min_{x\in C}\|x-y\|_1$
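As a quick sanity check (a toy sketch with a hypothetical 4-bit codebook, not from the question), one can enumerate every possible received word and confirm that the brute-force maximum likelihood decoder and the minimum Hamming distance decoder agree on a BSC with $\epsilon<0.5$:

```python
import math
from itertools import product

def bsc_likelihood(x, y, eps):
    """P(y | x) on a BSC with crossover probability eps."""
    d = sum(xi != yi for xi, yi in zip(x, y))  # Hamming distance
    return (eps ** d) * ((1 - eps) ** (len(x) - d))

def ml_decode(C, y, eps):
    """Brute-force maximum likelihood decoding over the codebook."""
    return max(C, key=lambda x: bsc_likelihood(x, y, eps))

def min_dist_decode(C, y):
    """Minimum Hamming distance decoding over the codebook."""
    return min(C, key=lambda x: sum(a != b for a, b in zip(x, y)))

# Hypothetical toy codebook (any codebook works for this check)
C = [(0, 0, 0, 0), (1, 1, 1, 1), (0, 0, 1, 1), (1, 1, 0, 0)]

# The two decoders agree on every possible received word
for y in product([0, 1], repeat=4):
    assert ml_decode(C, y, eps=0.1) == min_dist_decode(C, y)
```

Ties are broken the same way by both decoders here, since `max` and `min` return the first optimal element and the likelihood is a strictly decreasing function of the distance.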

The same argument goes through for the Gaussian channel, except that the minimization is with respect to the $\ell_2$ distance instead of the Hamming distance. My question is then:

Is it known for what binary input channels a minimum distance decoder is equivalent to a maximum likelihood decoder?

Best answer:

I doubt that there is a useful general characterization. One starts by assuming that the per-symbol errors are independent, hence

$$L=\log p( {\bf y} | {\bf x}) = \sum_i \log p(y_i | x_i) $$

In the case of the binary symmetric channel, we have $\log p(y_i | x_i)= a + b \, d_i $ with $d_i = | x_i -y_i|$, $a=\log(1-\epsilon)$ and $b=\log \frac{\epsilon}{1-\epsilon} <0 $ (negative because $\epsilon<0.5$). So $L = n a + b \sum_i d_i$, and maximizing $L$ amounts to minimizing $\sum_i d_i =\|{\bf x}-{\bf y}\|_1$
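Concretely (a small numeric check, with $\epsilon$ and $n$ chosen arbitrarily for illustration), the BSC log-likelihood $d \log\epsilon + (n-d)\log(1-\epsilon)$ is indeed affine in the Hamming distance $d$, with exactly these constants:

```python
import math

eps, n = 0.1, 8                                       # arbitrary example values
a = math.log(1 - eps)                                 # per-symbol constant term
b = math.log(eps / (1 - eps))                         # slope in d; b < 0 for eps < 0.5

def loglik(d):
    """log P(y|x) for words of length n at Hamming distance d on the BSC."""
    return d * math.log(eps) + (n - d) * math.log(1 - eps)

# Check the affine form L = n*a + b*d for every possible distance
for d in range(n + 1):
    assert abs(loglik(d) - (n * a + b * d)) < 1e-12
```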

In the case of the Gaussian channel with noise variance $\sigma^2$, we have $\log p(y_i | x_i) = s + t \,d_i^2$ with $s=-\frac12\log(2\pi\sigma^2)$ and $t=-\frac{1}{2\sigma^2}<0$, and hence we want to minimize $\sum_i d_i^2 = \|{\bf x}-{\bf y}\|_2^2$, equivalently $\|{\bf x}-{\bf y}\|_2$
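The same kind of check works here (a sketch with a hypothetical BPSK-style codebook over $\{\pm 1\}$ and an arbitrary noise level): the exact Gaussian log-likelihood decoder and the nearest-neighbor-in-$\ell_2$ decoder pick the same codeword.

```python
import math
import random

sigma = 0.7  # arbitrary noise standard deviation for the demo

def gauss_loglik(x, y):
    """Exact log P(y|x) when y_i = x_i + N(0, sigma^2) independently."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (yi - xi) ** 2 / (2 * sigma ** 2)
               for xi, yi in zip(x, y))

def l2(x, y):
    """Euclidean distance between words."""
    return math.sqrt(sum((yi - xi) ** 2 for xi, yi in zip(x, y)))

# Hypothetical codebook over {-1, +1}
C = [(-1, -1, -1), (1, 1, 1), (-1, 1, 1), (1, -1, -1)]

random.seed(0)
for _ in range(100):
    x = random.choice(C)
    y = [xi + random.gauss(0, sigma) for xi in x]  # simulated channel output
    assert max(C, key=lambda c: gauss_loglik(c, y)) == \
           min(C, key=lambda c: l2(c, y))
```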

A (not very revealing, nor very general) generalization: if $\log p(y_i | x_i) = s + t\,|x_i-y_i|^k$ with $t<0$, then the maximum likelihood estimator minimizes the $\ell_k$ distance, $\hat x =\arg \min_{x\in C}\|x-y\|_k$. Note that it is not enough for $\log p(y_i|x_i)$ to be a monotone decreasing function of $|x_i-y_i|^k$: the log-likelihood is a sum over symbols, so it must be affine in $|x_i-y_i|^k$ for the argmax of the sum to coincide with the $\ell_k$ minimizer.
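This generalization can also be sketched numerically (a toy codebook and received word chosen ad hoc, with an arbitrary $t<0$): for several values of $k$, maximizing $\sum_i \left(s + t\,|x_i-y_i|^k\right)$ picks the same codeword as minimizing $\|x-y\|_k$, since the $\ell_k$ norm is a monotone function of $\sum_i |x_i-y_i|^k$.

```python
t = -1.0  # any t < 0 works; the constant s drops out of the argmax

def lk_norm(x, y, k):
    """The l_k distance between words."""
    return sum(abs(xi - yi) ** k for xi, yi in zip(x, y)) ** (1 / k)

def gg_loglik(x, y, k, s=0.0):
    """Per-symbol log-likelihood of the form s + t*|x_i - y_i|^k, summed."""
    return sum(s + t * abs(xi - yi) ** k for xi, yi in zip(x, y))

# Hypothetical toy codebook and received word
C = [(0, 0, 0), (1, 1, 1), (0, 1, 1)]
y = (0.2, 0.8, 0.4)

for k in (1, 2, 3, 4):
    assert max(C, key=lambda c: gg_loglik(c, y, k)) == \
           min(C, key=lambda c: lk_norm(c, y, k))
```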