Consider the BI-AWGN channel. Let $\mathbf{v}$ denote a binary codeword, $\mathbf{x}$ denote the corresponding BPSK signal vector, and $\mathbf{y}$ denote the received signal vector. Suppose that the transmitted codewords are equally likely. Prove that the maximum a posteriori (MAP) rule can be replaced by the minimum distance rule, i.e.,
$$\hat{\mathbf{v}} = \arg\max_{\mathbf{v}} \, p(\mathbf{y}\mid\mathbf{v}) = \arg\min_{\mathbf{v}} \, d_E(\mathbf{y},\mathbf{x}), \quad \text{where } d_E(\mathbf{y},\mathbf{x})=\sqrt{\sum_i(y_i-x_i)^2}.$$
The optimum receiver is the one that maximizes the a posteriori probability
$$\text{P}(\mathbf{x}\mid\mathbf{y}) = \frac{\text{P}(\mathbf{y}\mid\mathbf{x})\,\text{P}(\mathbf{x})}{\text{P}(\mathbf{y})}$$
where the right-hand side follows from Bayes' theorem. Since all codewords are equally likely, $\text{P}(\mathbf{x})$ is the same for every codeword. Note also that $\text{P}(\mathbf{y})$ does not depend on $\mathbf{x}$, so it is common to all candidates. Therefore, maximizing the MAP metric $\text{P}(\mathbf{x}\mid\mathbf{y})$ is equivalent to maximizing the likelihood $\text{P}(\mathbf{y}\mid\mathbf{x})$, i.e., MAP decoding reduces to maximum-likelihood (ML) decoding.
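This equivalence is easy to check numerically. Below is a minimal sketch (not part of the proof): the likelihood values and the number of codewords are arbitrary placeholders, chosen only to show that with equal priors the posterior is a rescaled likelihood, so both rules select the same codeword.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 equally likely codewords and arbitrary
# likelihood values P(y|x_k) for one fixed received vector y.
likelihoods = rng.random(4)      # stands in for P(y | x_k), k = 0..3
priors = np.full(4, 0.25)        # equal priors P(x_k)

# Bayes' theorem: posterior = likelihood * prior / evidence
posteriors = likelihoods * priors / np.sum(likelihoods * priors)

# With equal priors the posterior is proportional to the likelihood,
# so the MAP and ML decisions coincide.
assert np.argmax(posteriors) == np.argmax(likelihoods)
```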
When the channel is additive white Gaussian noise, the noise samples are independent, so the likelihood factors as
$$\text{P}(\mathbf{y}\mid\mathbf{x})=\prod_{n=1}^N\text{P}(y_n\mid x_n)$$
where $N$ is the number of bits in each codeword. Each conditional probability $\text{P}(y_n\mid x_n)$ is Gaussian with mean $x_n$ and variance equal to the noise variance. So,
$$\text{P}(\mathbf{y}\mid\mathbf{x}) \propto\prod_{n=1}^N \exp\left(-\frac{(y_n-x_n)^2}{2\sigma_w^2}\right) = \exp\left(-\frac{1}{2\sigma_w^2}\sum_{n=1}^N(y_n-x_n)^2\right)$$
where $\sigma_w^2$ is the noise variance. Since $\exp(\cdot)$ is increasing, maximizing the left-hand side means maximizing the (negative) exponent, which in turn means minimizing the summation $\sum_{n=1}^N(y_n-x_n)^2 = d_E^2(\mathbf{y},\mathbf{x})$. Because the square root is monotonic, minimizing $d_E^2$ is the same as minimizing $d_E$. This shows that the MAP rule can be replaced by choosing the codeword whose BPSK signal vector has minimum Euclidean distance to the received vector.
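The full chain of equivalences can be sketched end to end. The code below is illustrative only: the length-3 repetition code, the BPSK mapping $0 \mapsto +1$, $1 \mapsto -1$, and the noise level $\sigma_w = 0.8$ are all assumed for the example, not taken from the problem. It simulates one BI-AWGN transmission and checks that the ML decision (maximum Gaussian log-likelihood) and the minimum-distance decision pick the same codeword.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_w = 0.8                              # assumed noise std deviation

# Assumed toy code: length-3 repetition code.
codewords = np.array([[0, 0, 0], [1, 1, 1]])
bpsk = 1 - 2 * codewords                   # assumed mapping: 0 -> +1, 1 -> -1

v = codewords[rng.integers(len(codewords))]  # transmitted codeword
x = 1 - 2 * v                                # BPSK signal vector
y = x + sigma_w * rng.normal(size=x.shape)   # BI-AWGN channel output

# ML metric: log P(y|x) = -sum_n (y_n - x_n)^2 / (2 sigma_w^2) + const
log_lik = -np.sum((y - bpsk) ** 2, axis=1) / (2 * sigma_w**2)

# Minimum-distance metric: d_E(y, x)
dist = np.sqrt(np.sum((y - bpsk) ** 2, axis=1))

# Both rules select the same codeword, as the proof shows.
assert np.argmax(log_lik) == np.argmin(dist)
v_hat = codewords[np.argmin(dist)]
```

The key design point is that the decoder never needs $\sigma_w^2$ to make its decision: the noise variance only rescales the log-likelihoods and does not change which codeword attains the maximum.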