Channel Decoding: Gaussian Channel as Time-Varying Binary Symmetric


I am reading Information Theory, Inference and Learning Algorithms by David MacKay. It is available free of charge online: http://www.inference.org.uk/mackay/itila/book.html (official site, not pirated copy).

My question is how to solve Exercise 25.1. It is marked as one of the easier ones. I think I'm just missing some fundamental understanding. Assistance would be enormously appreciated!

I quote the relevant passage, slightly paraphrased for brevity. It is from Section 25.1 on Page 324 (Page 336 of my pdf).


A codeword $t$ is selected from a code whose codewords form a subset of $\{0,1\}^N$ and transmitted over a noisy channel, and $y$ is received. By Bayes's theorem, $$ P(t \mid y) = P(y \mid t) P(t) / P(y). $$ Assume that the channel is memoryless. Thus $$ P(y \mid t) = \prod_{n=1}^N P(y_n \mid t_n). $$

For example, if the channel is a Gaussian channel with transmissions $\pm x$ and additive noise of standard deviation $\sigma$, then the probability densities of the received signal $y_n$ in the two cases $t_n \in \{0,1\}$ are given as follows: $$ \begin{aligned} P(y_n \mid t_n = 1) &= (2 \pi \sigma^2)^{-1/2} \exp\bigl( - (y_n - x)^2 / (2 \sigma^2) \bigr); \\ P(y_n \mid t_n = 0) &= (2 \pi \sigma^2)^{-1/2} \exp\bigl( - (y_n + x)^2 / (2 \sigma^2) \bigr). \end{aligned} $$ [Here we are interpreting $t_n = 0$ as "send $-x$" and $t_n = 1$ as "send $+x$" (plus noise).] From the point of view of decoding, all that matters is the likelihood ratio: $$ \frac {P(y_n \mid t_n = 1)} {P(y_n \mid t_n = 0)} = \exp\biggl( \frac{2 x y_n}{\sigma^2} \biggr). $$

Exercise. Show that from the point of view of decoding a Gaussian channel is equivalent to a time-varying binary symmetric channel with a known noise level $f_n$ which depends on $n$.

For further reference, Section 11.2 (Page 179 of the book and 191 of my pdf) discusses inferring the input to a Gaussian channel. It does not seem to help here, though: I do not see what is time-varying about this channel.
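As a sanity check, the quoted likelihood-ratio identity can be verified numerically. The values of $x$, $\sigma$ and the $y_n$ below are arbitrary choices of mine, not from the book:

```python
import math

def gauss_pdf(y, mean, sigma):
    """Gaussian density N(mean, sigma^2) evaluated at y."""
    return math.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def likelihood_ratio(y, x, sigma):
    """P(y | t=1) / P(y | t=0) for transmissions +x / -x."""
    return gauss_pdf(y, +x, sigma) / gauss_pdf(y, -x, sigma)

x, sigma = 1.0, 0.8
for y in (-1.3, 0.0, 0.4, 2.1):
    direct = likelihood_ratio(y, x, sigma)
    closed_form = math.exp(2 * x * y / sigma ** 2)
    assert math.isclose(direct, closed_form)
```

The ratio of the two Gaussian densities agrees with the closed form $\exp(2 x y_n / \sigma^2)$ at every test point.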


PS Multiple different SE networks seemed appropriate for this: maths, cstheory, stats. I picked this one but if a mod wants to move it, no objection from me.


There are two answers below.

Answer 1:

Arash presents one possibility. Here is another, which reaches a completely different conclusion. MacKay's text appears ambiguous as to which is intended.


Although it looks like a variable, here $y = (y_n)_{n \in [N]}$ is a single given realisation: the expression $$ \frac {P(y_n \mid t_n = 1)} {P(y_n \mid t_n = 0)} = \exp\biggl( \frac{2 x y_n}{\sigma^2} \biggr) $$ is not to be read as a function of $y_n$. Rather, let $q_n(b) := P(y_n \mid t_n = b)$; this is a function of the bit $b \in \{0,1\}$. The likelihood ratio is saying that $$ \frac{q_n(1)}{q_n(0)} = \exp( 2 x y_n / \sigma^2 ). $$ Thus it is a binary symmetric channel with a time-varying flip probability. Converting this into the flip parameter $f_n$ requires knowledge of the prior. For example, if the prior is uniform, then I think $$ \frac{1 - f_n}{f_n} = \frac{q_n(1)}{q_n(0)}, \quad\text{i.e.}\quad f_n = \frac1{1 + q_n(1)/q_n(0)}. $$ I could be mistaken in this final claim, however.
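Taking this answer's formula for $f_n$ at face value, here is a small sketch of how the flip probability varies with $n$; the values of $x$, $\sigma$ and the received samples are illustrative choices of mine:

```python
import math

def flip_probability(y_n, x, sigma):
    """f_n = 1 / (1 + q_n(1)/q_n(0)), using q_n(1)/q_n(0) = exp(2 x y_n / sigma^2)."""
    return 1.0 / (1.0 + math.exp(2 * x * y_n / sigma ** 2))

x, sigma = 1.0, 0.8
received = [0.9, -0.2, 1.7, 0.05]  # hypothetical received values y_n
f = [flip_probability(y_n, x, sigma) for y_n in received]

# y_n = 0 carries no information, so f_n = 1/2;
# a large positive y_n drives f_n toward 0.
assert flip_probability(0.0, x, sigma) == 0.5
assert flip_probability(3.0, x, sigma) < 0.01
assert len(set(f)) == len(f)  # the flip probability genuinely differs across n
```

Note that for negative $y_n$ this formula gives $f_n > 1/2$; equivalently one can use $|y_n|$ in the exponent and regard the received hard bit as flipped.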

Answer 2 (by Arash):

You are right: it is not time-varying. I just checked MacKay's Example 25.1, and in that configuration such an assertion is wrong. I will just paraphrase what you have already established:

Assume that at time instant $n$, the transmitter sends $s_n=+x$ for $t_n=1$, and $s_n=-x$ for $t_n=0$. The receiver gets $y_n=s_n+\nu_n$, where $\nu_n \sim \mathcal{N}(0, \sigma^2)$, so $y_n \sim \mathcal{N}(s_n, \sigma^2)$. The ML decoder is simply: $$ \exp\left(\frac{2xy_n}{\sigma^2}\right) \overset{+x}{\underset{-x}{\gtrless}} 1 \Longrightarrow y_n \overset{t_n=1}{\underset{t_n=0}{\gtrless}} 0 $$

that is, it decides based on the sign of $y_n$. Now, $$\begin{align*} \Pr\{\text{Error}|t_n=1\} & = \Pr\{y_n<0|t_n=1\}=\Pr\{y_n<0|s_n=+x\} = Q\left(\frac{x}{\sigma}\right) \\ \Pr\{\text{Correct}|t_n=1\} & = \Pr\{y_n>0|t_n=1\}=\Pr\{y_n>0|s_n=+x\} = 1-Q\left(\frac{x}{\sigma}\right) \\ \Pr\{\text{Error}|t_n=0\} & = \Pr\{y_n>0|t_n=0\}=\Pr\{y_n>0|s_n=-x\} = Q\left(\frac{x}{\sigma}\right) \\ \Pr\{\text{Correct}|t_n=0\} & = \Pr\{y_n<0|t_n=0\}=\Pr\{y_n<0|s_n=-x\} = 1-Q\left(\frac{x}{\sigma}\right) \\ \end{align*}$$

which describes a time-invariant binary symmetric channel with crossover probability $f = Q(x/\sigma)$, the same at every instant $n$.
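This conclusion can be checked by simulation. The Monte Carlo sketch below (my own, with illustrative values $x = \sigma = 1$) estimates the sign decoder's conditional error probabilities and compares them against $Q(x/\sigma) = \tfrac12\,\mathrm{erfc}(x/(\sigma\sqrt{2}))$:

```python
import math
import random

def q_func(z):
    """Gaussian tail probability Q(z) = P(N(0,1) > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

x, sigma = 1.0, 1.0
rng = random.Random(0)  # fixed seed for reproducibility
trials = 200_000
errors = {0: 0, 1: 0}
counts = {0: 0, 1: 0}

for _ in range(trials):
    t = rng.randint(0, 1)            # uniform input bit
    s = +x if t == 1 else -x         # BPSK mapping
    y = s + rng.gauss(0.0, sigma)    # additive Gaussian noise
    decoded = 1 if y > 0 else 0      # sign decoder
    counts[t] += 1
    errors[t] += (decoded != t)

p_err_1 = errors[1] / counts[1]
p_err_0 = errors[0] / counts[0]

# Both conditional error rates match Q(x/sigma), independently of n.
assert abs(p_err_1 - q_func(x / sigma)) < 0.01
assert abs(p_err_0 - q_func(x / sigma)) < 0.01
```

Both empirical error rates land near $Q(1) \approx 0.159$, consistent with the symmetric, time-invariant channel described above.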