Conditional entropy of repetition code over BSC


Consider the channel that takes in a bit, repeats it $k$ times, then sends the result over a binary symmetric channel with transition probability $p$. For example, if $0$ was sent over the channel without error, the result would be $\underbrace{00\cdots0}_{k\text{ times}}$.

I'd like to calculate the conditional entropy $H(Y|X)$, where $X$ is the input to the channel and $Y$ is the output. Assume $X$ has a uniform distribution, i.e. an equal chance of yielding $1$ or $0$.

Note that

$$H(Y|X) = -\sum_{x,y} p(x,y) \log p(y|x).$$

By the chain rule, $p(x,y) = p(y|x)p(x) = \frac{1}{2} p(y|x)$. Also

$$p(y|x) = (1-p)^{k-d} p^d,$$

where $d$ is the number of positions $1 \le i \le k$ with $y_i \ne x$.
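For instance, with $k = 3$, $x = 0$, and $y = 010$, exactly one position differs ($d = 1$), so

$$p(y|x) = (1-p)^{2}\,p.$$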

Let $w : \mathbb{B}^k \to \{0,\ldots,k\}$ be the Hamming weight and let $q = 1-p$ for convenience. Then

$$\begin{align} H(Y|X) &= -\frac{1}{2} \sum_{x,y} p(y|x) \log p(y|x) \\ &= -\frac{1}{2} \sum_{y} \left[ p(y|0) \log p(y|0) + p(y|1) \log p(y|1) \right] \\ &= -\frac{1}{2} \sum_{y} \left[ q^{k-w(y)} p^{w(y)} \log\left(q^{k-w(y)} p^{w(y)}\right) + q^{w(y)} p^{k-w(y)} \log\left(q^{w(y)} p^{k-w(y)}\right) \right] \\ &= -\frac{1}{2} \sum_{i=0}^k \binom{k}{i} \left[ q^{k-i} p^i \log\left(q^{k-i} p^i\right) + q^i p^{k-i} \log\left(q^i p^{k-i}\right) \right]. \end{align}$$

Does this derivation look right? It's not matching up with a simulation I'm doing, but I'm not sure which is wrong.
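For reference, here is the numerical sanity check I'd use (a sketch; `H_formula`, `H_simulated`, and the parameter choices are my own): it evaluates the final sum above and compares it against a Monte Carlo estimate of $H(Y|X)$.

```python
import math
import random

def H_formula(k, p):
    """Evaluate the final sum in the derivation above (in bits)."""
    q = 1 - p
    total = 0.0
    for i in range(k + 1):
        # the two bracketed terms: p(y|0) and p(y|1) for weight-i outputs
        for pr in (q ** (k - i) * p ** i, q ** i * p ** (k - i)):
            if pr > 0:
                total += math.comb(k, i) * pr * math.log2(pr)
    return -0.5 * total

def H_simulated(k, p, trials=200_000, seed=0):
    """Monte Carlo estimate of H(Y|X): push random bits through the
    channel, tally the empirical distribution of y for each x, and
    average the two conditional entropies."""
    rng = random.Random(seed)
    counts = {0: {}, 1: {}}
    for _ in range(trials):
        x = rng.randrange(2)
        # repeat x k times, flip each copy independently with prob. p
        y = tuple(x ^ int(rng.random() < p) for _ in range(k))
        counts[x][y] = counts[x].get(y, 0) + 1
    h = 0.0
    for x in (0, 1):
        n_x = sum(counts[x].values())
        for c in counts[x].values():
            h -= (n_x / trials) * (c / n_x) * math.log2(c / n_x)
    return h

k, p = 3, 0.1
print(H_formula(k, p), H_simulated(k, p))
```

The two values should agree to a couple of decimal places; the closed form also equals $k H_b(p)$, the entropy of $k$ independent BSC uses.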

Best answer

According to your first paragraph, your "outer" channel has only two input symbols: $X\in \{00\cdots 0, 11\cdots 1\}=\{{\bf 0},{\bf 1}\}$.

And $P(Y=y \mid X={\bf 0})=p^{w(y)}(1-p)^{k-w(y)}$, where $w(y)$ denotes the Hamming weight of $y$.

If they are equiprobable, by symmetry $$H(Y|X)=H(Y|X={\bf 0})=- \sum_{w=0}^k {k \choose w} p^w(1-p)^{k-w} \log \left[p^w(1-p)^{k-w}\right]$$

This coincides with your solution, as far as I see.
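As a cross-check (a sketch; the helper names `H_answer` and `binary_entropy` are mine): the single sum above collapses to $k\,H_b(p)$, where $H_b$ is the binary entropy function, because the $k$ BSC uses are conditionally independent given $X$.

```python
import math

def H_answer(k, p):
    """The single-sum expression from the answer, in bits."""
    q = 1 - p
    total = 0.0
    for w in range(k + 1):
        pr = p ** w * q ** (k - w)
        if pr > 0:
            total += math.comb(k, w) * pr * math.log2(pr)
    return -total

def binary_entropy(p):
    """H_b(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0, 1):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(H_answer(3, 0.5))  # → 3.0, i.e. 3 · H_b(1/2)
```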