A sends $i,j$ and B gets $i,j$.
Does it mean that $A \neq B$? I would know how to solve this if $A$ were equal to $B$, but I'm not sure how to start now that $A$ has 2 values and $B$ has 4.
I need to find the entropies $H(A)$ and $H(B)$.
The entropy of a random variable $X$ is
\begin{equation} H(X) = -\sum_{\mathrm{all~instances~} x \mathrm{~of~} X}p(x)\log_2\left(p(x)\right) \end{equation}
Thus,
\begin{equation} H(A) = -p(A_{00})\log_2\left(p(A_{00})\right) -p(A_{11})\log_2\left(p(A_{11})\right) \end{equation}
and for $H(B)$ we have:
\begin{equation} H(B) = -p(B_{00})\log_2\left(p(B_{00})\right) -p(B_{01})\log_2\left(p(B_{01})\right) -p(B_{10})\log_2\left(p(B_{10})\right) -p(B_{11})\log_2\left(p(B_{11})\right) \end{equation}
Therefore, you need to find $p(B_{ij})$. By the law of total probability,
\begin{equation} p(B_{ij}) = p(B_{ij}~|A_{00})\times p(A_{00}) + p(B_{ij}~|A_{11})\times p(A_{11}) \end{equation}
which lets you find $p(B_{ij})$. Note that $p(B_{ij}~|A_{00})$ and $p(B_{ij}~|A_{11})$ are the channel crossover probabilities. For instance, if the probability that the channel flips a bit is $q$, then
\begin{equation} p(B_{01}~|A_{00}) = (1-q)q \end{equation}
because in a 2-bit sequence the first bit is not flipped and the second is flipped. Similarly,
\begin{equation} p(B_{00}~|A_{11}) = q^2 \end{equation}
because both bits are flipped.
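To make the recipe concrete, here is a small sketch in Python. The values $p(A_{00}) = 0.5$ and crossover probability $q = 0.1$ are my own assumed example numbers (the question does not give them); the structure follows the steps above: build $p(B_{ij} \mid A_{kl})$ from per-bit flip probabilities, apply the law of total probability, then compute the entropies.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed example values (not from the question):
p_a00, q = 0.5, 0.1          # p(A_00) and per-bit flip probability q
p_a11 = 1 - p_a00

def p_b_given_a(b, a, q):
    """Crossover probability p(B = b | A = a): each bit flips
    independently with probability q, so multiply per-bit terms."""
    prob = 1.0
    for bit_b, bit_a in zip(b, a):
        prob *= q if bit_b != bit_a else (1 - q)
    return prob

# Law of total probability:
# p(B_ij) = p(B_ij | A_00) p(A_00) + p(B_ij | A_11) p(A_11)
outputs = ["00", "01", "10", "11"]
p_b = {b: p_b_given_a(b, "00", q) * p_a00 + p_b_given_a(b, "11", q) * p_a11
       for b in outputs}

H_A = entropy([p_a00, p_a11])
H_B = entropy(p_b.values())
print(f"H(A) = {H_A:.4f} bits, H(B) = {H_B:.4f} bits")
```

With these example numbers, $H(A) = 1$ bit (uniform input), while $H(B)$ comes out somewhat larger because the channel noise spreads probability over four output values.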