Entropy of Input to N Parallel BSC's Given the Output


Given $N$ noisy observations $Y_{1:N}$ of a length-$L$ binary input sequence $X$ passed through $N$ parallel BSCs with the same crossover probability $p$, what is the conditional entropy of the input sequence given the observations: \begin{equation} H(X|Y_{1:N})? \end{equation}


I don't think it has a nice explicit solution. Let $Y_i = X \oplus Z_i$, where $\oplus$ denotes bitwise modulo-2 addition and the $Z_i$ are i.i.d. length-$L$ vectors of $\mathrm{Bern}(p)$ noise bits, independent of $X$. You can write: \begin{align*} H(X|Y_{1:N})=&-H(Y_{1:N})+H(X,Y_{1:N})\\ =&-H(Y_{1:N})+H(X)+H(Y_{1:N}|X)\\ =&-H(Y_{1:N})+H(X)+NLh_b(p)\\ =&-H(X\oplus Z_1,X\oplus Z_2,\cdots,X\oplus Z_N)+H(X)+NLh_b(p), \end{align*} where $h_b(\cdot)$ is the binary entropy function and the third line uses the fact that, given $X$, the $NL$ noise bits are i.i.d. $\mathrm{Bern}(p)$. However, the term $H(X\oplus Z_1,X\oplus Z_2,\cdots,X\oplus Z_N)$ cannot be simplified in the general case; it depends on the distribution of $X$.
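For small $L$ and $N$ the decomposition above can be checked numerically by brute-force enumeration. The sketch below assumes a uniform input distribution over $\{0,1\}^L$ and toy parameters $L=2$, $N=3$, $p=0.1$ (all assumptions; the question leaves $P_X$ open). It computes $H(X|Y_{1:N})$ directly from the joint distribution and compares it with $H(X) + NLh_b(p) - H(Y_{1:N})$:

```python
import itertools
import math

def h_b(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

L, N, p = 2, 3, 0.1  # assumed toy parameters

# Uniform input X over {0,1}^L (an assumption; the answer holds for any P_X).
inputs = list(itertools.product([0, 1], repeat=L))
px = {x: 1.0 / len(inputs) for x in inputs}

def p_y_given_x(ys, x):
    """P(Y_{1:N} = ys | X = x): each of the N*L channel uses flips independently w.p. p."""
    prob = 1.0
    for y in ys:  # ys is a tuple of N length-L output tuples
        flips = sum(a != b for a, b in zip(x, y))
        prob *= (p ** flips) * ((1 - p) ** (L - flips))
    return prob

outputs = list(itertools.product(inputs, repeat=N))

# Joint distribution P(X, Y_{1:N}) and the marginal P(Y_{1:N}).
p_joint = {(x, ys): px[x] * p_y_given_x(ys, x) for x in inputs for ys in outputs}
p_y = {ys: sum(p_joint[(x, ys)] for x in inputs) for ys in outputs}

# H(X|Y_{1:N}) = H(X, Y_{1:N}) - H(Y_{1:N}), computed by direct enumeration.
H_joint = -sum(q * math.log2(q) for q in p_joint.values() if q > 0)
H_y = -sum(q * math.log2(q) for q in p_y.values() if q > 0)
H_x_given_y = H_joint - H_y

# Cross-check against the decomposition H(X) + N*L*h_b(p) - H(Y_{1:N}).
H_x = L  # uniform over 2^L inputs
print(H_x_given_y, H_x + N * L * h_b(p) - H_y)
```

The two printed values agree, illustrating the identity; the brute-force marginalization over $2^{NL}$ output sequences is also exactly why the $H(Y_{1:N})$ term resists simplification as $N$ and $L$ grow.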