I'm trying to calculate $H(X, Y, Z)$ for a cascade of two binary symmetric channels (BSCs). I have worked out part of it, but I'm stuck on the rest. Could you please help me?
Here is the channel diagram:
$$H(X,Y,Z)=H(Y,Z)+H(X|Y,Z)$$ I calculated the first term $H(Y,Z)$ as shown below.
$$H(Y,Z)=H(Y)+H(Z|Y)$$ $$H(Y)=1/2$$ $$H(Z|Y) = -\sum_y p(y)\sum_z p(z|y)\log_2 p(z|y)$$ $$=-p(y=0)\,p(z=0|y=0)\log_2 p(z=0|y=0)$$ $$-p(y=0)\,p(z=1|y=0)\log_2 p(z=1|y=0)$$ $$-p(y=1)\,p(z=0|y=1)\log_2 p(z=0|y=1)$$ $$-p(y=1)\,p(z=1|y=1)\log_2 p(z=1|y=1)$$ $$=-\left[ 0.5(1-\varepsilon)\log_2(1-\varepsilon)+0.5(1-\varepsilon)\log_2(1-\varepsilon)+0.5\,\varepsilon\log_2\varepsilon+0.5\,\varepsilon\log_2\varepsilon\right]$$ $$=-(1-\varepsilon)\log_2(1-\varepsilon)-\varepsilon\log_2\varepsilon$$
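As a quick numerical sanity check (not part of the derivation), the $H(Z|Y)$ sum above can be evaluated directly; $\varepsilon = 0.1$ is an arbitrary assumed value, and $p(y)=1/2$ follows from a uniform input passed through a BSC:

```python
import math

eps = 0.1  # assumed crossover probability; any value in (0, 1) works

# With a uniform input, Y is also uniform after a BSC, so p(y) = 1/2.
p_y = {0: 0.5, 1: 0.5}
# BSC transition probabilities p(z | y), indexed as (z, y).
p_z_given_y = {(0, 0): 1 - eps, (1, 0): eps,
               (0, 1): eps, (1, 1): 1 - eps}

# H(Z|Y) = -sum_y p(y) sum_z p(z|y) log2 p(z|y)
H_Z_given_Y = -sum(
    p_y[y] * p_z_given_y[(z, y)] * math.log2(p_z_given_y[(z, y)])
    for y in (0, 1)
    for z in (0, 1)
)

closed_form = -(1 - eps) * math.log2(1 - eps) - eps * math.log2(eps)
print(H_Z_given_Y, closed_form)  # the two values agree
```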
How can I calculate the term below? I don't understand how to approach it. $$H(X|Y,Z)=?$$
This is a Markov chain $X \to Y \to Z$, so by the chain rule $$H(X,Y,Z) = H(Z|Y,X) + H(Y|X)+H(X)=H(Z|Y) + H(Y|X)+H(X),$$ where the last equality uses the Markov property $H(Z|Y,X)=H(Z|Y)$.
Now, $H(Y|X)=H(Z|Y)=h(\varepsilon)$, where $h(x)=-x \log_2 x - (1-x) \log_2(1-x)$ is the binary entropy function.
As for $H(X)$: that is determined not by the channel but by the input distribution. Assuming a uniform input, $H(X)=1$ bit (I suspect your $H(Y)=1/2$ is a mistake; with a uniform input, $Y$ is also uniform, so $H(Y)=1$ bit as well).
Hence, finally, $H(X,Y,Z)= 2 h(\varepsilon) +1$ bits.
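The final result can be verified numerically by enumerating the joint distribution of the cascade; this is a sketch with $\varepsilon = 0.1$ as an assumed value:

```python
import math

eps = 0.1  # assumed crossover probability

def h(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Enumerate p(x, y, z) for a uniform input X passed through
# two independent BSC(eps) stages (the Markov chain X -> Y -> Z).
H_joint = 0.0
for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            p = 0.5                            # p(x), uniform input
            p *= (1 - eps) if y == x else eps  # first BSC: p(y | x)
            p *= (1 - eps) if z == y else eps  # second BSC: p(z | y)
            H_joint -= p * math.log2(p)

print(H_joint, 2 * h(eps) + 1)  # the two values coincide
```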