Elements of alphabets $X$ and $Y$ are statistically related. It is known that $H(X)=4$ bits and $H(Y) =11$ bits.
What is the range of variation of the conditional entropies $H(Y|X)$ and $H(X|Y)$, from minimum to maximum?
The short answers are as follows: \begin{align*} \max H(X|Y)&=4,\\ \max H(Y|X)&=11,\\ \min H(X|Y)&=0,\\ \min H(Y|X)&=7. \end{align*} To show the first two equalities, first note that $H(Y|X) \leq H(Y)=11$ and $H(X|Y) \leq H(X)=4$. We now give an example showing that these bounds are achievable. Suppose $X$ is a 4-bit random variable whose bits are uniform and mutually independent, and $Y$ is an 11-bit random variable whose bits are uniform and mutually independent, with $Y$ also independent of $X$. By independence, conditioning changes nothing, so $H(X|Y)=H(X)=4$ and $H(Y|X)=H(Y)=11$.
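The independence example above can be checked numerically. Below is a minimal sketch (the helper `cond_entropy` is my own, not from any library) that builds the joint pmf of an independent uniform 4-bit $X$ and 11-bit $Y$ and evaluates both conditional entropies directly from the definition $H(A|B)=-\sum_{a,b} p(a,b)\log_2 \frac{p(a,b)}{p(b)}$:

```python
from math import log2

def cond_entropy(joint):
    """H(A|B) in bits, for a joint pmf given as a dict {(a, b): p}."""
    # Marginal of the conditioning variable B.
    pb = {}
    for (a, b), p in joint.items():
        pb[b] = pb.get(b, 0.0) + p
    # H(A|B) = -sum p(a,b) log2( p(a,b) / p(b) )
    return -sum(p * log2(p / pb[b]) for (a, b), p in joint.items() if p > 0)

# Independent uniform X on 4 bits and Y on 11 bits: p(x, y) = 2^-15.
joint = {(x, y): 2.0 ** -15 for x in range(16) for y in range(2048)}

h_x_given_y = cond_entropy(joint)                                        # H(X|Y)
h_y_given_x = cond_entropy({(y, x): p for (x, y), p in joint.items()})   # H(Y|X)
print(h_x_given_y, h_y_given_x)  # → 4.0 11.0
```

As expected, conditioning on an independent variable leaves the entropy unchanged, so the upper bounds $H(X|Y)=4$ and $H(Y|X)=11$ are attained.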
To show the last two relations, first note that $H(X|Y)\geq 0$, and moreover $$H(Y|X)=H(X,Y)-H(X)=H(X|Y)+H(Y)-H(X)=7+H(X|Y) \geq 7.$$ We now show that these bounds are tight. Suppose $Y$ is an 11-bit random variable whose bits are uniform and mutually independent, and let $X$ be the first four bits of $Y$. Since $X$ is a deterministic function of $Y$, we have $H(X|Y)=0$, and hence $H(Y|X)=7+H(X|Y)=7$.
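This tightness example can be checked the same way. A minimal sketch (again with a hand-rolled `cond_entropy` helper, not a library function): take $Y$ uniform on $\{0,\dots,2^{11}-1\}$ and let $X$ be its top four bits, then evaluate both conditional entropies from the joint pmf:

```python
from math import log2

def cond_entropy(joint):
    """H(A|B) in bits, for a joint pmf given as a dict {(a, b): p}."""
    # Marginal of the conditioning variable B.
    pb = {}
    for (a, b), p in joint.items():
        pb[b] = pb.get(b, 0.0) + p
    # H(A|B) = -sum p(a,b) log2( p(a,b) / p(b) )
    return -sum(p * log2(p / pb[b]) for (a, b), p in joint.items() if p > 0)

# Y uniform on 11 bits; X = the top 4 bits of Y (a function of Y).
joint = {(y >> 7, y): 1 / 2048 for y in range(2048)}

h_x_given_y = cond_entropy(joint)                                        # H(X|Y)
h_y_given_x = cond_entropy({(y, x): p for (x, y), p in joint.items()})   # H(Y|X)
print(h_x_given_y, h_y_given_x)  # → 0.0 7.0
```

Knowing $Y$ determines $X$ exactly, so $H(X|Y)=0$; knowing $X$ still leaves the remaining 7 bits of $Y$ uniform, so $H(Y|X)=7$, matching the lower bounds.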