How to compute $H(X \mid Y)$ and $H(Y \mid X)$ when $Y$ is fully dependent on $X$?


Let $X$ be a RV with values in the set $S = \{1, \ldots, n\}$ and probability distribution $p_k$ for $k = 1, \ldots , n$. Let $T \subseteq S$ and suppose that $P(X \in T) = \alpha$. Let $Y$ be a RV with values in $\{0, 1\}$ and with $Y = 1$ if $X \in T$ and $Y = 0$ otherwise.

Compute $H(X\mid Y)$ and $H(Y \mid X)$.

I know that the formula for conditional entropy is as follows:

$$H(X \mid Y) = - \sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p_{xy} \log \biggl(\frac{p_{xy}}{p_y}\biggr)$$

But I do not see how to compute these in this case, since there are no concrete "numbers". What is clear to me, however, is that knowledge of $X$ gives total knowledge of $Y$, but I do not quite see how this helps here. Could you help me?


BEST ANSWER

You can calculate the conditional entropy by noting that $Y=1 \iff X\in T$, and thus

$$\begin{align}H(X|Y)&=-\biggl(\sum_{Y=0,\,x\notin T}+\sum_{Y=1,\,x\in T}\biggr)p_{XY}\log{\frac{p_{XY}}{p_Y}}\\&=-\sum_{i\in T}p_i\log\frac{p_i}{\alpha}-\sum_{i\notin T}p_i\log\frac{p_i}{1-\alpha}\\&=-\sum_{i\in S}p_i\log p_i+\alpha \log\alpha+(1-\alpha)\log(1-\alpha)\\&=H(X)-H(Y)\end{align}$$
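The identity $H(X\mid Y)=H(X)-H(Y)$ can be checked numerically. In this sketch the distribution `p` and the set `T` are hypothetical choices, used only to instantiate the setup from the question:

```python
import math

# Hypothetical example: n = 4, with T containing the first two outcomes.
p = [0.1, 0.2, 0.3, 0.4]       # P(X = k) for k = 1, ..., 4
T = {0, 1}                     # zero-based indices of T; Y = 1 iff X falls in T
alpha = sum(p[i] for i in T)   # P(Y = 1)

def H(dist):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

# Direct evaluation of H(X|Y) = -sum_{x,y} p(x,y) log(p(x,y)/p(y)).
# Since Y is determined by X, the only nonzero joint terms have p(x,y) = p(x).
H_X_given_Y = 0.0
for i, pi in enumerate(p):
    p_y = alpha if i in T else 1 - alpha
    H_X_given_Y -= pi * math.log2(pi / p_y)

print(H_X_given_Y)                    # agrees with H(X) - H(Y)
print(H(p) - H([alpha, 1 - alpha]))
```

Both printed values coincide, matching the derivation above.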

This concurs with our intuition that the outcomes of the variable $Y$ are giving us information about the value of the variable $X$, and therefore they should be reducing the entropy (the lack of information) of $X$.

Obviously, as pointed out in one of the comments, $Y$ is completely determined by $X$, and therefore $H(Y|X)=0$: for every pair with $p_{XY}>0$ we have $p_{XY}=p_X$, so each term $p_{XY}\log(p_{XY}/p_X)$ vanishes.


Another approach: Since $Y$ is a function of $X$, the pair $(X,Y)$ has the same entropy as $X$ alone, so $H(X,Y)=H(X)$.

Now the chain rule gives $H(Y\mid X)= H(X,Y)-H(X)=0$ and $$H(X\mid Y) = H(X,Y) - H(Y) = H(X) - H(Y),$$ where we calculate $$ \begin{align} -H(Y)&= P(Y=0)\log P(Y=0) + P(Y=1)\log P(Y=1)\\ &= P(X\not\in T)\log P(X\not\in T) + P(X\in T)\log P(X\in T)\\ &=(1-\alpha)\log(1-\alpha) + \alpha\log \alpha. \end{align} $$
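The chain-rule argument can also be verified numerically. The distribution `p` and set `T` below are again hypothetical; the key observation is that the nonzero joint masses of $(X,Y)$ are exactly the $p_i$, since each $x$ pairs with a single $y$:

```python
import math

# Chain-rule check for Y = f(X): H(X,Y) = H(X), hence H(Y|X) = 0
# and H(X|Y) = H(X) - H(Y). Example values are hypothetical.
p = [0.25, 0.25, 0.25, 0.25]   # P(X = k) for k = 1, ..., 4
T = {0}                        # zero-based indices of T; Y = 1 iff X falls in T
alpha = sum(p[i] for i in T)   # P(Y = 1)

def H(dist):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

# The joint distribution of (X, Y) puts mass p(x) on the single pair
# (x, 1_{x in T}), so its entropy equals H(X).
H_XY = H(p)                                  # H(X,Y) = H(X)
H_Y_given_X = H_XY - H(p)                    # chain rule: equals 0
H_X_given_Y = H_XY - H([alpha, 1 - alpha])   # chain rule: H(X) - H(Y)
```

This reproduces both conclusions of the answer: $H(Y\mid X)=0$ and $H(X\mid Y)=H(X)-H(Y)$.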