conditional entropies for identical distributions


Say I have two random variables $X$ and $Y$ that are identically distributed but not independent, and I want to calculate the conditional entropies $H(X|Y)$ and $H(Y|X)$. Is computing one joint entropy enough for me?

Regards, phani tej

Best answer:

From the definition of conditional entropy:

$$ H(Y|X) = \sum_{x} p_{X}(x)\, H(Y \mid X = x) = -\mathbb{E}[\log p(Y \mid X)] $$
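As a concrete sketch of this definition, here is a direct computation of $H(Y|X)$ from a joint pmf. The joint distribution used is a made-up example, not from the question:

```python
import numpy as np

# Hypothetical joint pmf p(x, y) over two binary variables (rows = x, cols = y).
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

def cond_entropy_y_given_x(joint):
    """H(Y|X) = sum_x p_X(x) * H(Y | X = x), in bits."""
    p_x = joint.sum(axis=1)                # marginal p_X(x)
    h = 0.0
    for x, px in enumerate(p_x):
        if px == 0:
            continue                       # skip zero-probability rows
        p_y_given_x = joint[x] / px        # conditional pmf p(y | X = x)
        nz = p_y_given_x > 0
        h += px * -np.sum(p_y_given_x[nz] * np.log2(p_y_given_x[nz]))
    return h

print(cond_entropy_y_given_x(joint))  # ≈ 0.7219 bits
```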

The chain rule states that: $$ H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y) $$

Subtracting the two forms gives $$ H(Y|X) - H(X|Y) = H(Y) - H(X) = 0, $$ since $X$ and $Y$ have the same distribution and therefore the same entropy. So yes: $H(X|Y) = H(Y|X)$, and computing the joint entropy $H(X,Y)$ together with the common marginal entropy gives you both at once. This is also clear from the symmetry.
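The equality can be checked numerically via the chain rule. The joint pmf below is a hypothetical example chosen so that both marginals are uniform (so $X$ and $Y$ are identically distributed) while $X$ and $Y$ remain dependent:

```python
import numpy as np

# Hypothetical symmetric joint pmf: both marginals are [0.5, 0.5],
# so X and Y are identically distributed, but they are not independent.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits of a pmf given as a flat array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

h_xy = entropy(joint.ravel())      # joint entropy H(X, Y)
h_x = entropy(joint.sum(axis=1))   # marginal entropy H(X)
h_y = entropy(joint.sum(axis=0))   # marginal entropy H(Y)

# Chain rule: H(Y|X) = H(X,Y) - H(X) and H(X|Y) = H(X,Y) - H(Y).
print(h_xy - h_x, h_xy - h_y)      # the two values agree, ≈ 0.7219 bits
```

Since $H(X) = H(Y)$ here, the two conditional entropies coincide, as the derivation above predicts.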