Grouping Property in Entropy


Can someone please walk me through the steps of the following question?

Question

What I tried: I understand the grouping property, and I thought of writing $H(X)$ as $H(I)+H(\{X_i\}\mid I)$, but I feel lost and am not sure this is the right direction.

BEST ANSWER

Note that (show it) $$ H(X)=H(I)+H(X\mid I)-H(I\mid X). $$
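One way to obtain this identity (a standard chain-rule argument, sketched here as the step you are asked to show) is to expand the joint entropy $H(X,I)$ in both orders:

$$ H(X,I)=H(X)+H(I\mid X)=H(I)+H(X\mid I), $$

and rearranging the second equality gives $H(X)=H(I)+H(X\mid I)-H(I\mid X)$.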

Also, $H(X\mid I=k)=H(X_k)$, therefore,

$$ H(X\mid I)=\sum_k \mathbb{P}(I=k)\, H(X_k). $$

Finally, if $\mathcal{X}_i \cap \mathcal{X}_j=\emptyset$ for all $i\neq j$, then knowledge of $X$ implies perfect knowledge of $I$, so $H(I\mid X)=0$ in this case. If the supports overlap, then you cannot always be certain about $I$ given the value of $X$, since you may observe a value that is common to two or more of the $X_k$ (and therefore corresponds to two or more values of $I$). In that case, $H(I\mid X)>0$.
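As a sanity check (not part of the original answer), here is a small numeric sketch with a hypothetical two-component mixture: with disjoint supports the grouping identity $H(X)=H(I)+\sum_k \mathbb{P}(I=k)H(X_k)$ holds with equality, while with overlapping supports $H(X)$ falls short of the grouped sum by exactly $H(I\mid X)>0$.

```python
from math import log2

def H(p):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def mixture(pI, components):
    """Marginal of X when I ~ pI selects the component X_k ~ components[k]."""
    pX = {}
    for k, pXk in components.items():
        for x, q in pXk.items():
            pX[x] = pX.get(x, 0.0) + pI[k] * q
    return pX

pI = {1: 0.5, 2: 0.5}  # fair choice of component (illustrative example)

# Disjoint supports: observing X pins down I, so H(I|X) = 0 and the
# grouping identity holds with equality.
disjoint = {1: {'a': 0.5, 'b': 0.5}, 2: {'c': 0.5, 'd': 0.5}}
lhs = H(mixture(pI, disjoint))
rhs = H(pI) + sum(pI[k] * H(pXk) for k, pXk in disjoint.items())
print(lhs, rhs)  # equal: H(X) = H(I) + sum_k P(I=k) H(X_k)

# Overlapping supports ('b' is shared by both components): now H(X) is
# strictly smaller than the grouped sum, and the gap is H(I|X) > 0.
overlap = {1: {'a': 0.5, 'b': 0.5}, 2: {'b': 0.5, 'c': 0.5}}
lhs2 = H(mixture(pI, overlap))
rhs2 = H(pI) + sum(pI[k] * H(pXk) for k, pXk in overlap.items())
print(lhs2, rhs2)  # lhs2 < rhs2; here the gap is H(I|X) = 0.5 bits
```

In the overlapping case, conditioned on $X=\mathrm{b}$ the index $I$ is a fair coin, so $H(I\mid X)=\mathbb{P}(X=\mathrm{b})\cdot 1 = 0.5$ bits, matching the gap the script reports.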