conditional entropy inequality


Does the "information can't hurt" inequality for conditional entropy $H(X)\ge H(X\mid Y)$ extend to $H(X\mid Y)\ge H(X\mid Y,Z)$?



Yes, of course. This is a consequence of mutual information being non-negative even when conditioned on another variable: $I(X;Z\mid Y) = H(X\mid Y) - H(X\mid Y,Z) \ge 0$.
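As a sanity check, here is a quick numerical sketch of the inequality on a random joint distribution of three binary variables (the helpers `entropy` and `marginal` are ad-hoc names, not from the answer). It uses the identities $H(X\mid Y) = H(X,Y) - H(Y)$ and $H(X\mid Y,Z) = H(X,Y,Z) - H(Y,Z)$:

```python
import math
import random
from collections import defaultdict

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint pmf {tuple: prob} onto the given coordinate axes."""
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[tuple(outcome[i] for i in axes)] += p
    return m

# A random joint pmf over binary X, Y, Z (coordinates 0, 1, 2).
random.seed(0)
outcomes = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

# H(X|Y) = H(X,Y) - H(Y);  H(X|Y,Z) = H(X,Y,Z) - H(Y,Z)
H_X_given_Y = entropy(marginal(joint, (0, 1))) - entropy(marginal(joint, (1,)))
H_X_given_YZ = entropy(joint) - entropy(marginal(joint, (1, 2)))

# Conditioning on more variables can only lower (or keep) the entropy.
assert H_X_given_Y >= H_X_given_YZ - 1e-12
```

The gap `H_X_given_Y - H_X_given_YZ` is exactly the conditional mutual information $I(X;Z\mid Y)$ from the identity above, which is why it is non-negative for any joint distribution you plug in.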

BTW: the "information can't hurt" motto is basically right, but it can be misunderstood. It might lead you to conclude that $I(R;S\mid T)\ge I(R;S)$ (knowledge of $T$ can't reduce the mutual information between $R$ and $S$ ... right? Wrong, it can — and conditioning can also increase it).
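The standard counterexample (my own illustration, not part of the original answer) takes $R,S$ to be independent fair bits and $T = R \oplus S$: then $I(R;S) = 0$, yet given $T$ the two bits determine each other, so $I(R;S\mid T) = 1$ bit. A short numerical check, reusing ad-hoc `entropy`/`marginal` helpers:

```python
import math
from collections import defaultdict

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint pmf {tuple: prob} onto the given coordinate axes."""
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[tuple(outcome[i] for i in axes)] += p
    return m

# R, S independent fair bits; T = R XOR S.  Coordinates: (R, S, T).
joint = {(r, s, r ^ s): 0.25 for r in (0, 1) for s in (0, 1)}

# I(R;S) = H(R) + H(S) - H(R,S)
I_RS = (entropy(marginal(joint, (0,))) + entropy(marginal(joint, (1,)))
        - entropy(marginal(joint, (0, 1))))

# I(R;S|T) = H(R,T) + H(S,T) - H(T) - H(R,S,T)
I_RS_given_T = (entropy(marginal(joint, (0, 2))) + entropy(marginal(joint, (1, 2)))
                - entropy(marginal(joint, (2,))) - entropy(joint))

print(I_RS)          # 0.0  — R and S are independent
print(I_RS_given_T)  # 1.0  — but fully dependent once T is known
```

So conditioning on $T$ raises the mutual information from 0 to 1 bit, which is exactly why the naive reading of "information can't hurt" fails for mutual information (it holds only for entropy).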