Shannon entropy property


I need to prove the inequality $S(A|B)\leq S(A)$, where $S$ is the Shannon entropy, $S(B)=-\sum_{\beta}p(\beta)\ln p(\beta)$, and $S(A|B)=-\sum_{\alpha,\beta}p(\beta)\,p(\alpha|\beta)\ln p(\alpha|\beta)$, using the inequality $\ln x\leq x-1$. Wikipedia sent me to books that don't prove it, and I searched many other sites without finding a proof.
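For reference, the hint can be used directly. A sketch of that route (assuming all conditional probabilities involved are nonzero, and using $p(\alpha,\beta)=p(\beta)\,p(\alpha|\beta)$ and $p(\alpha)=\sum_\beta p(\alpha,\beta)$):

$$
S(A|B)-S(A)
=\sum_{\alpha,\beta}p(\alpha,\beta)\,\ln\frac{p(\alpha)}{p(\alpha|\beta)}
\leq\sum_{\alpha,\beta}p(\alpha,\beta)\left(\frac{p(\alpha)}{p(\alpha|\beta)}-1\right)
=\sum_{\alpha,\beta}p(\beta)\,p(\alpha)-1
=0,
$$

where the middle step applies $\ln x\leq x-1$ with $x=p(\alpha)/p(\alpha|\beta)$.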


I'm not sure how to use the given inequality here, but one way to show the claim is via Jensen's inequality. Notice that

$$H(x) = -\sum_i x_i \log x_i$$

is concave in $x$. Rewriting your expression,

$$S(A|B) = \sum p(\beta) H(p(A|\beta))$$

and applying Jensen's inequality to the concave function $H$ gives
$$S(A|B)=\sum_\beta p(\beta)\,H(p(A|\beta))\leq H\!\left(\sum_\beta p(\beta)\,p(A|\beta)\right)=H(p(A))=S(A),$$
which is the claim.
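As a sanity check, the inequality is easy to verify numerically. The sketch below (names like `S_A_given_B` are my own, not from the question) draws a random joint distribution, computes both entropies from the definitions above, and confirms $S(A|B)\leq S(A)$:

```python
import math
import random

random.seed(0)

# Random joint distribution p(alpha, beta) on a 4x3 grid.
n_a, n_b = 4, 3
w = [[random.random() for _ in range(n_b)] for _ in range(n_a)]
total = sum(sum(row) for row in w)
p = [[x / total for x in row] for row in w]

# Marginals p(alpha) and p(beta).
p_a = [sum(p[a][b] for b in range(n_b)) for a in range(n_a)]
p_b = [sum(p[a][b] for a in range(n_a)) for b in range(n_b)]

def h(dist):
    """Shannon entropy in nats, skipping zero-probability terms."""
    return -sum(x * math.log(x) for x in dist if x > 0)

S_A = h(p_a)

# S(A|B) = sum_beta p(beta) * H(p(A|beta)), with p(alpha|beta) = p(alpha,beta)/p(beta).
S_A_given_B = sum(
    p_b[b] * h([p[a][b] / p_b[b] for a in range(n_a)])
    for b in range(n_b)
)

print(S_A, S_A_given_B)
```

Repeating this with other seeds or grid sizes never produces a violation, as the Jensen argument predicts.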