Entropy is subadditive


Let $(X,\mathcal{B},\mu)$ be a probability space, and let $\alpha$ and $\beta$ be countable measurable partitions.

Let $H_\mu(\alpha)$ and $H_\mu(\beta)$ be the (Shannon) entropies of $\alpha$ and $\beta$.

Let $\alpha\vee\beta:=\{A\cap B\mid A\in\alpha,\ B\in\beta\}$.

Is it true that $H_\mu(\alpha\vee\beta)\leq H_\mu(\alpha)+H_\mu(\beta)$?

My attempt:

$$H_\mu(\alpha\vee\beta)=-\sum_{A\cap B\in\alpha\vee\beta}\mu(A\cap B)\log(\mu(A\cap B))$$

$$=-\sum_{B\in\beta}\sum_{A\in\alpha}\mu(A\cap B)\log(\mu(A\cap B)),$$

where the rearrangement of the sum is justified by absolute convergence.

But I have no idea how to continue from here.

I would be happy for a hint or advice.

Thanks in advance.


On BEST ANSWER

Let $\gamma:=\alpha\vee\beta$. First, note that \begin{align} I(\alpha,\beta):=\sum_{A\in\alpha}\sum_{B\in\beta}\mu(A\cap B)\log\!\left(\frac{\mu(A\cap B)}{\mu(A)\mu(B)}\right)=H(\alpha)+H(\beta)-H(\gamma), \end{align} where terms with $\mu(A\cap B)=0$ are interpreted as $0$. Thus, it suffices to show that $I(\alpha,\beta)\ge 0$. Applying Jensen's inequality to the concave function $\log$, with weights $\mu(A\cap B)$ (which sum to $1$), $$ \sum_{A\in\alpha}\sum_{B\in\beta}\mu(A\cap B)\log\!\left(\frac{\mu(A)\mu(B)}{\mu(A\cap B)}\right)\le \log\!\left(\sum_{A\in\alpha}\sum_{B\in\beta}\mu(A)\mu(B)\right)=\log(1)=0, $$ since $\sum_{A\in\alpha}\sum_{B\in\beta}\mu(A)\mu(B)=\Big(\sum_{A\in\alpha}\mu(A)\Big)\Big(\sum_{B\in\beta}\mu(B)\Big)=1$.
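As a quick numeric illustration of the inequality (not a substitute for the proof above), here is a sketch in Python that builds random partitions of a finite probability space and checks $H_\mu(\alpha\vee\beta)\le H_\mu(\alpha)+H_\mu(\beta)$; all names (`p`, `alpha`, `beta`, `H`) are my own for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 12                            # points in the finite space X
p = rng.dirichlet(np.ones(n))     # probability mass of each point

# A partition of {0,...,n-1} is encoded by labeling each point with
# the index of the block containing it.
alpha = rng.integers(0, 3, size=n)   # partition alpha, at most 3 blocks
beta = rng.integers(0, 4, size=n)    # partition beta, at most 4 blocks

def H(labels):
    """Shannon entropy H_mu of the partition given by `labels`."""
    masses = np.array([p[labels == k].sum() for k in np.unique(labels)])
    masses = masses[masses > 0]      # 0 * log 0 is treated as 0
    return -(masses * np.log(masses)).sum()

# The join alpha v beta: its blocks are the nonempty sets A ∩ B,
# i.e. points sharing both their alpha-label and their beta-label.
join = alpha * 10 + beta             # encode the pair of labels as one label

print(H(alpha), H(beta), H(join))
assert H(join) <= H(alpha) + H(beta) + 1e-12   # subadditivity
```

Since the join refines both partitions, one also expects $H(\alpha\vee\beta)\ge\max(H(\alpha),H(\beta))$, which this setup can check the same way.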