Derivation of joint discrete entropy in terms of conditional entropy


I am trying to prove this

$$H(X,Y)=H(X)+H(Y|X)$$

However, I am stuck on a certain part.

$$\begin{align} & -\sum\limits_{x}{\sum\limits_{y}{p(X,Y)\log (p(X,Y))}} \\ & =-\sum\limits_{x}{\sum\limits_{y}{p(X)p(Y|X)\log (p(X)p(Y|X))}} \\ & =-\sum\limits_{x}{\sum\limits_{y}{p(X)p(Y|X)(\log (p(X))+\log (p(Y|X)))}} \\ & =-\sum\limits_{x}{\sum\limits_{y}{p(X)p(Y|X)\log (p(X))}}-\sum\limits_{x}{\sum\limits_{y}{p(X)p(Y|X)\log (p(Y|X))}} \\ \end{align} $$

Let's call the first summation $a$ and the second summation $b$. I can easily convert $b$ to $H(Y|X)$ as follows.

$$\begin{align} & -\sum\limits_{x}{\sum\limits_{y}{p(X=x)p(Y=y|X=x)\log (p(Y=y|X=x))}} \\ & =-\sum\limits_{x}{p(X=x)\sum\limits_{y}{p(Y=y|X=x)\log (p(Y=y|X=x))}} \\ & =\sum\limits_{x}{p(X=x)H(Y|X=x)} \\ & =H(Y|X) \\ \end{align}$$

But I cannot reduce $a$ to $H(X)$:

$$\begin{align} & -\sum\limits_{x}{\sum\limits_{y}{p(X=x)p(Y=y|X=x)\log (p(X=x))}} \\ & =-\sum\limits_{x}{p(X=x)\log (p(X=x))\sum\limits_{y}{p(Y=y|X=x)}} \\ \end{align} $$

I am stuck here. What am I missing?

Accepted answer:

The missing final step is that the remaining inner sum runs over all outcomes of a conditional probability distribution, so it equals one: $\sum\limits_{y}{p(Y=y|X=x)}=1$. The expression then reduces to

$$-\sum\limits_{x}{p(X=x)\log (p(X=x))}=H(X),$$

which, combined with $b=H(Y|X)$, gives $H(X,Y)=H(X)+H(Y|X)$.
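As a sanity check, the identity can be verified numerically. Below is a minimal sketch in Python using an arbitrary made-up $2\times 3$ joint distribution (the numbers are illustrative, not from the question):

```python
import numpy as np

# Hypothetical joint distribution p(X, Y): rows index x, columns index y.
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.15, 0.20]])

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint entropy H(X, Y) over all (x, y) pairs.
h_xy = entropy(p_xy.ravel())

# Marginal entropy H(X) from the row sums p(x) = sum_y p(x, y).
p_x = p_xy.sum(axis=1)
h_x = entropy(p_x)

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x),
# where each row p_xy[i] / p_x[i] is the conditional distribution p(y | x).
h_y_given_x = sum(p_x[i] * entropy(p_xy[i] / p_x[i]) for i in range(len(p_x)))

# The chain rule H(X, Y) = H(X) + H(Y|X) holds up to floating-point error.
assert np.isclose(h_xy, h_x + h_y_given_x)
```

Note that each row of `p_xy / p_x[:, None]` sums to one, which is exactly the fact needed to finish the derivation of $a$ above.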