According to the Wikipedia article on conditional entropy, $\sum_{x,y} p(x,y)\log p(x)=\sum_x p(x)\log p(x)$. Can someone please explain how?
Probability and Entropy

Asked by user23844 (https://math.techqa.club/user/user23844/detail) on 2026-03-31

There are 3 best solutions below
I find the notation irritating: the same letter, $p$, is used to refer to two or more different functions (the joint distribution and the marginal).
I could write an algebraic proof, but I wonder if a concrete example might shed more light. Suppose we have $$ \begin{align} P(X=0\ \&\ Y=0) = 1/10 & & P(X=0\ \&\ Y=1) = 2/10 \\ \\ P(X=1\ \&\ Y=0) = 3/10 & & P(X=1\ \&\ Y=1) = 4/10 \end{align} $$ Then $P(X=0)=1/10+2/10=3/10$ and $P(X=1)=3/10+4/10=7/10$. So the first sum above is $$ \underbrace{(1/10)\log (3/10) + (2/10)\log(3/10)} + \underbrace{(3/10)\log(7/10) + (4/10)\log(7/10) }. $$ The second sum above is $$ (3/10)\log(3/10) + (7/10)\log(7/10). $$ So it's really just the distributive law.
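To see the arithmetic go through, here is a small sketch that evaluates both sums on the joint distribution of the example above (the dictionary `p` and the tolerance are my own choices, not part of the original answer):

```python
import math

# Joint distribution from the example: p[(x, y)] = P(X=x, Y=y)
p = {(0, 0): 1/10, (0, 1): 2/10, (1, 0): 3/10, (1, 1): 4/10}

# Marginals P(X=x) = sum over y of P(X=x, Y=y)
px = {0: p[(0, 0)] + p[(0, 1)],   # 3/10
      1: p[(1, 0)] + p[(1, 1)]}   # 7/10

# First sum: sum over (x, y) of p(x, y) * log p(x)
lhs = sum(pxy * math.log(px[x]) for (x, y), pxy in p.items())

# Second sum: sum over x of p(x) * log p(x)
rhs = sum(q * math.log(q) for q in px.values())

print(abs(lhs - rhs) < 1e-12)  # the two sums agree
```

Grouping the terms of `lhs` by the value of $x$, as in the braces above, is exactly where the distributive law is applied.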
$$\sum_{x,y} p(x,y)\log p(x)=\sum_x\sum_yp(x,y)\log p(x)=\sum_x\left(\sum_yp(x,y)\right)\log p(x)=\sum_x p(x)\log p(x)$$
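The chain of equalities above hinges only on $\sum_y p(x,y)=p(x)$, so it holds for any joint distribution. As a sanity check, the following sketch tests the identity on a randomly generated joint distribution (the sizes `nx`, `ny` and the seed are arbitrary choices of mine):

```python
import math
import random

random.seed(0)

# Build a random joint distribution p(x, y) on a 4-by-5 grid
nx, ny = 4, 5
w = [[random.random() for _ in range(ny)] for _ in range(nx)]
total = sum(sum(row) for row in w)
p = [[v / total for v in row] for row in w]

# Marginal p(x) = sum over y of p(x, y)
px = [sum(row) for row in p]

# Left side: sum over x, y of p(x, y) * log p(x)
lhs = sum(p[x][y] * math.log(px[x]) for x in range(nx) for y in range(ny))

# Right side: sum over x of p(x) * log p(x)
rhs = sum(q * math.log(q) for q in px)

print(abs(lhs - rhs) < 1e-12)  # equality holds for any joint distribution
```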