Can somebody explain the calculations with arrows below? And I am sorry if I have placed my post in the wrong place.
Derivation of joint entropy $H(X,Y) = H(X) + H(Y|X)$
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
A parenthesis might have helped. What he apparently did was $$\log ( p(x) \cdot p(y|x) ) = \log (p(x)) + \log (p(y|x)),$$ splitting the sum into two parts.
As for the second arrow: he made no change to the second term of the expression, but the first term's factor $\log p(x)$ is independent of $y$, so the sum over $y$ can be carried out first, collapsing $\sum_y p(x,y)$ to the marginal $p(x)$.
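If it helps, both steps can be checked numerically on a small made-up joint distribution (the pmf values here are invented purely for illustration):

```python
import math

# A made-up joint pmf p(x, y) over X = {0, 1}, Y = {0, 1}.
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# Marginal p(x) = sum_y p(x, y)  -- the collapse used at the second arrow.
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}

# Conditional p(y|x) = p(x, y) / p(x).
p_y_given_x = {(x, y): p_xy[(x, y)] / p_x[x] for (x, y) in p_xy}

# First arrow: log(p(x) * p(y|x)) = log p(x) + log p(y|x), term by term.
for (x, y), p in p_xy.items():
    lhs = math.log(p_x[x] * p_y_given_x[(x, y)])
    rhs = math.log(p_x[x]) + math.log(p_y_given_x[(x, y)])
    assert abs(lhs - rhs) < 1e-12

# Second arrow: summing p(x, y) over y recovers the marginal p(x).
for x in (0, 1):
    assert abs(sum(p_xy[(x, y)] for y in (0, 1)) - p_x[x]) < 1e-12

print("both steps check out")
```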

So the first arrow is simply using the basic identity of logarithms, $\log(a\cdot b) = \log a + \log b$: $$-\sum_{x \in \mathbb{X}}\sum_{y \in \mathbb{Y}} p(x,y) \log(p(x)p(y|x)) = -\sum_{x \in \mathbb{X}}\sum_{y \in \mathbb{Y}} p(x,y) \log p(x) - \sum_{x \in \mathbb{X}}\sum_{y \in \mathbb{Y}} p(x,y)\log p(y|x).$$ And the second arrow is just using the definition of marginal probability, $P(X = x) = \sum_{y} P(X=x, Y= y)$: since $\log p(x)$ does not depend on $y$, it can be pulled out of the inner sum. Therefore: $$ -\sum_{x \in \mathbb{X}}\sum_{y \in \mathbb{Y}} p(x,y) \log p(x) = -\sum_{x \in \mathbb{X}}\log p(x)\sum_{y \in \mathbb{Y}}p(x,y) = -\sum_{x \in \mathbb{X}}p(x)\log p(x). $$
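Putting the two arrows together gives the full chain rule $H(X,Y) = H(X) + H(Y|X)$, which we can sanity-check numerically on a made-up joint pmf (the values below are invented, and logs are taken base 2 so entropies are in bits):

```python
import math

# Made-up joint pmf p(x, y) for a numeric check of H(X,Y) = H(X) + H(Y|X).
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}

# Joint entropy H(X,Y) = -sum_{x,y} p(x,y) log p(x,y).
h_xy = -sum(p * math.log2(p) for p in p_xy.values())

# Marginal entropy H(X) = -sum_x p(x) log p(x).
h_x = -sum(p * math.log2(p) for p in p_x.values())

# Conditional entropy H(Y|X) = -sum_{x,y} p(x,y) log p(y|x),
# using p(y|x) = p(x,y) / p(x).
h_y_given_x = -sum(p * math.log2(p / p_x[x]) for (x, _), p in p_xy.items())

print(abs(h_xy - (h_x + h_y_given_x)) < 1e-12)
```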