What's the sum of all events? (not the sum of all probabilities of events)


I need to calculate the conditional entropy $h(X|Y)$, where $Y=X^2$. Since every $y>0$ has exactly two preimages $x=\pm\sqrt{y}$, I suppose that $\mathrm p(x|y)=\frac{1}{2}$. For the entropy

\begin{align} h(X|Y) &= \int\limits_y \mathrm p(y)\ h(X|Y=y)\ \mathrm d y \\ &= -\int\limits_y \mathrm p(y) \int\limits_x \mathrm p(x|y)\, \log \mathrm p(x|y) \ \mathrm d x \, \mathrm dy \\ &=-\int\limits_y \mathrm p(y) \int\limits_x \frac{1}{2} \, \log \frac{1}{2} \ \mathrm d x \, \mathrm dy \\ &=-\frac{1}{2} \, \log \frac{1}{2} \int\limits_y \mathrm p(y) \int\limits_x \mathrm d x \, \mathrm d y. \end{align} Now I'm stuck at the integral $\displaystyle \int\limits_{x=-\infty}^{\infty} \mathrm d x $, which diverges.
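As a sanity check on the assumption $\mathrm p(x|y)=\frac12$, here is a small numerical experiment with a hypothetical *discrete* symmetric distribution (the distribution and support are my own choice, not part of the question). In the discrete setting the conditional distribution of $X$ given $Y=y$ really does put mass $\frac12$ on each of $\pm\sqrt y$, and $H(X|Y)=\log 2$; the divergence above appears only when $\frac12$ is treated as a density over all of $\mathbb R$:

```python
import math

# Hypothetical symmetric pmf for X on {-2, -1, 1, 2}
p_X = {-2: 0.25, -1: 0.25, 1: 0.25, 2: 0.25}

# Marginal of Y = X^2: p_Y(y) = sum of p_X(x) over preimages x of y
p_Y = {}
for x, q in p_X.items():
    p_Y[x * x] = p_Y.get(x * x, 0.0) + q

# Conditional p(x|y) = p(x, y) / p_Y(y); the joint p(x, x^2) equals p_X(x)
cond = {x: q / p_Y[x * x] for x, q in p_X.items()}
print(cond)  # every conditional probability is 0.5

# H(X|Y) = -sum over (x, y) of p(x, y) log p(x|y); joint is supported on y = x^2
H_cond = -sum(p_X[x] * math.log(cond[x]) for x in p_X)
print(H_cond, math.log(2))  # H(X|Y) = log 2, i.e. one bit
```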

1 Answer

Two ways to approach this. One can use the quite general identity $$ H(X,Y)=H(X\mid Y)+H(Y)=H(Y\mid X)+H(X). $$ Together with the fact that, for every measurable function $u$, $H(u(X)\mid X)=0$, this yields $$ H(X\mid X^2)=H(X)-H(X^2). $$

Or, one can go back to the definition $$ H(X\mid Y)=\sum_{(x,y)}p(x,y)\log\left(\frac{p_Y(y)}{p(x,y)}\right), $$ which, when $Y=X^2$, using $p(x,x^2)=p_X(x)$ for every $x$ and $p_Y(x^2)=p_X(x)+p_X(-x)$ for every $x\ne0$, yields $$ H(X\mid X^2)=\sum_{x}p_X(x)\log\left(\frac{p_X(x)+p_X(-x)}{p_X(x)}\right), $$ for the same end result.
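Both routes can be checked numerically. The sketch below uses a hypothetical, deliberately asymmetric pmf (my own choice, purely for illustration) and verifies that the chain-rule expression $H(X)-H(X^2)$ agrees with the closed-form sum, with the $x=0$ term handled separately since $p_Y(0)=p_X(0)$:

```python
import math

# Hypothetical asymmetric pmf for X on {-2, -1, 0, 1, 2}
p_X = {-2: 0.1, -1: 0.2, 0: 0.15, 1: 0.25, 2: 0.3}

def H(p):
    """Shannon entropy (in nats) of a pmf given as {value: probability}."""
    return -sum(q * math.log(q) for q in p.values() if q > 0)

# Marginal of Y = X^2
p_Y = {}
for x, q in p_X.items():
    p_Y[x * x] = p_Y.get(x * x, 0.0) + q

# Route 1: chain rule, H(X | X^2) = H(X) - H(X^2)
chain = H(p_X) - H(p_Y)

# Route 2: closed-form sum over x; for x = 0 the ratio p_Y(0)/p_X(0) is 1
closed = 0.0
for x, q in p_X.items():
    num = q + p_X.get(-x, 0.0) if x != 0 else q
    closed += q * math.log(num / q)

print(chain, closed)  # the two routes agree
```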