Finding the conditional entropy on the sum of independent random variables


I have two independent random variables $X_1$ and $X_2$. I want to find the conditional differential entropy, defined as $$H(X_1+X_2\mid X_1)=\int_{X_1} \int_{X_2} p_{X_1,X_2}(x_1,x_2)\log\left(\frac{1}{p_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1)}\right) \, dx_1 \, dx_2,$$ where, by independence, $p_{X_1,X_2}(x_1,x_2) = p_{X_1}(x_1)\,p_{X_2}(x_2)$. To evaluate this I need a formula for $$p_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1).$$ I know that the conditional cumulative distribution function of $X_1+X_2$ given $X_1$ satisfies $$F_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1) = F_{X_2}(x_2),$$ but I do not know how to show that $p_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1) = p_{X_2}(x_2)$. Any help would be much appreciated. Thanks in advance.
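The density identity being asked about can be checked numerically. The sketch below uses two independent Gaussians as an illustrative (hypothetical) choice of distributions; it computes the conditional density of the sum from the joint density, $p_{X_1+X_2\mid X_1}(y\mid x_1) = p_{X_1,X_2}(x_1,\,y-x_1)/p_{X_1}(x_1)$, and compares it to the claimed shifted density $p_{X_2}(y-x_1)$.

```python
# Numerical sanity check that p_{X1+X2 | X1}(y | x1) = p_{X2}(y - x1)
# for two independent random variables with densities. Gaussians are
# used here purely as an example; the identity does not depend on them.
import numpy as np
from scipy.stats import norm

# Hypothetical example parameters
mu1, s1 = 0.0, 1.0   # X1 ~ N(0, 1)
mu2, s2 = 2.0, 0.5   # X2 ~ N(2, 0.25)

x1 = 0.7                          # conditioning value of X1
y = np.linspace(-1.0, 6.0, 200)   # grid of values for the sum X1 + X2

# Conditional density from the joint: p(y | x1) = p_{X1,X2}(x1, y - x1) / p_{X1}(x1).
# Independence gives p_{X1,X2}(x1, x2) = p_{X1}(x1) * p_{X2}(x2).
joint = norm.pdf(x1, mu1, s1) * norm.pdf(y - x1, mu2, s2)
cond = joint / norm.pdf(x1, mu1, s1)

# Claimed identity: the conditional density is just p_{X2} shifted by x1
claimed = norm.pdf(y - x1, mu2, s2)

print(np.max(np.abs(cond - claimed)))  # essentially zero
```

The factor $p_{X_1}(x_1)$ cancels exactly, which is the analytic content of the identity: conditioning on $X_1 = x_1$ leaves only the randomness of $X_2$, translated by $x_1$.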


1 Answer


Differentiate both sides of $$F_{X_1 + X_2 \mid X_1} (x_1 + x_2 \mid x_1) = F_{X_2}(x_2)$$ with respect to $x_2$.
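Spelling out the chain-rule step: on the left-hand side,

$$\frac{\partial}{\partial x_2} F_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1) = p_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1)\cdot\frac{\partial (x_1+x_2)}{\partial x_2} = p_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1),$$

since $\partial(x_1+x_2)/\partial x_2 = 1$, while on the right-hand side $\frac{\partial}{\partial x_2} F_{X_2}(x_2) = p_{X_2}(x_2)$. This gives $p_{X_1+X_2\mid X_1}(x_1+x_2\mid x_1) = p_{X_2}(x_2)$, and substituting it into the entropy integral reduces it to $H(X_1+X_2\mid X_1) = H(X_2)$.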