Simplifying expectation of square of sum


I am working on a problem (not for homework) where one step involves simplifying an expectation. The solution looks like this:

Call $p_\mu(X)$ the PDF of the variable distributed as $N(\mu, \sigma^2I)$. Let $E_0$ denote the expectation under $N(0, \sigma^2I)$.

Then $E_0\left[\left(1+q\left(\frac{p_\mu(X) - p_0(X)}{p_0(X)}\right)\right)^2\right] = 1+q^2E_0\left[\left(\frac{p_\mu(X) - p_0(X)}{p_0(X)}\right)^2\right]$.

I'm not able to derive this step; I tried expanding the square and am left with a middle term that does not seem to equal zero. I also tried $E[X^2] = \mathrm{Var}(X) + E[X]^2$, which left me with extra terms as well. Am I missing an identity here?

For context, the full solution is 6.2b here: http://web.stanford.edu/class/stats311/Exercises/2019-solutions.pdf

Accepted answer:

The reason is that the cross term from expanding the square vanishes. Expanding,

$$E_0\left[\left(1+q\,\frac{p_\mu(X) - p_0(X)}{p_0(X)}\right)^2\right] = 1 + 2q\,E_0\left[\frac{p_\mu(X) - p_0(X)}{p_0(X)}\right] + q^2\,E_0\left[\left(\frac{p_\mu(X) - p_0(X)}{p_0(X)}\right)^2\right],$$

and the linear (cross) term is

$$2q\,E_0\left[\frac{p_\mu(X) - p_0(X)}{p_0(X)}\right] = 2q\int \frac{p_\mu(x) - p_0(x)}{p_0(x)}\,p_0(x)\,dx = 2q\left(\int p_\mu(x)\,dx - \int p_0(x)\,dx\right).$$

Each integral is the total probability of its distribution, so their difference is $1 - 1 = 0$ and the cross term drops out, leaving exactly the claimed identity.
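As a quick sanity check, the identity can be verified by Monte Carlo in one dimension. This is a sketch with illustrative values $\mu = 0.5$, $\sigma = 1$, $q = 0.3$ (my choices, not from the original problem):

```python
# Monte Carlo check of E_0[(1 + q*r(X))^2] = 1 + q^2 * E_0[r(X)^2],
# where r = (p_mu - p_0) / p_0 and X ~ N(0, sigma^2).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, q = 0.5, 1.0, 0.3            # illustrative values, not from the problem
x = rng.normal(0.0, sigma, size=1_000_000)  # samples under the null N(0, sigma^2)

def pdf(x, m, s):
    """Density of N(m, s^2)."""
    return np.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))

r = (pdf(x, mu, sigma) - pdf(x, 0.0, sigma)) / pdf(x, 0.0, sigma)

lhs = np.mean((1 + q * r) ** 2)          # E_0[(1 + q*r)^2]
rhs = 1 + q ** 2 * np.mean(r ** 2)       # 1 + q^2 * E_0[r^2]

print(np.mean(r))   # cross term E_0[r]: close to 0
print(lhs, rhs)     # the two sides nearly agree
```

The gap between the two sides is exactly $2q$ times the empirical mean of $r$, so it shrinks to zero at the Monte Carlo rate.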