Gaussian integration by parts proof


So the result I am trying to show is that

Lemma: (Gaussian integration by parts) Let $X\sim \mathcal{N}(0,1)$. If $f$ is a differentiable function then

$$E[f'(X)] = E[Xf(X)]$$

This is Lemma 7.2.3 in Vershynin's book *High-Dimensional Probability*. If $f$ is compactly supported, the proof is quite straightforward. To finish the proof we can approximate a general $f$ by a sequence of compactly supported functions $f_n$, but I am a little unsure of the exact details.

Issue 1: Is it always true that we can find a sequence of compactly supported $C^1$ functions $f_n$ that converges uniformly to $f$ and whose derivatives also converge uniformly to $f'$? If we assume $f$ is continuously differentiable, is the statement then true?
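One candidate construction (just a sketch of what I have in mind, assuming $f\in C^1$) is to truncate with a smooth cutoff: fix a $C^\infty$ function $\varphi$ with $\varphi \equiv 1$ on $[-1,1]$ and $\varphi \equiv 0$ outside $[-2,2]$, and set

$$f_n(x) = f(x)\,\varphi(x/n), \qquad f_n'(x) = f'(x)\,\varphi(x/n) + \tfrac{1}{n}\, f(x)\,\varphi'(x/n).$$

Each $f_n$ is compactly supported and $C^1$, and $f_n \to f$, $f_n' \to f'$ pointwise. However, the convergence is uniform only on compact sets when $f$ is unbounded, which is part of why I am unsure the uniform-convergence argument below goes through in general (one may instead need dominated convergence under integrability assumptions such as $E|f'(X)| < \infty$ and $E|Xf(X)| < \infty$).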

Assuming we can find such functions $f_n$, the uniform convergence of the derivatives gives $E[f'_n(X)]\rightarrow E[f'(X)]$. Since each $f_n$ is compactly supported, we have $E[f'_n(X)]=E[Xf_n(X)]$. Then, since $f_n$ converges uniformly to $f$, for any $\epsilon>0$ there exists an $N$ such that $n>N$ implies $\|f_n-f\|_\infty < \epsilon$. Thus

$$\left|E[Xf_n(X)]-E[Xf(X)]\right| \leq E\left|Xf_n(X)-Xf(X)\right| \leq \epsilon\, E|X|$$

for $n>N$. Hence $E[Xf_n(X)]\rightarrow E[Xf(X)]$, and combining the two limits gives $E[f'(X)]=E[Xf(X)]$.
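As a quick numerical sanity check of the identity (not a substitute for the proof), one can compare both sides by Monte Carlo for a concrete choice of $f$, say $f=\sin$, so $f'=\cos$; here the common value is known exactly, $E[\cos X] = e^{-1/2}$:

```python
import numpy as np

# Monte Carlo check of Gaussian integration by parts:
# E[f'(X)] = E[X f(X)] for X ~ N(0,1), with f = sin, f' = cos.
# The exact common value is E[cos X] = e^{-1/2} (characteristic function).
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

lhs = np.cos(x).mean()        # estimates E[f'(X)]
rhs = (x * np.sin(x)).mean()  # estimates E[X f(X)]
exact = np.exp(-0.5)

print(lhs, rhs, exact)  # all three agree to a few decimal places
```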