Help show that $E[h'(X)] = E\left[h(X)\,\frac{1+\ln X}{X}\right]$


$$f(x) = \frac{1}{x\sqrt{2\pi}}\exp\!\left(-\frac{(\ln x)^2}{2}\right)$$ is the PDF of the random variable $X$.

Let $h(x)$ be a known function and suppose that $h'(x)$ exists. Assuming all expectations exist, show that $$E[h'(X)] = E\left[h(X)\,\frac{1+\ln X}{X}\right].$$ (If needed, you may assume that $\lim_{x\to \infty} h(x)f(x) = \lim_{x\to 0} h(x)f(x) = 0$.)

I have started the proof: using the definition of expectation, I wrote $E[h'(X)]$ out as an integral, but I am not sure how that helps or how to get from $E[h'(X)]$ to $E\left[h(X)\,\frac{1+\ln X}{X}\right]$.

Any help would be great, thank you.




Take $Y := \ln(X)$. Since $f$ is the standard log-normal density, $Y$ has a standard normal distribution.
Take $g(y) := h(e^y)e^{-y}$; then $g(Y) = \frac{h(X)}{X}$ and $g'(y) = h'(e^y) - g(y)$.
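In case it helps, the chain-rule step behind $g'(y)$ written out in full (same definitions as above):

$$ g'(y) = \frac{d}{dy}\Big[h(e^y)\,e^{-y}\Big] = h'(e^y)\,e^y\,e^{-y} - h(e^y)\,e^{-y} = h'(e^y) - g(y). $$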
Thus, your equation is equivalent to $$ \mathbb{E}\big(g'(Y)+g(Y)\big) = \mathbb{E}\big( g(Y) (1+Y)\big)$$ or $$\mathbb{E}\big(g'(Y)\big) = \mathbb{E}\big( Y\, g(Y)\big),$$ which is exactly Stein's lemma: https://en.wikipedia.org/wiki/Stein%27s_lemma
If you want to reprove that lemma, integration by parts is enough (the stated boundary assumption on $h(x)f(x)$ kills the boundary terms).
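As a sanity check (not part of the proof), the identity can be verified numerically by Monte Carlo for a particular choice of $h$; here $h(x) = x^2$ is my illustrative choice, so $h'(x) = 2x$:

```python
import math
import random

# Monte Carlo check of E[h'(X)] = E[h(X)(1 + ln X)/X]
# where X = exp(Y), Y ~ N(0, 1), i.e. X is standard log-normal.
# Illustrative test function: h(x) = x^2, so h'(x) = 2x.
random.seed(42)
n = 1_000_000
lhs = rhs = 0.0
for _ in range(n):
    y = random.gauss(0.0, 1.0)
    x = math.exp(y)
    lhs += 2.0 * x          # h'(X) = 2X
    rhs += x * (1.0 + y)    # h(X)(1 + ln X)/X = X(1 + Y)
lhs /= n
rhs /= n
# Both averages should be close to E[2X] = 2*sqrt(e) ≈ 3.2974
print(lhs, rhs)
```

For this $h$ the common value is $2\,\mathbb{E}[e^Y] = 2e^{1/2}$, which you can also get from Stein's lemma with $g(y) = e^y$.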