Show that $\frac {E[Xg(X)]}{E[g(X)]} \ge E[X]$ when $g$ is strictly increasing


Let $g:\mathbb R \to (0,\infty)$ be strictly increasing, and let $X$ be a real-valued random variable with $g(X) \in \mathcal L^2$.

Show that $\frac {E[Xg(X)]}{E[g(X)]} \ge E[X]$.

I tried an approach using expected values and the covariance, but I can't arrive at the final result.


2 Answers

BEST ANSWER

Since the identity map and $g$ are both strictly increasing, the covariance of $X$ and $g(X)$ is nonnegative (this is Chebyshev's association inequality; the hint in the second answer gives a self-contained proof). Hence $$ 0\leq \operatorname{Cov}(X,g(X))= \mathbb{E}[Xg(X)]-\mathbb{E}[g(X)]\,\mathbb{E}[X], $$ and dividing by $\mathbb{E}[g(X)]>0$ yields the claim.
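As a quick numerical sanity check (a sketch I'm adding, not part of the answer): the Python snippet below estimates both sides by Monte Carlo, using the illustrative, hypothetical choices $X \sim \mathcal N(0,1)$ and $g = \exp$, which is strictly increasing and positive.

```python
import numpy as np

# Monte Carlo check of E[X g(X)] / E[g(X)] >= E[X].
# Illustrative assumptions (not from the original post):
# X ~ Normal(0, 1) and g(x) = exp(x).
rng = np.random.default_rng(seed=0)
x = rng.normal(size=1_000_000)
gx = np.exp(x)  # strictly increasing, positive

lhs = np.mean(x * gx) / np.mean(gx)   # estimate of E[X g(X)] / E[g(X)]
rhs = np.mean(x)                      # estimate of E[X]
print(f"lhs = {lhs:.4f}, rhs = {rhs:.4f}, lhs >= rhs: {lhs >= rhs}")
```

For these particular choices the exact values are $\mathbb{E}[Xe^X]/\mathbb{E}[e^X] = 1$ and $\mathbb{E}[X] = 0$, so the estimates should print approximately $1$ and $0$.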

ANSWER

This is a special case of a more general inequality.

Hint: Let $f$ and $g$ both be monotonically increasing functions. Let $X_1, X_2$ be i.i.d. copies of $X$ and consider the sign of $$ (g(X_1)-g(X_2))(f(X_1)-f(X_2)). $$
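Expanding the hint (a sketch I'm adding, not part of the original answer): the product is nonnegative pointwise, because $f$ and $g$ order $X_1$ and $X_2$ the same way. Taking expectations and using that $X_1, X_2$ are i.i.d. copies of $X$,

$$ 0 \le \mathbb{E}\big[(g(X_1)-g(X_2))(f(X_1)-f(X_2))\big] = 2\big(\mathbb{E}[f(X)g(X)] - \mathbb{E}[f(X)]\,\mathbb{E}[g(X)]\big) = 2\operatorname{Cov}(f(X),g(X)). $$

Taking $f(x)=x$ recovers $\operatorname{Cov}(X,g(X)) \ge 0$, which is exactly the fact the accepted answer uses.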