Let $g:\mathbb R \to (0,\infty)$ be strictly increasing, and let $X$ be a real-valued random variable with $g(X) \in \mathcal L^2$.
Show that $$\frac{E[Xg(X)]}{E[g(X)]} \ge E[X].$$
I tried working with the covariance of $X$ and $g(X)$, but I couldn't get to the final result.
Since the identity and $g$ are both strictly increasing, $X$ and $g(X)$ are comonotone, so their covariance is nonnegative (Chebyshev's association inequality). Hence $$ 0\leq \operatorname{Cov}(X,g(X))= \mathbb{E}[Xg(X)]-\mathbb{E}[g(X)]\,\mathbb{E}[X], $$ and dividing by $\mathbb{E}[g(X)]>0$ gives the claim.
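In case the nonnegativity of the covariance is not available as a known result, here is a sketch of the standard symmetrization argument; it assumes $X'$ denotes an independent copy of $X$ and that $Xg(X)$ is integrable. Since $g$ is increasing, $(x-x')\bigl(g(x)-g(x')\bigr)\ge 0$ for all $x,x'$, and therefore
$$
0 \le \mathbb{E}\bigl[(X-X')\bigl(g(X)-g(X')\bigr)\bigr]
= 2\bigl(\mathbb{E}[Xg(X)]-\mathbb{E}[X]\,\mathbb{E}[g(X)]\bigr)
= 2\operatorname{Cov}\bigl(X,g(X)\bigr),
$$
where the middle equality uses independence to split the cross terms, $\mathbb{E}[Xg(X')]=\mathbb{E}[X'g(X)]=\mathbb{E}[X]\,\mathbb{E}[g(X)]$, together with $\mathbb{E}[X'g(X')]=\mathbb{E}[Xg(X)]$.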