For two points $x < x'$ and a random variable $X$, we must have $E(X\mid X > x )\leq E(X\mid X > x' )$. This is "obviously" true because the center of the truncated distribution shifts to the right. How do I prove that?
I tried working with an iid copy $X^*$ of $X$ to show that the expectation of $X1(X>x)1(X^*>x')$ is smaller than the expectation of $X1(X>x')1(X^*>x)$ but I'm not having any luck with that.
All results I can find either focus on normality or assume densities.
Let $Y$ be an iid copy of $X$.
Notice that the following inequality holds pointwise: $$(X-Y)(1_{X>x'}1_{Y>x}-1_{Y>x'}1_{X>x})\geq 0.$$ Indeed, the second factor is nonzero only when exactly one of the two products equals $1$: if $1_{X>x'}1_{Y>x}=1$ and $1_{Y>x'}1_{X>x}=0$, then (since $X>x'>x$ forces $X>x$) we must have $Y\leq x'$, hence $X>x'\geq Y$; in the symmetric case $Y>x'\geq X$. Either way the product of the two factors is nonnegative.
Take expectations, using the independence of $X$ and $Y$ to factor each cross term, to find $$E(X1_{X>x'})P(Y>x)-E(X1_{X>x})P(Y>x')-E(Y1_{Y>x})P(X>x')+E(Y1_{Y>x'})P(X>x)\geq 0,$$
which, since $Y$ has the same distribution as $X$, rewrites as $$2E(X1_{X>x'})P(X>x)-2E(X1_{X>x})P(X>x')\geq 0.$$ Assuming $P(X>x')>0$ (otherwise the conditional expectation on the right of the original claim is undefined), we also have $P(X>x)\geq P(X>x')>0$, so dividing by $2P(X>x)P(X>x')$ gives $$\frac {E(X1_{X>x'})}{P(X>x')}\geq \frac {E(X1_{X>x})}{P(X>x)},$$ which is exactly $E(X\mid X>x')\geq E(X\mid X>x)$.
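Not part of the proof, but as a quick numerical sanity check of the conclusion, one can estimate the two truncated means $E(X1_{X>t})/P(X>t)$ by Monte Carlo. The sketch below uses an exponential sample and the illustrative thresholds $x=0.5$, $x'=1.0$; the helper name `truncated_mean` is my own.

```python
import random

random.seed(0)
# Draw a large i.i.d. sample from Exp(1); any distribution would do.
sample = [random.expovariate(1.0) for _ in range(200_000)]

def truncated_mean(xs, t):
    """Empirical E(X | X > t) = E(X 1_{X>t}) / P(X > t)."""
    tail = [v for v in xs if v > t]
    return sum(tail) / len(tail)

x, x_prime = 0.5, 1.0  # thresholds with x < x'
m_low = truncated_mean(sample, x)
m_high = truncated_mean(sample, x_prime)
print(m_low, m_high)
assert m_low <= m_high  # the monotonicity proved above
```

For the exponential distribution, memorylessness gives $E(X\mid X>t)=t+1$, so the two printed estimates should be close to $1.5$ and $2.0$, consistent with the inequality.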