Take two independent nonnegative continuous random variables $X$ and $Y$, with $X$ smaller than $Y$ in the likelihood ratio order. For a scalar $t>0$, is it true that
$$ E_Y(Y) - E_X(X) \geq E_Y(Y | Y > t) - E_X(X | X > t).$$
I have shown it for the following example: $X$ and $Y$ are both normally distributed with common standard deviation $\sigma$ and means $\mu_X$, $\mu_Y$ with $\mu_Y>\mu_X$. Surely, $Y\geq_{lr} X$. The statement becomes
$$ h((t - \mu_X)/\sigma) \geq h((t - \mu_Y)/\sigma),$$ where $h(x) = \frac{\phi(x)}{1 -\Phi(x)}$ is the hazard rate of the standard normal distribution. This is true because $h$ is increasing for the normal distribution ($h'(x)>0$) and $(t - \mu_X)/\sigma > (t - \mu_Y)/\sigma$.
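A quick numerical sanity check of this normal case (a sketch; the helper names are my own), using the identity $E(X \mid X > t) = \mu + \sigma\, h((t-\mu)/\sigma)$ for $X \sim N(\mu, \sigma^2)$:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def norm_pdf(x):
    # Standard normal density phi(x)
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def hazard(x):
    # h(x) = phi(x) / (1 - Phi(x)), the standard normal hazard rate
    return norm_pdf(x) / (1 - norm_cdf(x))

def cond_mean(mu, sigma, t):
    # E[X | X > t] = mu + sigma * h((t - mu)/sigma) for X ~ N(mu, sigma^2)
    return mu + sigma * hazard((t - mu) / sigma)

# Sample parameters with mu_y > mu_x, shared sigma
mu_x, mu_y, sigma = 0.0, 1.0, 1.0
for t in [0.5, 1.0, 2.0, 5.0]:
    lhs = mu_y - mu_x                                   # E[Y] - E[X]
    rhs = cond_mean(mu_y, sigma, t) - cond_mean(mu_x, sigma, t)
    assert lhs >= rhs, (t, lhs, rhs)
```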
For the general case, I guess that a sufficient condition would be
$$ \frac{\partial E_X(X | X > t)}{\partial t} \geq \frac{\partial E_Y(Y | Y > t)}{\partial t}.$$
But I have not been able to show that this is true, even when assuming the likelihood ratio order.
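(To spell out why that condition would suffice, assuming the conditional means are differentiable and continuous at $t = 0$, so that $E_X(X \mid X > 0) = E_X(X)$ for a nonnegative variable: integrating it from $0$ to $t$ gives
$$ E_X(X \mid X > t) - E_X(X) \geq E_Y(Y \mid Y > t) - E_Y(Y),$$
which rearranges to the conjectured inequality.)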
If I understand you correctly, then the main conjecture is false, and here is a counterexample. The basic idea is that both $X$ and $Y$ are below $t$ with very high probability, so the unconditional expected values are dominated by that region, while the behavior on the event $\{X > t\}$ (resp. $\{Y > t\}$) can be almost anything.
Fix some tiny $\epsilon > 0$, and define $X$ as a mixed variable:
First perform a $Bernoulli(\epsilon)$ trial.
With prob $\epsilon$ the trial succeeds and $X \sim 1 + Exp(\lambda = 2)$.
With prob $1 - \epsilon$ the trial fails and $X \sim Uniform(0,1)$.
$f_X(x) = 1 - \epsilon ~~\forall x \in (0,1)$ and $f_X(x) = \epsilon \times 2 e^{-2(x-1)} ~~\forall x \ge 1$.
$Y$ is independent of $X$ and is a similar mixed variable, with different parameters:
First perform a $Bernoulli(2\epsilon)$ trial.
With prob $2\epsilon$ the trial succeeds and $Y \sim 1 + Exp(\lambda = 1)$.
With prob $1 - 2\epsilon$ the trial fails and $Y \sim Uniform(0,1)$.
$f_Y(y) = 1 - 2\epsilon ~~\forall y \in (0,1)$ and $f_Y(y) = 2\epsilon \times e^{-(y-1)} ~~\forall y \ge 1$.
Now we verify that $X \le Y$ in the sense of likelihood ratio:
$\forall z \in (0,1): {f_X(z) \over f_Y(z)} = {1 - \epsilon \over 1 - 2\epsilon} > 1$
$\forall z \ge 1: {f_X(z) \over f_Y(z)} = {\epsilon \times 2 e^{-2(z-1)} \over 2\epsilon \times e^{-(z-1)}} = e^{-(z-1)} \le 1$ and is decreasing ($\to 0$).
So the likelihood ratio is non-increasing, as required. Now the expected values:
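This monotonicity is also easy to confirm numerically (a sketch with one concrete $\epsilon$; the function names are mine):

```python
import math

eps = 0.01  # the small mixing probability from the construction above

def f_X(x):
    # Mixture density: Uniform(0,1) w.p. 1-eps, 1 + Exp(2) w.p. eps
    if 0 < x < 1:
        return 1 - eps
    if x >= 1:
        return eps * 2 * math.exp(-2 * (x - 1))
    return 0.0

def f_Y(y):
    # Mixture density: Uniform(0,1) w.p. 1-2*eps, 1 + Exp(1) w.p. 2*eps
    if 0 < y < 1:
        return 1 - 2 * eps
    if y >= 1:
        return 2 * eps * math.exp(-(y - 1))
    return 0.0

# The ratio f_X(z)/f_Y(z) should be non-increasing in z
grid = [0.01 * k for k in range(1, 500)]
ratios = [f_X(z) / f_Y(z) for z in grid]
assert all(a >= b - 1e-12 for a, b in zip(ratios, ratios[1:]))
```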
LHS $= E[Y] - E[X] = \left({1 - 2\epsilon \over 2} + 2 \epsilon (1+1) \right) - \left({1 - \epsilon \over 2} + \epsilon (1 + \frac12) \right) = 2\epsilon \to 0$ as $\epsilon \to 0$.
For $t=1$: conditional on $X > 1$ we have $X \sim 1 + Exp(2)$, and conditional on $Y > 1$ we have $Y \sim 1 + Exp(1)$, so RHS $= E[Y \mid Y > 1] - E[X \mid X > 1] = (1+1) - (1 + \frac12) = \frac12 >$ LHS for small enough $\epsilon$.
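The arithmetic can be checked directly (a sketch; the variable names are mine):

```python
eps = 1e-4  # the small mixing probability from the construction above

# E[X]: Uniform(0,1) part has mean 1/2; the 1 + Exp(2) part has mean 1 + 1/2
EX = (1 - eps) * 0.5 + eps * 1.5
# E[Y]: Uniform(0,1) part has mean 1/2; the 1 + Exp(1) part has mean 1 + 1
EY = (1 - 2 * eps) * 0.5 + 2 * eps * 2.0

lhs = EY - EX          # equals 2*eps exactly, so it vanishes as eps -> 0
# Conditional on exceeding t = 1, X is exactly 1 + Exp(2) and Y is 1 + Exp(1)
rhs = 2.0 - 1.5        # = 1/2

assert lhs < rhs       # the conjectured inequality fails
```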
If you do not like the fact that the likelihood ratio is constant on $z\in (0,1)$, it should be possible to perturb the uniform distributions into something strictly decreasing. I did not attempt it, since it looked messy and did not add to the main insight. I also did not consider your last paragraph about the partial derivatives.