Let $X$, $Y$ be independent bounded integrable random variables and let $a$ be any constant such that $$P(X > Y + a) > 0$$
Is $$E[X | X > Y + a]$$ weakly increasing in $a$?
This thread shows it is true if $X$ and $Y$ are normal, and this thread shows it is true if $Y$ is deterministic. I have a counterexample showing it fails for general correlated $X$ and $Y$, so independence is essential. If anyone could provide a proof or counterexample I'd be very grateful!
If I understand your question correctly, the conjecture is false. Here is a simple counterexample:
$X = 1, 2$ with equal probability $1/2$.
$Y = 0, 2$ with probabilities $\epsilon$ and $1-\epsilon$ respectively, where $\epsilon$ is a very small positive number (think $\epsilon = 10^{-9}$).
So the event $X > Y - 0.5$ allows the sample points $(X,Y) = (2,0), (2,2), (1,0)$. Of these $3$ sample points, $(2,2)$ carries almost all of the conditional probability, because $Y = 2$ is far more likely than $Y = 0$. So $E[X | X > Y - 0.5] \approx 2$.
Meanwhile, the event $X > Y + 0.5$ allows only the sample points $(X,Y) = (2,0), (1,0)$, each with the same probability $\epsilon/2$, so $E[X | X > Y + 0.5] = (2+1)/2 = 1.5$. Thus increasing $a$ from $-0.5$ to $0.5$ strictly decreases the conditional expectation.
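For anyone who wants to check the arithmetic, here is a short sketch that enumerates the joint pmf and computes $E[X \mid X > Y + a]$ exactly with rational arithmetic (the distributions are the ones above; the helper name `cond_mean` is just for illustration):

```python
from fractions import Fraction

# X in {1, 2} with probability 1/2 each; Y in {0, 2} with
# probabilities eps and 1 - eps; X and Y independent.
eps = Fraction(1, 10**9)
pX = {1: Fraction(1, 2), 2: Fraction(1, 2)}
pY = {0: eps, 2: 1 - eps}

def cond_mean(a):
    """E[X | X > Y + a], by enumerating the four joint sample points."""
    num = den = Fraction(0)
    for x, px in pX.items():
        for y, py in pY.items():
            if x > y + a:
                num += x * px * py
                den += px * py
    return num / den

m_low = cond_mean(Fraction(-1, 2))   # a = -0.5: approximately 2
m_high = cond_mean(Fraction(1, 2))   # a = +0.5: exactly 3/2
print(float(m_low), float(m_high))
```

Running this confirms that the conditional mean drops from roughly $2$ at $a = -0.5$ to exactly $1.5$ at $a = 0.5$.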