Is this inequality true: $E(A \mid A\leq B) \geq E(A \mid A\leq \min(B,C))$?


Let $A,B,C$ be any random variables (continuously distributed, if needed) such that $A$ is independent of $(B,C)$; $B$ and $C$ may be correlated. I want to show that (or find sufficient conditions under which)

$$ E(A \mid A\leq B) \geq E(A \mid A\leq \min(B,C)). $$

I expected this to be trivially true, but despite weeks of effort I could not prove it, to the point that I am no longer sure it holds.

Can anyone provide a proof or a counterexample? When proving it, feel free to make any regularity assumptions (continuously distributed random variables, finite moments, etc.) if necessary.
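One way to probe the inequality before attempting a proof is a quick Monte Carlo sketch. The construction below is a hypothetical choice of distributions (not from the question): $A \sim \mathrm{Unif}(0,10)$ independent of a two-point pair $(B,C)$ that is either $(1,0)$ or $(10,10)$ with equal probability, so that conditioning on $A \le \min(B,C)$ effectively discards the $B=1$ branch.

```python
import numpy as np

# Monte Carlo probe of E(A | A <= B) vs E(A | A <= min(B, C)).
# Hypothetical construction: A ~ Uniform(0, 10), independent of (B, C);
# (B, C) = (1, 0) or (10, 10) with probability 1/2 each, so the event
# {A <= min(B, C)} keeps only the B = 10 branch.
rng = np.random.default_rng(0)
n = 10**6

a = rng.uniform(0, 10, size=n)
branch = rng.integers(0, 2, size=n)        # 0 -> (B, C) = (1, 0); 1 -> (10, 10)
b = np.where(branch == 0, 1.0, 10.0)
c = np.where(branch == 0, 0.0, 10.0)

lhs = a[a <= b].mean()                     # estimates E(A | A <= B)
rhs = a[a <= np.minimum(b, c)].mean()      # estimates E(A | A <= min(B, C))
print(lhs, rhs)
```

For this particular construction the left-hand estimate lands near $2.525/0.55 \approx 4.59$ while the right-hand one lands near $5$, i.e. the left side comes out *smaller*, which suggests the inequality cannot hold without some extra condition on $(B,C)$.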