Maximum correlation inequality


Let $ X $ and $ Y $ be two random variables and suppose $ A \subset B $ are two subsets of $ \mathbb{R} $. Let $ \rho(X, Y) $ denote the correlation between $ X $ and $ Y $. I am wondering whether $$ \sup_{g \; \text{monotone}}|\rho(g(X),Y \mid X \in A)| \geq \sup_{g \; \text{monotone}}|\rho(g(X),Y \mid X \in B)|. $$ This seems intuitively true, since the maximum absolute correlation between $ Y $ and a random variable of the form $ g(X) $ for $ g $ monotone should be larger when $ X $ is restricted to a smaller subset, but I'm not sure how to show it. Any thoughts?


There is 1 answer below.

BEST ANSWER

If I understand you correctly, when you write $\rho( g(X), Y \mid X \in A)$ you are talking about the correlation when both $X$ and $Y$ (i.e., the joint distribution of $(X, Y)$) are conditioned on the event $X \in A$, right?

If my understanding is correct, wouldn't a simple counter-example be a pair $X, Y$ that is independent conditionally on $X \in A$? In that case the left-hand side is $0$, but it is easy to make the right-hand side $> 0$.
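
For concreteness, here is a minimal numerical sketch of one such construction; the specific choices of $X$, $Y$, $A$, and $B$ below are mine, not part of the answer. Take $X$ uniform on $\{0,1,2\}$, $A = \{0,1\}$, $B = \{0,1,2\}$, and $Y = \mathbf{1}\{X = 2\} + Z$ with $Z$ independent standard normal noise, so that $Y$ is independent of $X$ conditionally on $X \in A$ but correlated with $X$ on $B$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical construction (not from the answer): X uniform on {0, 1, 2},
# A = {0, 1}, B = {0, 1, 2}, and Y = 1{X = 2} + Z with Z independent noise.
X = rng.integers(0, 3, size=n)
Z = rng.standard_normal(n)
Y = (X == 2).astype(float) + Z

def cond_corr(mask):
    """Sample correlation of X and Y restricted to the event given by mask."""
    return np.corrcoef(X[mask], Y[mask])[0, 1]

# Conditioned on X in A, Y = Z is independent of X, so g(X) and Y are
# uncorrelated for every monotone g: the supremum on the left is 0.
print("corr(X, Y | X in A):", cond_corr(X <= 1))   # approximately 0

# Conditioned on X in B (no restriction), the identity g already gives a
# strictly positive correlation, so the supremum on the right is > 0.
print("corr(X, Y | X in B):", cond_corr(X <= 2))   # approximately 0.37
```

The first printed correlation is approximately $0$ (up to sampling noise) and the second is approximately $0.37$, matching the claim that the left-hand supremum can be $0$ while the right-hand supremum is strictly positive.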