Is it true that $\frac{E[(|X - E[X]|)(|Y - E[Y]|)]}{\sigma_X \sigma_Y} = 1$?


Consider the well-known fact that correlation is bounded between $-1$ and $1$:

$$ -1 \le \text{corr}(X,Y) = \frac{E[(X - E[X])(Y - E[Y])]}{\sigma_X \sigma_Y} \le 1. $$

I've been trying to wrap my mind intuitively around why this is so.

Question: Is this because (or is it true that)

$$ \frac{E[(|X - E[X]|)(|Y - E[Y]|)]}{\sigma_X \sigma_Y} = 1? $$

(Notice the absolute value signs in the numerator).

Example: I notice that this holds in the case where $X$ takes the values $\{1, 3\}$ and $Y$ takes the values $\{2, 6\}$, each uniformly. That is:

$$ \frac{\frac{(1-2) + (3-2)}{2} \cdot \frac{(2-4)+(6-4)}{2}}{1 \cdot 2} = \frac{0 \cdot 0}{2} = 0 $$

yet

$$ \frac{\frac{|(1-2)| + |(3-2)|}{2} \cdot \frac{|(2-4)|+|(6-4)|}{2}}{1 \cdot 2} = 1. $$

So is it true in general? If so, this would make it quite easy to see why correlation is bounded between $-1$ and $1$.

EDIT: The claim also seems to work on uniform $\{1,5\}$ and $\{1,7\}$:

$$ \frac{\frac{(1-3) + (5-3)}{2} \cdot \frac{(1-4)+(7-4)}{2}}{2 \cdot 3} = \frac{0 \cdot 0}{6} = 0 $$

yet

$$ \frac{\frac{|(1-3)| + |(5-3)|}{2} \cdot \frac{|(1-4)|+|(7-4)|}{2}}{2 \cdot 3} = \frac{2 \cdot 3}{6} = 1. $$
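Both of the computations above can be checked numerically. This sketch reproduces the question's arithmetic: the numerator is taken as the product of the two mean absolute deviations, exactly as in the displays (the function name `ratio` is just for illustration):

```python
import numpy as np

def ratio(xs, ys):
    # Product of the two mean absolute deviations, divided by the
    # product of the (population) standard deviations, as in the post.
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    mad_x = np.mean(np.abs(xs - xs.mean()))
    mad_y = np.mean(np.abs(ys - ys.mean()))
    return mad_x * mad_y / (xs.std() * ys.std())

print(ratio([1, 3], [2, 6]))  # 1.0
print(ratio([1, 5], [1, 7]))  # 1.0
```

Note that for any two-point uniform distribution the mean absolute deviation equals the standard deviation, which is why these examples all give exactly $1$.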

3 Answers

BEST ANSWER

For a counterexample: let $X,Y$ be independent random variables with $X\sim \operatorname{Unif}\{-1,1\}$ and $Y\sim \operatorname{Unif}\{-1,0,1\}$.

Then we have $\mathbb{E}[X] = \mathbb{E}[Y] = 0$, $$\mathbb{E}[\lvert X\rvert] = \mathbb{E}[X^2] = \operatorname{Var} X = 1,$$ and $$\mathbb{E}[\lvert Y\rvert] = \mathbb{E}[Y^2] = \operatorname{Var} Y = \frac{2}{3}.$$ This leads to $$ \frac{\mathbb{E}[\lvert X\rvert\lvert Y\rvert]}{\sqrt{\operatorname{Var} X}\sqrt{\operatorname{Var} Y}} = \frac{\mathbb{E}[\lvert X\rvert]\cdot\mathbb{E}[\lvert Y\rvert]}{\sqrt{\operatorname{Var} X}\sqrt{\operatorname{Var} Y}} = \frac{\mathbb{E}[\lvert Y\rvert]}{\sqrt{\operatorname{Var} Y}} = \sqrt{\frac{2}{3}} \neq 1 $$ where the first equality follows from independence.
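The counterexample can be verified exactly by enumerating the six equally likely pairs of the joint distribution (a quick sketch, not part of the original answer):

```python
from itertools import product
from math import sqrt

# Independent X ~ Unif{-1,1}, Y ~ Unif{-1,0,1}: each of the
# 2 * 3 = 6 pairs (x, y) has probability 1/6.
pairs = list(product([-1, 1], [-1, 0, 1]))
p = 1 / len(pairs)

ex = sum(x for x, _ in pairs) * p                    # E[X] = 0
ey = sum(y for _, y in pairs) * p                    # E[Y] = 0
var_x = sum((x - ex) ** 2 for x, _ in pairs) * p     # Var X = 1
var_y = sum((y - ey) ** 2 for _, y in pairs) * p     # Var Y = 2/3

num = sum(abs(x - ex) * abs(y - ey) for x, y in pairs) * p  # E|X||Y| = 2/3
ratio = num / (sqrt(var_x) * sqrt(var_y))
print(ratio, sqrt(2 / 3))  # both ≈ 0.8165, so the ratio is not 1
```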

ANSWER

This is just the Cauchy–Schwarz inequality: \begin{align}\big|E[(X-E[X])(Y-E[Y])]\big| &= \left| \int_\Omega (X(\omega) - e_x) \, (Y(\omega) - e_y) \, \mathrm dP(\omega) \right| \\&\le \left( \int_\Omega (X(\omega) - e_x)^2 \, \mathrm dP(\omega) \right)^{1/2} \, \left( \int_\Omega (Y(\omega) - e_y)^2 \, \mathrm dP(\omega) \right)^{1/2} = \sigma_X \, \sigma_Y\end{align} with $e_x = E[X]$ and $e_y = E[Y]$. Dividing by $\sigma_X \sigma_Y$ gives $-1 \le \operatorname{corr}(X,Y) \le 1$.
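A quick numerical sanity check of this bound (a sketch, assuming nothing beyond finite samples): however the two samples are related, the sample correlation never leaves $[-1, 1]$.

```python
import numpy as np

# Draw many paired samples with varying degrees of linear dependence
# and confirm the sample correlation always lies in [-1, 1].
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=50)
    y = rng.normal(size=50) + rng.uniform(-2, 2) * x
    r = np.corrcoef(x, y)[0, 1]
    assert abs(r) <= 1.0 + 1e-12
print("all sample correlations lie in [-1, 1]")
```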

ANSWER

Suppose that $X,Y$ both have mean $0$ and variance $1$. Then your claim takes the form: $$\mathbb E|XY|=1$$

If $X,Y$ are moreover independent then it takes the form:$$\mathbb E|X|\mathbb E|Y|=1$$

If moreover $X,Y$ are identically distributed then it takes the form: $$\mathbb E|X|=1$$

Can you find a random variable $X$ having mean $0$, variance $1$ and $\mathbb E|X|\neq1$?

(There are plenty)

Then you have found a counterexample.
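One of the "plenty" can be checked by simulation (a sketch, not part of the original answer): a standard normal $X$ has mean $0$ and variance $1$, but $\mathbb E|X| = \sqrt{2/\pi} \approx 0.798 \neq 1$.

```python
import numpy as np

# Standard normal X: mean 0, variance 1, yet E|X| = sqrt(2/pi) != 1.
rng = np.random.default_rng(1)
x = rng.normal(size=1_000_000)
print(x.mean(), x.var())        # ≈ 0 and ≈ 1
print(np.abs(x).mean())         # ≈ 0.798
print(np.sqrt(2 / np.pi))       # ≈ 0.798, the exact value of E|X|
```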