Consider the well-known fact that correlation is bounded between $-1$ and $1$:
$$ -1 \le \text{corr}(X,Y) = \frac{E[(X - E[X])(Y - E[Y])]}{\sigma_X \sigma_Y} \le 1. $$
I've been trying to build an intuition for why this is so.
Question: Is this because (or is it true that)
$$ \frac{E[(|X - E[X]|)(|Y - E[Y]|)]}{\sigma_X \sigma_Y} = 1? $$
(Notice the absolute value signs in the numerator).
Example: I notice that this holds when $X$ takes the values $\{1, 3\}$ and $Y$ takes the values $\{2, 6\}$, each uniformly. That is:
$$ \frac{\frac{(1-2) + (3-2)}{2} \cdot \frac{(2-4)+(6-4)}{2}}{1 \cdot 2} = \frac{0 \cdot 0}{2} = 0 $$
yet
$$ \frac{\frac{|(1-2)| + |(3-2)|}{2} \cdot \frac{|(2-4)|+|(6-4)|}{2}}{1 \cdot 2} = 1. $$
So is it true in general? If so, it would make it quite easy to see why correlation is bounded between $-1$ and $1$.
EDIT: The claim also seems to hold for uniform $\{1,5\}$ and $\{1,7\}$:
$$ \frac{\frac{(1-3) + (5-3)}{2} \cdot \frac{(1-4)+(7-4)}{2}}{2 \cdot 3} = \frac{0 \cdot 0}{6} = 0 $$
yet
$$ \frac{\frac{|(1-3)| + |(5-3)|}{2} \cdot \frac{|(1-4)|+|(7-4)|}{2}}{2 \cdot 3} = \frac{2 \cdot 3}{6} = 1. $$
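Both examples can be checked numerically with a small helper (a sketch in plain Python; it assumes the values are paired up as $(1,2),(3,6)$ and $(1,1),(5,7)$, each pair equally likely, which matches the computations above since the absolute deviations are constant in each example):

```python
import math

def abs_ratio(pairs):
    """E[|X - E[X]| * |Y - E[Y]|] / (sigma_X * sigma_Y) for equally likely (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    num = sum(abs(x - mx) * abs(y - my) for x, y in pairs) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pairs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for _, y in pairs) / n)
    return num / (sx * sy)

print(abs_ratio([(1, 2), (3, 6)]))  # 1.0
print(abs_ratio([(1, 1), (5, 7)]))  # 1.0
```

Both ratios come out to exactly $1$, matching the hand calculations.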
For a counterexample: let $X,Y$ be independent random variables with $X\sim \operatorname{Unif}\{-1,1\}$ and $Y\sim \operatorname{Unif}\{-1,0,1\}$.
Then we have $\mathbb{E}[X] = \mathbb{E}[Y] = 0$, $$\mathbb{E}[\lvert X\rvert] = \mathbb{E}[X^2] = \operatorname{Var} X = 1,$$ and $$\mathbb{E}[\lvert Y\rvert] = \mathbb{E}[Y^2] = \operatorname{Var} Y = \frac{2}{3}.$$ This leads to $$ \frac{\mathbb{E}[\lvert X\rvert\lvert Y\rvert]}{\sqrt{\operatorname{Var} X}\sqrt{\operatorname{Var} Y}} = \frac{\mathbb{E}[\lvert X\rvert]\cdot\mathbb{E}[\lvert Y\rvert]}{\sqrt{\operatorname{Var} X}\sqrt{\operatorname{Var} Y}} = \frac{\mathbb{E}[\lvert Y\rvert]}{\sqrt{\operatorname{Var} Y}} = \sqrt{\frac{2}{3}} \neq 1 $$ where the first equality follows from independence.