Proving the correlation coefficient is $1$ or $-1$ given $X$ and $Y = a + bX$


Given $X$ and $Y = a+bX$, I have to prove that:

If $b \lt 0$, then $\rho = -1$. If $b \gt 0$, then $\rho = 1$.

I've gotten to the point where I have:

$$ \rho = \frac{b \cdot \sigma_x }{ \sqrt{\sigma_y^2} }$$

I need to show that $\sigma_y^2 = b^2 \sigma_x^2$.

Can anyone please explain to me why that is the case?

I tried rewriting $\sigma_y^2$ as $E[(a+bX)(a+bX)] - E^2[a+bX]$ and got
$$a^2 + 2abE[X] + b^2E[X^2] - a^2 - b^2E^2[X],$$
which became $2ab E[X] + b^2 \sigma_x^2$.

What is the extra $2abE[X]$ term? Is that supposed to go away somehow?

Thanks.

Best answer:

${\rm Var}(Y) = {\rm Var}(a + bX) = {\rm Var}(bX) = b^2\,{\rm Var}(X)$.

As for the extra $2abE[X]$ term: it does go away, but the cancellation happens in the square of the mean, which you expanded incompletely. Since
$$E^2[a+bX] = (a + bE[X])^2 = a^2 + 2abE[X] + b^2E^2[X],$$
the $2abE[X]$ terms cancel, leaving $\sigma_y^2 = b^2\left(E[X^2] - E^2[X]\right) = b^2\sigma_x^2$. Plugging this into your expression,
$$\rho = \frac{b\,\sigma_x}{\sqrt{b^2\sigma_x^2}} = \frac{b\,\sigma_x}{|b|\,\sigma_x} = \frac{b}{|b|},$$
which is $1$ when $b \gt 0$ and $-1$ when $b \lt 0$.
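As a quick numerical sanity check (not a substitute for the proof), here is a minimal sketch assuming NumPy is available: for exactly linear data $Y = a + bX$, the sample correlation comes out as $+1$ for $b \gt 0$ and $-1$ for $b \lt 0$, up to floating-point error. The values of `a` and `b` below are arbitrary choices for illustration.

```python
import numpy as np

# Draw an arbitrary sample for X, then build Y = a + b*X exactly.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
a = 3.0

for b in (2.5, -0.7):
    y = a + b * x
    # Sample correlation between X and Y; should be sign(b).
    rho = np.corrcoef(x, y)[0, 1]
    print(b, round(rho, 6))  # → 2.5 1.0, then -0.7 -1.0
```

Changing `a` has no effect on the result, which mirrors the fact that the constant drops out of ${\rm Var}(a + bX)$.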