Proof of Correlation Coefficients


Good evening, I have a problem with an exercise:

Let $X$ and $Y$ be two real square-integrable random variables with var$X>0$, var$Y>0$. The correlation Corr$(X,Y)$ quantifies how close $X$ and $Y$ are to a linear relation $Y=aX+b$ for some $a,b\in\mathbb{R}$. Prove that $$E(a^*X+b^*-Y)^2:=\min_{a,b\in\mathbb{R}} E(aX+b-Y)^2 = (1-\mathrm{Corr}(X,Y)^2)\cdot \mathrm{var}\, Y$$ Remark: The best linear approximation of $Y$ by $X$ is $a^*X+b^*$, known as the linear regression of $Y$ on $X$.

I don't know how to prove it. Could you help me, please? I hope the exercise is clear.


On BEST ANSWER

We know that $a^*$ and $b^*$ are the least-squares regression coefficients, obtained by minimizing the quadratic $E[(aX+b-Y)^2]$ over $a$ and $b$:

$ a^* = \frac{cov(X,Y)}{Var(X)}$

$ b^* = E[Y] - a^* E[X]$
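For completeness, these coefficients follow from setting the partial derivatives of the objective to zero (a standard least-squares computation, not spelled out in the original answer):

```latex
\[
f(a,b) = E\big[(aX+b-Y)^2\big], \qquad
\frac{\partial f}{\partial b} = 2\,E[aX+b-Y] = 0
\;\Rightarrow\; b = E[Y] - a\,E[X],
\]
\[
\frac{\partial f}{\partial a} = 2\,E\big[X(aX+b-Y)\big] = 0
\;\Rightarrow\; a\left(E[X^2]-E[X]^2\right) = E[XY]-E[X]E[Y]
\;\Rightarrow\; a^* = \frac{\mathrm{cov}(X,Y)}{\mathrm{var}\,X}.
\]
```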

Plugging these values into $E[(a^{*}X + b^{*} - Y)^2]$ yields the identity: since $b^* = E[Y] - a^*E[X]$,
$$E\big[(a^*X+b^*-Y)^2\big] = E\Big[\big(a^*(X-E[X]) - (Y-E[Y])\big)^2\Big] = (a^*)^2\,\mathrm{var}\,X - 2a^*\,\mathrm{cov}(X,Y) + \mathrm{var}\,Y,$$
and substituting $a^* = \mathrm{cov}(X,Y)/\mathrm{var}\,X$ gives
$$\mathrm{var}\,Y - \frac{\mathrm{cov}(X,Y)^2}{\mathrm{var}\,X} = \left(1 - \mathrm{Corr}(X,Y)^2\right)\mathrm{var}\,Y.$$
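As a numerical sanity check (my own sketch, not part of the original answer), one can verify on simulated data that the closed-form coefficients attain a mean squared error equal to $(1-\mathrm{Corr}(X,Y)^2)\,\mathrm{var}\,Y$; the identity holds exactly for the empirical distribution when biased (1/n) sample moments are used throughout:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)  # linear signal plus independent noise

# Closed-form regression coefficients (biased sample moments, 1/n)
a_star = np.cov(x, y, bias=True)[0, 1] / np.var(x)
b_star = y.mean() - a_star * x.mean()

# Minimum mean squared error vs. the claimed identity
mse_opt = np.mean((a_star * x + b_star - y) ** 2)
rho = np.corrcoef(x, y)[0, 1]
claimed = (1.0 - rho**2) * np.var(y)

print(mse_opt, claimed)  # the two values agree up to floating-point error
```

The agreement is exact (not merely asymptotic) because the identity is an algebraic fact about any distribution, including the empirical one.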