Least squares regression of Y against X compared to X against Y?


Suppose that $X$ and $Y$ are mean zero, unit variance random variables. If least squares regression (without intercept) of $Y$ against $X$ gives a slope of $\beta$ (i.e. it minimises $\mathbb{E}\left[ (Y-\beta X)^2 \right]$), what is the slope of the regression of $X$ against $Y$?

Is my understanding correct that it doesn't matter which variable we regress on, because both random variables have identical variances?

$$\beta = \frac{\mathrm{Cov}(X,Y)}{\mathrm{Var}(X)} = \frac{\mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y]}{\mathrm{Var}(X)} = \frac{\mathbb{E}[XY]}{\mathrm{Var}(X)}$$

So, because $\mathbb{E}[XY] = \mathbb{E}[YX]$ and $\mathrm{Var}(X)=\mathrm{Var}(Y)$, $\beta$ is also the slope that minimises $\mathbb{E}\left[ (X-\beta Y)^2 \right]$.
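As a quick numerical sanity check of this symmetry (a sketch with hypothetical simulated data; the data-generating coefficients are arbitrary), one can standardise two correlated samples and compare the two no-intercept least squares slopes, each computed via the closed form $\hat\beta = \sum x_i y_i / \sum x_i^2$:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: generate correlated samples, then standardise
# each to sample mean 0 and sample variance 1.
n = 50_000
x = rng.standard_normal(n)
y = 0.5 * x + rng.standard_normal(n)
x = (x - x.mean()) / x.std()
y = (y - y.mean()) / y.std()

# No-intercept least squares slope: beta = sum(x*y) / sum(x*x).
beta_y_on_x = np.dot(x, y) / np.dot(x, x)  # regress y on x
beta_x_on_y = np.dot(y, x) / np.dot(y, y)  # regress x on y

# After standardisation the two denominators are equal (both n),
# so the two slopes coincide.
print(beta_y_on_x, beta_x_on_y)
```

Since both denominators equal $n$ after standardisation, the two slopes agree up to floating-point error.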

Have I understood this correctly? And is there a way to prove it using minimisation?



Best answer:

Your argument works fine: applying your formula for $\beta$ to either regression yields $\beta=\mathrm{Cov}(X,Y)$ in both cases.


Here is a derivation of the formula for $\beta$.

If $X$ and $Y$ have zero mean, then
\begin{align}
\mathbb{E}\left[(Y-\beta X)^2\right] &= \mathrm{Var}(Y) - 2 \beta \,\mathrm{Cov}(X,Y) + \beta^2 \,\mathrm{Var}(X)\\
&= \mathrm{Var}(Y) - \frac{\mathrm{Cov}(X,Y)^2}{\mathrm{Var}(X)} + \left(\beta \sqrt{\mathrm{Var}(X)} - \frac{\mathrm{Cov}(X,Y)}{\sqrt{\mathrm{Var}(X)}}\right)^2\\
&\ge \mathrm{Var}(Y) - \frac{\mathrm{Cov}(X,Y)^2}{\mathrm{Var}(X)},
\end{align}
with equality exactly when $\beta=\frac{\mathrm{Cov}(X,Y)}{\mathrm{Var}(X)}$ (i.e., this is the minimising $\beta$), which agrees with your formula.

Since $X$ has unit variance, this gives $\beta=\mathrm{Cov}(X,Y)$.

If you repeat the same computation for $\mathbb{E}\left[(X-\beta Y)^2\right]$ instead, you will again get $\beta=\frac{\mathrm{Cov}(X,Y)}{\mathrm{Var}(Y)}=\mathrm{Cov}(X,Y)$, because $Y$ has unit variance.
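The minimisation itself can also be checked numerically (a sketch with hypothetical simulated data and an arbitrary grid of candidate slopes): minimise the empirical versions of $\mathbb{E}[(Y-\beta X)^2]$ and $\mathbb{E}[(X-\beta Y)^2]$ by brute force and compare both minimisers to the empirical $\mathrm{Cov}(X,Y)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data with known correlation 0.6, then standardised
# to sample mean 0 and sample variance 1.
n = 100_000
x = rng.standard_normal(n)
y = 0.6 * x + 0.8 * rng.standard_normal(n)
x = (x - x.mean()) / x.std()
y = (y - y.mean()) / y.std()

cov_xy = np.mean(x * y)  # empirical Cov(X, Y) after standardisation

# Minimise both empirical mean squared errors over a grid of slopes.
bs = np.linspace(-1, 1, 2001)
loss_y_on_x = [np.mean((y - b * x) ** 2) for b in bs]
loss_x_on_y = [np.mean((x - b * y) ** 2) for b in bs]

b1 = bs[np.argmin(loss_y_on_x)]  # minimiser of E[(Y - bX)^2]
b2 = bs[np.argmin(loss_x_on_y)]  # minimiser of E[(X - bY)^2]
print(b1, b2, cov_xy)  # all three agree up to grid and sampling error
```

After standardisation both empirical losses reduce to $1 - 2\beta\,\widehat{\mathrm{Cov}}(X,Y) + \beta^2$, which is why the two minimisers coincide and sit at the empirical covariance.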