Least Squares Revisited


I am reading a paper on regression, and there seems to be a simple substitution, but I just cannot get my head around it.

My question is: how do you go from equations (3) and (4) to (5)? Please let me know your thoughts. Thank you!

Best answer:

From $(4)$ you have $x_i - X_i = -m(y_i - Y_i)$. Substituting this back into $(3)$, it can be seen that
\begin{align}
d^2 & = \sum_{i = 1}^n [m^2(y_i - Y_i)^2 + (y_i - Y_i)^2] \\
& = (1 + m^2)\sum_{i = 1}^n (y_i - Y_i)^2 \\
& = (1 + m^2)\sum_{i = 1}^n (Y_i - mx_i - c)^2, \tag{$*$}
\end{align}
where the last step uses $y_i = mx_i + c$.

Now you might want to express $x_i$ in terms of $X_i$ and $Y_i$. By $(4)$ again and $y_i = mx_i + c$, it follows that
$$x_i = X_i + mY_i - m(mx_i + c).$$
Solving this for $x_i$, we obtain
$$x_i = \frac{X_i + mY_i - mc}{1 + m^2}. \tag{$**$}$$
Substituting $(**)$ back into $(*)$, it can be seen that
\begin{align}
d^2 & = (1 + m^2)\sum_{i = 1}^n \left(Y_i - m\,\frac{X_i + mY_i - mc}{1 + m^2} - c\right)^2 \\
& = (1 + m^2)\sum_{i = 1}^n \left(\frac{Y_i - mX_i - c}{1 + m^2}\right)^2 \\
& = \frac{1}{1 + m^2}\sum_{i = 1}^n (Y_i - mX_i - c)^2,
\end{align}
as desired. (In the middle line, the numerator simplifies because $Y_i(1 + m^2) - m(X_i + mY_i - mc) - c(1 + m^2) = Y_i - mX_i - c$.)
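As a quick numerical sanity check (my own sketch, not from the paper): take an arbitrary line $y = mx + c$ and arbitrary data $(X_i, Y_i)$, compute each foot of the perpendicular $(x_i, y_i)$ via $(**)$, and compare the direct sum of squared perpendicular distances from $(3)$ with the closed form $(5)$. The specific numbers below are illustrative.

```python
import random

random.seed(0)
m, c = 1.7, -0.4                       # an arbitrary line y = m*x + c
X = [random.uniform(-5, 5) for _ in range(20)]
Y = [random.uniform(-5, 5) for _ in range(20)]

# Foot of the perpendicular from (X_i, Y_i) to the line, i.e. (**):
x = [(Xi + m * Yi - m * c) / (1 + m**2) for Xi, Yi in zip(X, Y)]
y = [m * xi + c for xi in x]           # the foot lies on the line

# Equation (3): sum of squared perpendicular distances, computed directly
d2_direct = sum((xi - Xi)**2 + (yi - Yi)**2
                for xi, Xi, yi, Yi in zip(x, X, y, Y))

# Equation (5): the closed form derived above
d2_closed = sum((Yi - m * Xi - c)**2 for Xi, Yi in zip(X, Y)) / (1 + m**2)

print(abs(d2_direct - d2_closed) < 1e-10)  # the two expressions agree
```

The two quantities match to floating-point precision for any choice of $m$, $c$, and data, which is exactly what the derivation guarantees.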

It might be worth pointing out that this type of "least squares" is not the conventional least squares we are familiar with. It is called orthogonal least squares regression: the distance from each observation point to the underlying regression line is measured perpendicularly, in contrast to vertically.
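To make the contrast concrete, here is a sketch (my own illustration, not from the paper) comparing the two slopes on synthetic data with noise in both coordinates. The orthogonal slope comes from minimizing $\frac{1}{1+m^2}\sum_i (Y_i - mX_i - c)^2$; setting the derivative in $m$ to zero (with $c = \bar{Y} - m\bar{X}$) gives the quadratic $S_{xy}m^2 + (S_{xx} - S_{yy})m - S_{xy} = 0$, whose relevant root appears below.

```python
import math
import random

random.seed(1)
# Points scattered around y = 2x + 1 with noise in BOTH coordinates,
# the setting where the two fits genuinely differ.
X = [i + random.gauss(0, 0.5) for i in range(30)]
Y = [2 * i + 1 + random.gauss(0, 0.5) for i in range(30)]

n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
Sxx = sum((xi - xbar)**2 for xi in X)
Syy = sum((yi - ybar)**2 for yi in Y)
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(X, Y))

# Conventional least squares: minimize vertical distances
m_ols = Sxy / Sxx

# Orthogonal least squares: minimize perpendicular distances;
# this is the minimizing root of Sxy*m^2 + (Sxx - Syy)*m - Sxy = 0
m_tls = (Syy - Sxx + math.sqrt((Syy - Sxx)**2 + 4 * Sxy**2)) / (2 * Sxy)

print(m_ols, m_tls)   # both near the true slope 2, but not identical
```

With errors in $X$ as well as $Y$, the vertical-distance slope is slightly attenuated toward zero, while the orthogonal fit treats both coordinates symmetrically; the gap grows as the noise in $X$ grows.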