Show that $\hat{β}_1 = \dfrac{\sum_{i=1}^nx_iy_i}{\sum_{i=1}^nx_i^2}$ under the least squares optimality criterion.

I need to prove that $\hat{β}_1 = \dfrac{\sum_{i=1}^nx_iy_i}{\sum_{i=1}^nx_i^2}$. I have not seen this as a definition for $\hat{β}_1$ before and am having trouble even starting this proof, but it must have something to do with the least-squares normal equations and the least-squares estimators.

Any thoughts?


Denote the data by $(X_i, Y_i),$ for $i = 1, 2, \dots, n.$ The least-squares line is $\hat Y_i = \hat\beta_0 + \hat\beta_1 X_i.$ You need to minimize $Q = \sum_i(Y_i - \hat Y_i)^2.$

To do this, set the partial derivative of $Q$ with respect to $\hat \beta_1$ equal to $0$ and solve for $\hat \beta_1$ in terms of the $X_i$ and $Y_i.$ That is, solve $\frac{\partial Q}{\partial \hat \beta_1} = 0.$

Please check the context of this exercise carefully. It may be that your text has defined $x_i = (X_i - \bar X)$ and $y_i = (Y_i - \bar Y).$ This convention is especially common in the UK, Australia, and New Zealand.
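Under that centered convention, $\sum_i x_i = 0,$ so the intercept term drops out of the derivative and the algebra runs as follows (a sketch, assuming centering):

$$Q = \sum_{i=1}^n (y_i - \hat\beta_0 - \hat\beta_1 x_i)^2, \qquad
\frac{\partial Q}{\partial \hat\beta_1} = -2\sum_{i=1}^n x_i\,(y_i - \hat\beta_0 - \hat\beta_1 x_i).$$

Setting this to zero and distributing the sum,

$$\sum_{i=1}^n x_i y_i - \hat\beta_0 \underbrace{\sum_{i=1}^n x_i}_{=\,0} - \hat\beta_1 \sum_{i=1}^n x_i^2 = 0
\;\Longrightarrow\;
\hat\beta_1 = \frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n x_i^2}.$$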

Notes: When finding $\frac{\partial Q}{\partial \hat \beta_1}\!:\,$ (1) The data $X_i$ and $Y_i$ are treated as constants. (2) $\hat \beta_0$ is also treated as a constant. These comments are obvious, but in my experience temporarily forgetting (1) or (2) accounts for the majority of errors in the minimization procedure.
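As a sanity check, you can verify the formula numerically: center the data, apply $\hat\beta_1 = \sum x_i y_i / \sum x_i^2,$ and confirm that perturbing $\hat\beta_1$ in either direction increases $Q.$ The data below are made up for illustration.

```python
# Numerical check of beta1_hat = sum(x*y) / sum(x^2) on mean-centered data.
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(X)

xbar = sum(X) / n
ybar = sum(Y) / n
x = [Xi - xbar for Xi in X]   # centered predictors: x_i = X_i - X-bar
y = [Yi - ybar for Yi in Y]   # centered responses: y_i = Y_i - Y-bar

beta1 = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
beta0 = ybar - beta1 * xbar   # intercept recovered from the means

def Q(b0, b1):
    """Sum of squared residuals for the line Y-hat = b0 + b1 * X."""
    return sum((Yi - b0 - b1 * Xi) ** 2 for Xi, Yi in zip(X, Y))

# Q should not decrease when beta1 is nudged either way.
assert Q(beta0, beta1) <= Q(beta0, beta1 + 1e-4)
assert Q(beta0, beta1) <= Q(beta0, beta1 - 1e-4)
print(round(beta1, 4))
```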