I am working on this problem and having trouble starting.
Consider the regression model $$y_t = \beta_1 y_{t-1}+e_t$$ where $e_t$ is white noise with zero mean and variance $\sigma^2_e$. Assume that we observe $y_1, y_2, \ldots, y_n$ and consider the model above for $t = 2, 3, \ldots, n$. Show that the least squares estimator of $\beta_1$ is $$\hat{\beta}_1 = \frac{\sum_{t=2}^n y_ty_{t-1}}{\sum_{t=2}^n y^2_{t-1}}.$$
I am not familiar with regression analysis and came across this problem somewhat unexpectedly, so I tried to learn what least squares estimators are, and I ended up seeing things such as MMSEs and regression lines.
Honestly, I did not really get anywhere, except MAYBE this: find the MSE, then use calculus to minimize it?
Here I think the MSE should be
$$\begin{align} MSE[y_t,y_{t-1}] & = E[(y_t-\beta_1y_{t-1})^2] \\ & = E[y_t^2]-2\beta_1E[y_ty_{t-1}]+\beta^2_1E[y_{t-1}^2] \end{align}$$
I also considered the MLE method
$$L[\beta_1] = \prod_{i=2}^{n}\left(\beta_1y_{i-1}+e_i\right)$$
but the $+e_i$ term threw me off, and I am not sure this is the right path.
I would really appreciate your help.
Consider the regression model (an AR(1) model without a constant term): $$y_t=\beta_1y_{t-1}+e_t.$$

To estimate $\beta_1$, we can apply the least squares method, i.e. determine the value of $\beta_1$ that minimizes the sum of squared errors $$\sum_{t=2}^{n}e_{t}^{2}=\sum_{t=2}^{n}(y_t-\beta_1y_{t-1})^2.$$

Taking the derivative with respect to $\beta_1$ gives $$\frac{\partial}{\partial\beta_1}\sum_{t=2}^{n}(y_t-\beta_1y_{t-1})^2 =-2\sum_{t=2}^{n}(y_t-\beta_1y_{t-1})y_{t-1}.$$

Setting this equal to zero, $$-2\sum_{t=2}^{n}(y_t-\beta_1y_{t-1})y_{t-1}=0,$$ and solving for $\beta_1$, we find $$\hat{\beta}_1=\frac{\sum_{t=2}^{n}y_ty_{t-1}}{\sum_{t=2}^{n}y_{t-1}^2}.$$

Since the second derivative, $2\sum_{t=2}^{n}y_{t-1}^2$, is nonnegative, this critical point is indeed a minimum.
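If it helps to see the algebra confirmed numerically, here is a minimal Python sketch: it simulates an AR(1) series and checks that the closed-form ratio above matches what a generic least squares solver (`np.linalg.lstsq`) returns. The true $\beta_1 = 0.7$, the sample size, and the random seed are arbitrary choices for illustration, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed
beta1_true = 0.7                 # assumed true coefficient, for illustration
n = 500
e = rng.normal(0.0, 1.0, size=n)  # white noise with sigma_e = 1

# Simulate y_t = beta1 * y_{t-1} + e_t, starting from y_1 = e_1
y = np.zeros(n)
y[0] = e[0]
for t in range(1, n):
    y[t] = beta1_true * y[t - 1] + e[t]

# Closed-form least squares estimator: sum(y_t * y_{t-1}) / sum(y_{t-1}^2)
beta1_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

# Same regression solved generically, with y_{t-1} as the lone regressor
beta1_lstsq, *_ = np.linalg.lstsq(y[:-1, None], y[1:], rcond=None)

print(beta1_hat, beta1_lstsq[0])  # the two agree to machine precision
```

The two printed values coincide because the formula derived above is exactly the one-regressor, no-intercept least squares solution.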