Question:
Consider the following linear regression with a single unknown parameter (the intercept $\beta_{0}^{*}$ is known):
$ y_{i} = \beta_{0}^{*} + \beta_{1}x_{i} + \epsilon_{i} $ for $i = 1, \dots, n$.
a) Compute the LSE, MLE, mean and variance for $\beta_{1}$.
I understand how to derive these when both the slope and the intercept are unknown. However, I'm unsure what effect knowing the intercept has on the new computations.
Attempt: We minimise the sum of squares $S(\beta_{1})$ by setting its derivative to zero:
$\partial_{\beta_{1}} S(\beta_{1}) = - \sum_{i=1}^{n} 2x_{i}(y_{i}-\beta_{0}^{*}-\beta_{1}x_{i}) = 0 $
Solving for $\beta_{1}$ gives
$\hat{\beta}_{1} = \frac{\sum_{i=1}^{n} x_{i} (y_{i} - \beta_{0}^{*}) }{\sum_{i=1}^{n} x_{i}^2}$
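As a quick sanity check (a sketch on synthetic data, with assumed values for $\beta_{0}^{*}$, $\beta_{1}$ and the noise level), the closed-form estimator above can be compared against a numerical least-squares fit on the shifted response $y_i - \beta_{0}^{*}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data under the model y_i = beta0 + beta1 * x_i + eps_i,
# with an assumed known intercept beta0 = 2.0 and true slope beta1 = 3.0.
beta0, beta1 = 2.0, 3.0
n = 1000
x = rng.uniform(-1.0, 1.0, n)
y = beta0 + beta1 * x + rng.normal(0.0, 0.5, n)

# Closed-form LSE with known intercept:
# beta1_hat = sum(x_i * (y_i - beta0)) / sum(x_i^2)
beta1_hat = np.sum(x * (y - beta0)) / np.sum(x**2)

# Numerical least squares on the shifted response, with no intercept column.
beta1_lstsq = np.linalg.lstsq(x[:, None], y - beta0, rcond=None)[0][0]

print(beta1_hat, beta1_lstsq)  # the two agree to floating-point precision
```

The two values coincide, which confirms that once $\beta_{0}^{*}$ is subtracted from $y_i$, the problem reduces to regression through the origin.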
The computation of the least squares estimator doesn't seem to depend on whether $\beta_{0}^{*}$ is known or unknown, so would we get the same LSE, MLE, mean and variance as in the case where $\beta_{0}^{*}$ is unknown?
Hint
Do exactly what is done with the two parameters $\beta_0,\beta_1$ but with only $\beta_1$ as a parameter. For example, it means that the partial derivative $\frac{\partial}{\partial \beta_1}$ is in fact an ordinary derivative depending on $x_i, y_i$ and $\beta_0$, which is supposed to be known.
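Following the hint, the mean and variance can be sketched (assuming i.i.d. errors with $E[\epsilon_i]=0$, $\mathrm{Var}(\epsilon_i)=\sigma^2$, and the $x_i$ fixed): substituting the model into the estimator gives

$$ \hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i(\beta_1 x_i + \epsilon_i)}{\sum_{i=1}^{n} x_i^2} = \beta_1 + \frac{\sum_{i=1}^{n} x_i \epsilon_i}{\sum_{i=1}^{n} x_i^2}, $$

so $E[\hat{\beta}_1] = \beta_1$ and $\mathrm{Var}(\hat{\beta}_1) = \sigma^2 / \sum_{i=1}^{n} x_i^2$. Note the denominator is $\sum_i x_i^2$ here, whereas in the unknown-intercept case it is $\sum_i (x_i - \bar{x})^2$, so the answers are not identical to the two-parameter case.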