Proof in regression model


I'm studying for an exam and I don't have the solutions, so I hope some of you can help me. I have tried a lot, but I can't complete this proof.

Here is the task:

Suppose we have the linear regression model $Y_i=\beta_kX_{k,i}+u_i,$ $i=1,2,\dots,n,$

where $k$ is an index which we will use later (it will be the $k$-th covariate in a more general linear regression model). The least squares estimator of $\beta_k$, which we denote by $\hat{b}_k$, is the minimizer of

$L_k(b_k)= \sum_{i=1}^{n}(Y_i-b_kX_{k,i})^2$.

Show that the first order equation $\frac{d}{db_k}L_k(b_k)=0$ has the single solution $\hat{b}_k=\frac{\sum ^n_{i=1}X_{k,i}Y_i }{\sum ^n_{i=1}X^{2}_{k,i}}$.

You may conclude without justification that this solution is the minimizer of $L_k(b_k)$.
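For anyone attempting the proof, here is a sketch of how the first order condition works out (assuming, as usual, that $\sum_{i=1}^{n}X_{k,i}^2 \neq 0$). Differentiating $L_k$ term by term with the chain rule gives

$$\frac{d}{db_k}L_k(b_k) = \sum_{i=1}^{n}\frac{d}{db_k}\left(Y_i - b_kX_{k,i}\right)^2 = -2\sum_{i=1}^{n}X_{k,i}\left(Y_i - b_kX_{k,i}\right).$$

Setting this equal to zero and rearranging,

$$\sum_{i=1}^{n}X_{k,i}Y_i = b_k\sum_{i=1}^{n}X_{k,i}^2 \quad\Longrightarrow\quad \hat{b}_k = \frac{\sum_{i=1}^{n}X_{k,i}Y_i}{\sum_{i=1}^{n}X_{k,i}^2},$$

which is the unique solution precisely because $\sum_{i=1}^{n}X_{k,i}^2 \neq 0$.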
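As a sanity check on the closed-form solution, here is a small numerical sketch (the data and the true coefficient value are made up for illustration; they are not part of the exam question):

```python
import numpy as np

# Simulate data from the no-intercept model Y_i = beta_k * X_{k,i} + u_i
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)            # the k-th covariate X_{k,i}
beta_k = 2.5                      # assumed true coefficient (illustrative)
y = beta_k * x + rng.normal(size=n)

# Closed-form least squares estimate: sum(x*y) / sum(x**2)
b_hat = np.sum(x * y) / np.sum(x ** 2)

# Brute-force minimization of L_k(b) over a fine grid around b_hat
grid = np.linspace(b_hat - 1.0, b_hat + 1.0, 10001)
losses = np.array([np.sum((y - b * x) ** 2) for b in grid])
b_grid = grid[np.argmin(losses)]

print(b_hat, b_grid)  # the grid minimizer agrees with the formula
```

The grid search minimizes the same loss $L_k(b_k)$ directly, so the two estimates agreeing (up to grid resolution) confirms the formula is the minimizer.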