Given a multiple linear regression model:
$Y=Xw$
I would like to prove that the least squares estimator of $w$ is $w$ itself.
I tried the following:
$$\sum_{i=1}^m (Y_i-wX_i)^2 $$
Now, writing $v$ for the estimator of $w$, we differentiate with respect to $v$ and set the derivative to zero (dropping the constant factor $-2$):
$$\sum_{i=1}^m (Y_i-vX_i)X_i =0$$
and I got:
$$ v= \frac{\sum_{i=1}^m Y_iX_i}{\sum_{i=1}^m X_i^2}$$
But then I got stuck. How can I continue to prove that $v=w$?
In addition, how can I show that the estimator for $\sigma^2$ equals zero?
I have to say this is a somewhat strange question, but anyway: if you know that your $n$ data points came from $Y_i = wX_i$, then all your points lie on the straight line described by $Y = wX$. Formally, the least squares problem is
$$ \min_{v} \sum_{i=1}^n(Y_i-vX_i)^2 . $$
Note that each term is nonnegative, and choosing $v=w$ makes every term vanish, since then $Y_i$ is replaced by $wX_i$:
$$ \min_{v} \sum_{i=1}^n(Y_i-vX_i)^2 = \sum_{i=1}^n(wX_i-wX_i)^2 = 0, $$
so the minimum is attained at $v=w$.

And the estimated variance, $\hat\sigma^2 = \frac{1}{n}\sum_{i=1}^n (\hat{Y}_i -Y_i)^2$, is zero because $\hat{Y}_i = wX_i = Y_i$ for every $i$.
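As a quick numerical sanity check (a sketch, not part of the proof — the slope $w=2.5$ and the sample points are made-up values), the estimator formula from the question recovers $w$ exactly on noiseless data, and the residual variance comes out zero:

```python
import numpy as np

# Noiseless data generated from Y_i = w * X_i with a known slope w
w = 2.5
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = w * X

# Least squares estimator from the question: v = sum(Y_i X_i) / sum(X_i^2)
v = np.sum(Y * X) / np.sum(X**2)

# Estimated variance: (1/n) * sum((Y_hat_i - Y_i)^2) with fitted values Y_hat_i = v X_i
Y_hat = v * X
sigma2_hat = np.mean((Y_hat - Y) ** 2)

print(v)           # recovers w = 2.5
print(sigma2_hat)  # residuals vanish, so this is 0.0
```

Substituting $Y_i = wX_i$ into the formula for $v$ shows why: the numerator becomes $w\sum X_i^2$, which cancels the denominator, leaving $v = w$.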