Consider a regression problem where the data values $y_1, \dots, y_n$ are observed values of response variables $Y_1, \dots, Y_n$. In the notes we assume that, for given values $x_1, \dots, x_n$ of the predictor variable, the $Y_i$ satisfy the simple linear regression model $Y_i = a + b x_i + e_i$, where the $e_i$ are i.i.d. $N(0, \sigma^2)$. The least squares estimates of the regression parameters are defined to be the values which minimise the sum of squares of the differences between the observed $y_i$ and the fitted values. For this model, $E(Y_i \mid x_i) = a + b x_i$, so the least squares estimates are the values minimising $$\sum_{i=1}^n \bigl(y_i - (a + b x_i)\bigr)^2.$$ Now consider an alternative model of the form $Y_i = \gamma x_i + e_i$, $i = 1, \dots, n$, where the $e_i$ satisfy the same assumptions as before but there is now a single unknown regression parameter $\gamma$. This model is sometimes used when it is clear from the problem description that $E(Y)$ must be zero when the corresponding $x$ value is zero. Derive an expression, in terms of the $x_i$ and $y_i$ values, for the least squares estimate of $\gamma$ in this new model, and suggest, with reasons, an appropriate estimate for $\sigma^2$.
I don't really have a clue where to start. I know how to work out the least squares estimates of $a$ and $b$ using summary statistics, but I'm not at all sure how to apply this to finding $\gamma$ or $\sigma^2$. I'm also not sure how the normally distributed $e_i$ fit in.
In the simple regression model, the least squares estimates for $a$ and $b$ were chosen to minimize $\sum_{i=1}^n (y_i - (a+ b x_i))^2$. (This is where the name least squares comes from.) To get the actual expressions you may have had to do some calculus.
Can you think about what changed in the second model, and how you would modify the above procedure to estimate $\gamma$?
Similarly, modify the procedure for estimating $\sigma^2$ in simple regression, in order to find the appropriate estimate in the second model.
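Once you have derived a formula for $\hat\gamma$, here is a small numerical sketch (not part of the original question; the data values are made up) that you can use to check it. It evaluates the sum of squares $S(\gamma) = \sum_{i=1}^n (y_i - \gamma x_i)^2$ on a fine grid of candidate $\gamma$ values and picks the minimiser, so you can compare the result against whatever closed-form expression your calculus gives:

```python
import numpy as np

# Made-up toy data where y is roughly proportional to x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.3, 2.8, 4.2, 4.9])

# Sum of squares S(gamma) = sum_i (y_i - gamma * x_i)^2,
# evaluated for every gamma on a fine grid via broadcasting.
gammas = np.linspace(0.0, 2.0, 20001)
S = ((y[:, None] - gammas[None, :] * x[:, None]) ** 2).sum(axis=0)

# The grid point minimising S is a numerical stand-in for the
# least squares estimate of gamma.
gamma_hat = gammas[np.argmin(S)]
print(f"numerical least squares estimate: {gamma_hat:.4f}")
```

If your derived formula and the grid minimiser disagree by more than the grid spacing, recheck the differentiation step.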