The expected value of the mean sum of squares for simple linear regression


I need some help with this:
Consider the simple linear regression model $y = \beta_0 + \beta_1 x + e$, where $E(e) = 0$, $\operatorname{Var}(e) = \sigma^2$, and the errors are uncorrelated.
I need to show that $E(MS_R) = \sigma^2 + \beta_1^2 S_{xx}$. How do I do this? Here $MS_R$ is the mean sum of squares for the regression, $MS_R = \sum_{i=1}^n (\hat{y}_i - \bar{y})^2$, and $S_{xx} = \sum_{i=1}^n (x_i - \bar{x})^2$.
And also that $E(MS_{Res}) = \sigma^2$.
Please help me MathStack, you're my only hope.
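
A quick numerical sanity check of both identities (a minimal simulation sketch, not part of the proof; the design points, coefficients, error scale, and normal errors below are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check of E[MS_R] = sigma^2 + beta1^2 * S_xx and E[MS_Res] = sigma^2.
# All parameter values are arbitrary illustrative choices.
rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 1.5, 0.8
n, n_sims = 30, 50_000

x = np.linspace(0.0, 10.0, n)                  # fixed design points
x_bar = x.mean()
S_xx = np.sum((x - x_bar) ** 2)

e = rng.normal(0.0, sigma, size=(n_sims, n))   # uncorrelated errors, Var(e) = sigma^2
y = beta0 + beta1 * x + e                      # each row is one simulated sample

y_bar = y.mean(axis=1, keepdims=True)
b1 = ((x - x_bar) * (y - y_bar)).sum(axis=1) / S_xx   # least-squares slope
b0 = y_bar.ravel() - b1 * x_bar                       # least-squares intercept
y_hat = b0[:, None] + b1[:, None] * x

ms_r = ((y_hat - y_bar) ** 2).sum(axis=1)             # MS_R = SS_R (one regression df)
ms_res = ((y - y_hat) ** 2).sum(axis=1) / (n - 2)     # MS_Res

print("mean MS_R  :", ms_r.mean(),   "theory:", sigma**2 + beta1**2 * S_xx)
print("mean MS_Res:", ms_res.mean(), "theory:", sigma**2)
```

Both simulated means should land close to the claimed expectations.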

Best answer:

For the simple regression model, $MS_{reg} = SS_{reg}$ because the regression sum of squares has a single degree of freedom. Hence
\begin{align}
\mathbb{E}[MS_R \mid X] &= \mathbb{E}\Big[\sum_{i=1}^n (\hat{Y}_i - \bar{Y}_n)^2 \,\Big|\, X\Big]\\
&= \mathbb{E}\Big[\sum (\hat{\beta}_0 + \hat{\beta}_1 X_i - \bar{Y}_n)^2 \,\Big|\, X\Big]\\
&= \mathbb{E}\Big[\sum \hat{\beta}_1^2 (X_i - \bar{X}_n)^2 \,\Big|\, X\Big]\\
&= \sum (x_i - \bar{x}_n)^2\, \mathbb{E}[\hat{\beta}_1^2 \mid X]\\
&= \sum (x_i - \bar{x}_n)^2 \frac{\sigma_{\epsilon}^2}{\sum (x_i - \bar{x}_n)^2} + \beta_1^2 \sum (x_i - \bar{x}_n)^2\\
&= \sigma_{\epsilon}^2 + \beta_1^2 \sum (x_i - \bar{x}_n)^2,
\end{align}
where the third line uses $\hat{\beta}_0 = \bar{Y}_n - \hat{\beta}_1 \bar{X}_n$. For $\mathbb{E}[MS_{res} \mid X]$,
\begin{align}
\mathbb{E}[MS_{res} \mid X] &= \frac{1}{n-2}\mathbb{E}[SST - SS_{reg} \mid X]\\
&= \frac{1}{n-2}\mathbb{E}\Big[\sum (Y_i - \bar{Y}_n)^2 \,\Big|\, X\Big] - \frac{1}{n-2}\mathbb{E}[SS_{reg} \mid X].
\end{align}
Now plug the previous result into the second summand, and in the first one substitute $Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$ and $\bar{Y}_n = \beta_0 + \beta_1 \bar{X}_n + \bar{\epsilon}_n$, then just proceed with the algebra until you get the required result. Namely,
\begin{align}
\mathbb{E}[MS_{res} \mid X] &= \frac{1}{n-2}\mathbb{E}\Big[\sum \big(\beta_1(X_i - \bar{X}_n) + (\epsilon_i - \bar{\epsilon}_n)\big)^2 \,\Big|\, X\Big] - \frac{1}{n-2}\mathbb{E}[SS_{reg} \mid X]\\
&= \frac{\beta_1^2}{n-2}\sum (x_i - \bar{x}_n)^2 + \frac{1}{n-2}\mathbb{E}\Big[\sum (\epsilon_i - \bar{\epsilon}_n)^2 \,\Big|\, X\Big] - \frac{1}{n-2}\mathbb{E}[SS_{reg} \mid X]\\
&= \frac{\beta_1^2}{n-2}\sum (x_i - \bar{x}_n)^2 + \frac{(n-1)\sigma^2}{n-2} - \frac{\sigma^2 + \beta_1^2 \sum (x_i - \bar{x}_n)^2}{n-2}\\
&= \frac{(n-1)\sigma^2 - \sigma^2}{n-2}\\
&= \sigma^2,
\end{align}
where the cross term drops out because $\mathbb{E}[\epsilon_i - \bar{\epsilon}_n \mid X] = 0$, and $\mathbb{E}\big[\sum (\epsilon_i - \bar{\epsilon}_n)^2 \mid X\big] = (n-1)\sigma^2$.
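
For completeness, the second-moment fact used for $\mathbb{E}[\hat{\beta}_1^2 \mid X]$ in the first display is the standard variance decomposition combined with the unbiasedness and conditional variance of the least-squares slope:
\begin{align}
\mathbb{E}[\hat{\beta}_1^2 \mid X] &= \operatorname{Var}(\hat{\beta}_1 \mid X) + \big(\mathbb{E}[\hat{\beta}_1 \mid X]\big)^2 = \frac{\sigma_{\epsilon}^2}{\sum (x_i - \bar{x}_n)^2} + \beta_1^2.
\end{align}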