The formula for the sampling variance of the OLS slope estimators


For population regression function:

$$y = \beta_0 + \beta_1 x_1 + ... + \beta_k x_k + u,$$

where $u$ is the error term. Below, $i$ indexes the observation and $j$ indexes the regressor, so $x_{ij}$ is the $i$-th observation on the $j$-th regressor.

From Jeffrey Wooldridge's text (4th ed.), under the Gauss-Markov assumptions, conditional on the sample values of the independent variables,

$$Var(\hat{\beta}_j) = \frac{\sigma^2}{SST_j(1-R^2_j)},$$

for $j = 1, 2, ..., k,$ where $SST_j = \sum^{n}_{i=1}(x_{ij}-\bar{x}_j)^2$ is the total sample variation in $x_j,$ and $R^2_j$ is the R-squared from regressing $x_j$ on all the other independent variables (including an intercept).

Given this variance formula (more precisely, these are the diagonal entries of the variance-covariance matrix) and the definition of $R^2,$

$$R^2_j = \frac{SSE_j}{SST_j} \Big(=\frac{\sum^n_{i=1}(\hat{x}_{ij} - \bar{x}_j)^2}{\sum^n_{i=1}(x_{ij} - \bar{x}_j)^2} \Big) = 1 - \frac{SSR_j}{SST_j} \Big(= 1- \frac{\sum^n_{i=1}(x_{ij} - \hat{x}_{ij})^2}{\sum^n_{i=1}(x_{ij} - \bar{x}_j)^2}\Big),$$

is it right that

$$Var(\hat{\beta}_j) = \frac{\sigma^2}{SST_j(1-R^2_j)} = \frac{\sigma^2}{SST_j\Big(1-\frac{SSE_j}{SST_j}\Big)} = \frac{\sigma^2}{SST_j - SSE_j} = \frac{\sigma^2}{SSR_j} ?$$
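The identity seems to check out numerically. Here is a minimal sketch in NumPy (the simulated data, the assumed error variance `sigma2`, and all variable names are illustrative): it compares the diagonal of $\sigma^2(X'X)^{-1}$ against $\sigma^2/SSR_j$, where $SSR_j$ comes from regressing $x_j$ on the other regressors with an intercept.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3
X = rng.normal(size=(n, k))
X[:, 1] += 0.5 * X[:, 0]            # correlate regressors so R^2_j > 0
Xmat = np.column_stack([np.ones(n), X])  # design matrix with intercept

sigma2 = 2.0  # treat the error variance as known for the check

# Textbook formula: Var(beta_hat) = sigma^2 * (X'X)^{-1}
V = sigma2 * np.linalg.inv(Xmat.T @ Xmat)

# Alternative: sigma^2 / SSR_j, with SSR_j the residual sum of squares
# from regressing x_j on the other regressors (intercept included)
for j in range(k):
    others = np.delete(Xmat, j + 1, axis=1)  # drop the column of x_j
    xj = Xmat[:, j + 1]
    coef, *_ = np.linalg.lstsq(others, xj, rcond=None)
    SSR_j = np.sum((xj - others @ coef) ** 2)
    assert np.isclose(V[j + 1, j + 1], sigma2 / SSR_j)
```

The loop runs without tripping an assertion, which matches the algebra above: $SST_j(1-R^2_j) = SST_j - SSE_j = SSR_j$.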

Thanks!