T-test and F-test in Multiple Linear Regression


In simple linear regression, $$ y = \beta_0 + \beta_1 X_1, $$ the T-test for $\hat{\beta}_1$ is $$ H_0: \beta_1 = \beta_1^0 \quad \text{and} \quad H_A: \beta_1 \neq \beta_1^0, $$ where $\beta_1^0 = 0$, and the F-test is $$ H_0: \beta_1 = 0 \quad \text{and} \quad H_A: \beta_1 \neq 0. $$

The T-statistic is $$ T = \frac{\hat{\beta}_1}{se(\hat{\beta}_1)} \sim t_{n-2}, $$ and the F-statistic is $$ F = \frac{SSreg}{RSS/(n-2)} \sim F_{1,\,n-2}, $$ where $RSS = \sum(y_i-\hat{y}_i)^2$ and $SSreg = \sum(\hat{y}_i - \bar{y})^2$. We know that $$ T^2 = F. $$

Problem: I am wondering whether this property still holds in multiple linear regression with predictors $x_1, x_2, \ldots, x_p$, where the hypotheses for a single coefficient are $$ H_0: \beta_j = 0 \quad \text{and} \quad H_A: \beta_j \neq 0. $$ I don't see how it can work here, since the degrees of freedom of the two tests don't seem to match. How can we show analytically that the property still holds in this case?
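As a sanity check on the simple-regression case, here is a minimal numerical sketch (with made-up simulated data) that computes $T$ and $F$ directly from the formulas above and confirms $T^2 = F$:

```python
# Sketch: verify T^2 = F for the slope in simple linear regression,
# using the RSS/SSreg formulas from the question. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 30
x = rng.normal(size=n)
y = 2.0 + 0.5 * x + rng.normal(size=n)

# OLS estimates for y = b0 + b1 * x
xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar
yhat = b0 + b1 * x

RSS = np.sum((y - yhat) ** 2)         # residual sum of squares
SSreg = np.sum((yhat - ybar) ** 2)    # regression sum of squares
sigma2 = RSS / (n - 2)                # error-variance estimate, df = n - 2

se_b1 = np.sqrt(sigma2 / np.sum((x - xbar) ** 2))
T = b1 / se_b1
F = SSreg / sigma2

print(T ** 2, F)  # the two values agree up to floating-point error
```

The same kind of check in the multiple-regression setting would compare $t_j^2$ for a single coefficient against the partial F-statistic for dropping that one predictor, which is where the degrees-of-freedom question above comes in.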