Show that the $F$ statistic for dropping a single coefficient from a linear regression model is equal to the square of the corresponding $z$-score.
The $z$-score is given by $z_i = \frac{\hat{b}_i}{\hat{\sigma}\sqrt{v_i}}$, while the $F$ statistic is given by $F = \frac{(RSS_0-RSS_1)/(p_1-p_0)}{RSS_1/(N-p_1-1)}$, where $RSS_0$ is the residual sum of squares of the smaller model and $RSS_1$ that of the bigger one.
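Before attempting a proof, here is a numerical sanity check of the claimed identity (my own illustration, not part of the derivation): fit a linear model by least squares with NumPy and verify that the $F$ statistic for dropping one coefficient equals the square of that coefficient's $z$-score.

```python
import numpy as np

# Simulated data: intercept plus p predictors, arbitrary true coefficients.
rng = np.random.default_rng(0)
N, p = 100, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, p))])
y = X @ np.array([1.0, 2.0, 0.0, -1.5]) + rng.normal(size=N)

def fit_rss(X, y):
    """Return the OLS residual sum of squares for design matrix X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ b) ** 2)

b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
rss1 = fit_rss(X, y)                        # RSS_1, bigger model
sigma2_hat = rss1 / (N - p - 1)             # \hat{\sigma}^2
v = np.diag(np.linalg.inv(X.T @ X))         # v_i = i-th diagonal of (X^T X)^{-1}

i = 2                                       # drop the coefficient of column i
z = b_hat[i] / np.sqrt(sigma2_hat * v[i])   # z-score of \hat{b}_i
rss0 = fit_rss(np.delete(X, i, axis=1), y)  # RSS_0, smaller model
F = (rss0 - rss1) / (rss1 / (N - p - 1))    # p_1 - p_0 = 1
print(np.isclose(F, z ** 2))                # True: F equals the squared z-score
```

The agreement is exact (up to floating point), which is what the exercise asks us to prove.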
Now suppose we drop a single coefficient $b_i$. Then:
$RSS_0 = \sum_{j=1,\, j\neq i}^{N}(y_j-\hat{y}_j)^2$ with $p_0=p_1+1$ parameters
$RSS_{1} = \sum_{j=1}^{N}(y_j-\hat{y}_j)^2 = \sum_{j=1,\, j\neq i}^{N}(y_j-\hat{y}_j)^2 + (y_i-\hat{y}_i)^2$
Since $p_1 = p$ (the number of parameters), the denominator is $\hat{\sigma}^2 = \frac{1}{N-p-1}\sum_{j=1}^{N}(y_j-\hat{y}_j)^2$,
while the numerator gives $(RSS_0-RSS_1)/(p_1-p_0) = -(RSS_{0}-RSS_{1}) = (y_i-\hat{y}_i)^2$.
Now, under the null hypothesis, $b_i = 0$, so
$y_i = x_i^Tb = x_{i1}b_1 + x_{i2}b_2+...+x_{ii}\cdot 0+...+x_{ip}b_{p}$
$\hat{y}_i = x_i^T\hat{b} = x_{i1}\hat{b}_1 + x_{i2}\hat{b}_2+...+x_{ii}\hat{b}_i+...+x_{ip}\hat{b}_{p}$
and
$y_i-\hat{y}_i = x_{i1}(b_1-\hat{b}_1)+...-x_{ii}\hat{b}_i+...+x_{ip}(b_p-\hat{b}_p)$
Since for linear regression we have, by assumption, $\hat{b} \sim N(b, \sigma^2(X^TX)^{-1})$,
then each $\hat{b}_i \sim N(b_i, \sigma^2v_i)$, where $v_i$ is the $i$-th diagonal element of $(X^TX)^{-1}$.
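This sampling distribution can itself be checked by simulation; a minimal Monte Carlo sketch (my own check, with made-up data): holding the design matrix $X$ fixed, repeatedly redraw $y = Xb + \varepsilon$ and compare the empirical variance of $\hat{b}_i$ with $\sigma^2 v_i$.

```python
import numpy as np

# Fixed design: intercept plus two predictors.
rng = np.random.default_rng(1)
N, sigma = 50, 1.0
X = np.column_stack([np.ones(N), rng.normal(size=(N, 2))])
b_true = np.array([0.5, -1.0, 2.0])
XtX_inv = np.linalg.inv(X.T @ X)

i = 1
draws = []
for _ in range(5000):
    y = X @ b_true + sigma * rng.normal(size=N)
    b_hat = XtX_inv @ X.T @ y            # OLS: (X^T X)^{-1} X^T y
    draws.append(b_hat[i])

emp_var = np.var(draws)
theory_var = sigma ** 2 * XtX_inv[i, i]  # sigma^2 v_i
print(emp_var, theory_var)               # the two agree closely
```

With 5000 replications the empirical variance lands within a few percent of $\sigma^2 v_i$.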
As a result, $y_i-\hat{y}_i$ is a linear combination of normally distributed random variables; hence $y_i-\hat{y}_i \sim N(0,\, \sigma^2 x_i^T(X^TX)^{-1}x_i)$.
The square of a normal random variable, divided by its variance, has a chi-squared distribution:
$\frac{(y_i-\hat{y}_i)^2}{\sigma^2 x_i^T(X^TX)^{-1}x_i} \sim \chi^2_1, \qquad (y_i-\hat{y}_i)^2 = \left(\sigma^2 x_i^T(X^TX)^{-1}x_i\right)\chi^2_1$
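The chi-squared step itself is easy to verify empirically; a quick sketch (my own, not from the text): squaring standard normal draws reproduces the $\chi^2_1$ distribution, whose mean is $1$ and variance is $2$.

```python
import numpy as np

# Square many standard normal draws and check the chi-squared(1) moments.
rng = np.random.default_rng(2)
z = rng.normal(size=200_000)
sq = z ** 2
print(sq.mean(), sq.var())   # approximately 1 and 2
```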
Is there a way to proceed from here?