Consider a linear model in matrix form, $Y = X\beta + \epsilon$, where $Y$ is a response vector, $X$ is an $n \times p$ ($p < n$) full-rank matrix of predictors, $\beta$ is a parameter vector, and $\epsilon$ is an error vector whose components are i.i.d. with variance $\sigma^2 > 0$.
(a) Derive the variance-covariance matrix of $\hat{\beta}$, the least-squares estimator of $\beta$.
(b) Let $X$ be given by the matrix
\begin{bmatrix} 1 & x_1\\ 1 & x_2\\ \vdots & \vdots\\ 1 & x_n \end{bmatrix}
Determine a necessary and sufficient condition for the two components of $\hat{\beta}$ to be negatively correlated.
Here are my thoughts so far:
Part (a) has been asked on this network quite a bit, so sparing the details, one can work out that the variance-covariance matrix of $\hat{\beta}$ is given by $(X'X)^{-1} \cdot \sigma^2$ (where the prime symbol here represents the transpose).
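For reference, a sketch of the standard argument (under the usual assumption that $E[\epsilon] = 0$, so that $\operatorname{Var}(Y) = \sigma^2 I_n$):

$$\hat{\beta} = (X'X)^{-1}X'Y, \qquad \operatorname{Var}(\hat{\beta}) = (X'X)^{-1}X'\,\operatorname{Var}(Y)\,X(X'X)^{-1} = \sigma^2 (X'X)^{-1},$$

using $\operatorname{Var}(AY) = A\operatorname{Var}(Y)A'$ for a fixed matrix $A$ and the symmetry of $(X'X)^{-1}$.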
Now, for part (b), let $\beta = (\beta_1, \beta_2)'$, and let $\hat{\beta} = (\hat{\beta}_1, \hat{\beta}_2)'$ be the LSE of $\beta$. Here

$X'X = \begin{bmatrix} n & \sum_{i = 1}^n x_i\\ \sum_{i = 1}^n x_i & \sum_{i = 1}^n x_i^2 \end{bmatrix}$,

so the covariance matrix is

$(X'X)^{-1} \sigma^2 = \frac{\sigma^2}{n\sum_{i = 1}^n x_i^2 - \left(\sum_{i = 1}^n x_i\right)^2}\begin{bmatrix} \sum_{i = 1}^n x_i^2 & -\sum_{i = 1}^n x_i\\ -\sum_{i = 1}^n x_i & n \end{bmatrix} = \begin{bmatrix} \operatorname{Var}(\hat{\beta}_1) & \operatorname{Cov}(\hat{\beta}_1, \hat{\beta}_2)\\ \operatorname{Cov}(\hat{\beta}_1, \hat{\beta}_2) & \operatorname{Var}(\hat{\beta}_2) \end{bmatrix}$

Now, the two components $\hat{\beta}_1$ and $\hat{\beta}_2$ of $\hat{\beta}$ are negatively correlated if and only if $\operatorname{Cov}(\hat{\beta}_1, \hat{\beta}_2)$ is negative. Since $\sigma^2 > 0$ and the determinant $n\sum_{i = 1}^n x_i^2 - \left(\sum_{i = 1}^n x_i\right)^2$ is strictly positive (by the Cauchy-Schwarz inequality, with equality ruled out because $X$ has full rank), the sign of the covariance is the sign of $-\sum_{i = 1}^n x_i$. Thus, a necessary and sufficient condition for the two components of $\hat{\beta}$ to be negatively correlated is $\sum_{i = 1}^n x_i > 0$, i.e. $\bar{x} > 0$.
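One way to sanity-check the sign of the covariance is to compute $\sigma^2 (X'X)^{-1}$ numerically for some sample data and compare its off-diagonal entry against the closed-form $-\sigma^2 \sum x_i / \big(n\sum x_i^2 - (\sum x_i)^2\big)$. This is just an illustrative sketch; the data and $\sigma^2$ below are arbitrary choices.

```python
import numpy as np

# Hypothetical example data: any x with positive sum works for this check.
rng = np.random.default_rng(0)
x = rng.uniform(1.0, 3.0, size=50)   # all x_i > 0, so sum(x) > 0
X = np.column_stack([np.ones_like(x), x])
sigma2 = 2.0                          # arbitrary error variance

# Variance-covariance matrix of the LSE: sigma^2 * (X'X)^{-1}
cov_beta = sigma2 * np.linalg.inv(X.T @ X)
cov_12 = cov_beta[0, 1]

# Closed-form off-diagonal entry from inverting the 2x2 matrix X'X directly
n = len(x)
closed = -sigma2 * x.sum() / (n * (x**2).sum() - x.sum() ** 2)

print(cov_12, closed)  # both negative here, since sum(x) > 0
```

With all $x_i$ positive, the off-diagonal entry comes out negative, matching the claim that the sign of the covariance is opposite to the sign of $\sum_i x_i$.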
Is this a correct solution? Or can I take this further and express the condition in a more useful form?
Thanks for your time!