I'm solving a problem that involves a linear model, and I'm trying to get the distribution of the least squares estimator $\hat{\beta}$.
I found in a book that
$\widehat{\beta}\sim N_{p}(\beta, (X^{\prime}X)^{-1}\sigma^{2})$, but how can I get the distributions of the individual $\hat\beta_{i}$?
In my case, the matrix $(X^{\prime}X)^{-1}$ I found was
$$ \frac 1 \alpha \begin{bmatrix} \sum_{i=1}^N x_i^2 & -\sum_{i=1}^N x_i \\ & \\ -\sum_{i=1}^N x_i & N\\ \end{bmatrix} $$
where $\alpha= N \sum_{i=1}^N x_i^2- \left[ \sum_{i=1}^N x_i \right]^2$, and I'm looking for the distributions of $\hat\beta_1$ and $\hat\beta_2$.
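As a numerical sanity check of the displayed inverse, one can compare the closed-form matrix against NumPy's direct inverse of $X'X$ (the $x_i$ values below are arbitrary illustrative data, not from the problem):

```python
import numpy as np

# Arbitrary illustrative data (an assumption for the demo)
x = np.array([1.0, 2.0, 4.0, 7.0])
N = len(x)

# Design matrix for the model Y_i = beta_1 + beta_2 * x_i
X = np.column_stack([np.ones(N), x])

# Closed-form inverse from the question:
# (1/alpha) * [[sum x_i^2, -sum x_i], [-sum x_i, N]]
alpha = N * np.sum(x**2) - np.sum(x)**2
inv_closed = (1 / alpha) * np.array([[np.sum(x**2), -np.sum(x)],
                                     [-np.sum(x),   N]])

# Direct numerical inverse of X'X
inv_direct = np.linalg.inv(X.T @ X)

print(np.allclose(inv_closed, inv_direct))  # True
```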
The result that you found
$$ \hat\beta\sim N_p(\beta, (X'X)^{-1}\sigma^2)\tag{*} $$
allows you to read off the solution to your problem. If your model is
$$Y_i = \beta_1 + \beta_2x_i+\varepsilon_i,\qquad i=1,\ldots,N, $$
then interpret the assertion (*) to mean that the joint distribution of $(\hat\beta_1,\hat\beta_2)$ is bivariate normal ($p=2$) with mean $(\beta_1,\beta_2)$ and covariance matrix $(X'X)^{-1}\sigma^2$.

So the marginal distribution of $\hat\beta_1$ is normal with mean $\beta_1$ and variance equal to the $(1,1)$ entry of the covariance matrix, which is
$$ \frac{\sigma^2}{N\sum x_i^2-(\sum x_i)^2}\sum x_i^2=\sigma^2\frac{\sum x_i^2}{N\sum x_i^2-(\sum x_i)^2}.\tag1 $$
Similarly, the distribution of $\hat\beta_2$ is normal with mean $\beta_2$ and variance equal to the $(2,2)$ entry of the covariance matrix, which is
$$ \frac{\sigma^2}{N\sum x_i^2-(\sum x_i)^2}N=\sigma^2\frac{N}{N\sum x_i^2-(\sum x_i)^2}.\tag2 $$
In (1) and (2), the denominator can be rewritten as $N\sum(x_i-\bar x)^2$.
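The marginal variances (1) and (2) can be verified by Monte Carlo simulation; the sketch below (with illustrative parameter values that are assumptions, not from the problem) repeatedly generates data from the intercept-plus-slope model and compares the empirical variances of $\hat\beta_1,\hat\beta_2$ against the formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (x, beta1, beta2, sigma are assumed for the demo)
x = np.array([1.0, 2.0, 4.0, 7.0])
N = len(x)
beta1, beta2, sigma = 2.0, 0.5, 1.0
X = np.column_stack([np.ones(N), x])

# Theoretical variances from (1) and (2)
denom = N * np.sum(x**2) - np.sum(x)**2
var_b1 = sigma**2 * np.sum(x**2) / denom
var_b2 = sigma**2 * N / denom

# Simulate many datasets and refit by least squares each time
reps = 200_000
eps = rng.normal(0.0, sigma, size=(reps, N))
Y = beta1 + beta2 * x + eps                        # shape (reps, N)
beta_hat = np.linalg.lstsq(X, Y.T, rcond=None)[0]  # shape (2, reps)

print(beta_hat[0].var(), var_b1)  # empirical vs. theoretical, close
print(beta_hat[1].var(), var_b2)
```

With 200,000 replications the empirical variances agree with (1) and (2) to within Monte Carlo error of a fraction of a percent.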
If your model is $$ Y_i = \theta x_i + \varepsilon_i,\qquad i=1,\ldots,N, $$ then the design matrix $X$ is a column $[x_1, x_2,\ldots,x_N]'$ so $X'X=\sum x_i^2$ and the result (*) says that the estimator $\hat\theta$ is normal with mean $\theta$ and variance $\displaystyle\frac{\sigma^2}{\sum x_i^2}.$
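The same simulation check works for the no-intercept model; here the estimator has the closed form $\hat\theta=\sum x_iY_i/\sum x_i^2$, and its empirical variance should match $\sigma^2/\sum x_i^2$ (all numeric values below are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup for the no-intercept model Y_i = theta * x_i + eps_i
x = np.array([1.0, 2.0, 4.0, 7.0])
theta, sigma = 0.5, 1.0

# Theoretical variance: sigma^2 / sum x_i^2
var_theory = sigma**2 / np.sum(x**2)

reps = 200_000
eps = rng.normal(0.0, sigma, size=(reps, len(x)))
Y = theta * x + eps

# Least-squares estimator for a single regressor: sum(x*Y) / sum(x^2)
theta_hat = Y @ x / np.sum(x**2)   # shape (reps,)

print(theta_hat.mean(), theta)       # close
print(theta_hat.var(), var_theory)   # close
```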