How do you calculate the correlation between the intercept's and beta's standard error in a univariate linear regression?


I am running a regression to predict a variable Y as follows:

$Y=\alpha+\beta\times x+\epsilon$

I am trying to obtain the distribution of the expected value of Y implied by the standard errors of the model estimates. To do this, I am planning to make random draws of alpha and beta, normally distributed with means equal to their estimates and standard deviations equal to their standard errors, and feed an x value into this model. The output will generate the distribution of Y. However, I don't know how to get the correlation between $\alpha$ and $\beta$, which I can feed into a Cholesky decomposition to create correlated random draws. How do I find this correlation? Also, does this method make sense?
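For concreteness, the correlated-draw step described above might look like the following minimal NumPy sketch. It assumes the 2×2 covariance matrix of $(\hat\alpha, \hat\beta)$ is already in hand (the numbers here are made up for illustration; the question of how to obtain that matrix is what is being asked):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed inputs (hypothetical values, for illustration only)
alpha_hat, beta_hat = 2.0, 0.5          # point estimates
cov = np.array([[0.04, -0.01],          # covariance matrix of (alpha, beta),
                [-0.01, 0.09]])         # assumed known here

L = np.linalg.cholesky(cov)             # cov = L @ L.T
z = rng.standard_normal((10_000, 2))    # independent N(0, 1) draws
draws = np.array([alpha_hat, beta_hat]) + z @ L.T  # correlated (alpha, beta) draws

x_new = 3.0
y_dist = draws[:, 0] + draws[:, 1] * x_new  # sampled distribution of E[Y | x_new]
```

Since `cov = L @ L.T`, the transformed draws `z @ L.T` have exactly the target covariance in expectation, which is the standard Cholesky trick for generating correlated normals.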

Thanks!

On BEST ANSWER

In OLS models the covariance matrix of the coefficient estimates is $\sigma^2 (X'X)^{-1}$. In your case, $$ X = \begin{pmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_n \end{pmatrix}, $$ hence $$ (X'X)^{-1} = \frac{1}{n\sum_{i=1}^n(x_i - \bar{x}_n)^2}\begin{pmatrix} \sum_{i=1}^n x_i^2 & -\sum_{i=1}^n x_i \\ -\sum_{i=1}^n x_i & n \end{pmatrix}, $$ and $\sigma^2$ is estimated by $\hat{\sigma}^2=\frac{1}{n-2}\sum_{i=1}^n(y_i - \hat{y}_i)^2$. So the estimated covariance of $\hat\alpha$ and $\hat\beta$ is the off-diagonal entry of $\hat{\sigma}^2(X'X)^{-1}$, the standard errors are the square roots of the diagonal entries, and the correlation you want is that covariance divided by the product of the two standard errors.
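The computation above can be sketched in a few lines of NumPy. The data here is simulated purely for illustration; any $(x, y)$ sample works the same way:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data for illustration
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1.5, n)

X = np.column_stack([np.ones(n), x])            # design matrix with rows (1, x_i)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)    # OLS estimates (alpha_hat, beta_hat)
resid = y - X @ coef
sigma2_hat = resid @ resid / (n - 2)            # unbiased estimate of sigma^2

cov_coef = sigma2_hat * np.linalg.inv(X.T @ X)  # estimated covariance of (alpha, beta)
se = np.sqrt(np.diag(cov_coef))                 # standard errors of alpha and beta
corr = cov_coef[0, 1] / (se[0] * se[1])         # their correlation
```

Note that $\hat\sigma^2$ cancels in the correlation: it reduces to $-\sum_i x_i / \sqrt{n \sum_i x_i^2}$, so with all-positive regressor values the intercept and slope estimates are necessarily negatively correlated.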

I'm not sure that your case is exactly the standard one, but I hope it gives some direction to follow.