Assuming I have the following linear regression set-up:
$y_i = \alpha + x_i \beta + \epsilon_i$
for $i = 1,2,..., n$.
When I run the regression, I get estimates $\hat{\alpha}$ and $\hat{\beta}$, along with their standard errors. Let $\sigma_{\alpha}$ and $\sigma_{\beta}$ denote the standard errors of $\hat{\alpha}$ and $\hat{\beta}$ respectively.
If I want to compute the standard error of the fitted value $\hat{\alpha} + x_i \hat{\beta}$ for each $i$, would that be:
$\sqrt{\sigma^2_{\alpha} + x^2_i \sigma^2_{\beta}}$ ???
Any help would be appreciated! Thanks!
Perhaps calculating this variance will help
\begin{align}
var(\hat{\alpha} + x_i \hat{\beta}) &= var(\hat{\alpha}) + x_i^2 \, var(\hat{\beta}) + 2 \, cov(\hat{\alpha}, x_i \hat{\beta}) \\
&= var(\hat{\alpha}) + x_i^2 \, var(\hat{\beta}) + 2 x_i \, cov(\hat{\alpha}, \hat{\beta}) \\
&= var(\hat{\alpha}) + x_i^2 \, var(\hat{\beta}) - 2 x_i \frac{\sigma^2 \bar{x}}{S_{xx}}
\end{align}
where $\sigma^2 = var(\epsilon_i)$. The last line uses the standard result for simple linear regression, $cov(\hat{\alpha}, \hat{\beta}) = -\sigma^2 \bar{x} / S_{xx}$. If $\sigma^2$ is unknown, an unbiased estimator for it is
$$\hat{\sigma}^2 = (S_{yy} - \hat{\beta} S_{xy})/(n - 2)$$
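To see the formula in action, here is a minimal NumPy sketch (on made-up data, just for illustration) that computes the three-term variance above and checks it against the equivalent textbook form $\hat{\sigma}^2\left(\frac{1}{n} + \frac{(x_i - \bar{x})^2}{S_{xx}}\right)$, which you get by expanding the square:

```python
import numpy as np

# Simulated data, purely illustrative
rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, n)

xbar, ybar = x.mean(), y.mean()
Sxx = np.sum((x - xbar) ** 2)
Sxy = np.sum((x - xbar) * (y - ybar))
Syy = np.sum((y - ybar) ** 2)

# OLS estimates for simple linear regression
beta_hat = Sxy / Sxx
alpha_hat = ybar - beta_hat * xbar
sigma2_hat = (Syy - beta_hat * Sxy) / (n - 2)  # unbiased estimate of var(eps)

# Variances and covariance of the coefficient estimates
var_alpha = sigma2_hat * (1.0 / n + xbar**2 / Sxx)
var_beta = sigma2_hat / Sxx
cov_ab = -sigma2_hat * xbar / Sxx  # negative whenever xbar > 0

# Variance of the fitted value, including the covariance term
var_fit = var_alpha + x**2 * var_beta + 2 * x * cov_ab
se_fit = np.sqrt(var_fit)

# Equivalent closed form: sigma2 * (1/n + (x - xbar)^2 / Sxx)
se_closed = np.sqrt(sigma2_hat * (1.0 / n + (x - xbar) ** 2 / Sxx))
print(np.allclose(se_fit, se_closed))  # True
```

Note how omitting the covariance term (the formula in the question) would overstate the standard error for $x_i$ near $\bar{x}$ when $\bar{x} > 0$, since the term is negative there.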
Sanity check: Note that the covariance term is negative (assuming $\bar{x} > 0$). This makes sense. In simple linear regression the point $(\bar{x}, \bar{y})$ always lies on the fitted line. Now imagine increasing the slope while holding $(\bar{x}, \bar{y})$ fixed: the $y$-intercept would have to decrease. Therefore, it makes sense that the estimate of the slope and the estimate of the intercept are negatively correlated.