The linear model that I'm working with is: $$y_t =α +βx_t + ε_t$$
Based on my lecture notes, I have:
$$Var(\hatβ) = Var(Σw_tε_t)$$ where $ε_t$ is the error term and $$w_t = \frac{x_t-\overline x}{Σ(x_t-\overline x)^2}$$
That being said, we then have:
$$\begin{align}Var(\hatβ)& = ΣVar(w_tε_t)+ΣΣCov(w_s,ε_t)\\&=E[w_tε_t - E(w_tε_t)]^2\\ &= E[w_tε_t]^2\\&= E[w_t^2ε_t^2]\\&= Σw_t^2Var(ε_t)\\&=σ^2Σw_t^2 \end{align}$$
What bothers me the most here is how come: $$Var(Σw_tε_t) = ΣVar(w_tε_t)+ΣΣCov(w_s,ε_t)$$
and how do we get from this: $$E[w_t^2ε_t^2]$$ to this: $$Σw_t^2Var(ε_t)$$
Many thanks in advance!
If you're not assuming that the $x_t$ values are random, then the weights $w_t$ are non-random constants; if you further assume no serial correlation and homoskedasticity (that is, $$\text{var}(e_t)=\sigma^2$$ for every $t$ and $$\text{cov}(e_t,e_s)=0,\quad t\neq s,$$) then $$\text{var}(\hat \beta)=\text{var}\left(\sum w_t e_t\right)=\sum w_t^2 \text{var}(e_t)=\sigma^2\sum w_t^2,$$ where the $w_t^2$ factor out of the variances precisely because the $w_t$ are constants — this is also how you get from $E[w_t^2e_t^2]$ to $w_t^2\text{var}(e_t)$. You can check that this amounts to $$\text{var}(\hat\beta)=\frac{\sigma^2}{\sum(x_t-\bar x)^2}.$$
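As a quick numerical sanity check, you can simulate the model many times with fixed $x_t$ and i.i.d. errors and compare the empirical variance of $\hat\beta$ with $\sigma^2/\sum(x_t-\bar x)^2$. The parameter values below ($n$, $\sigma$, $\alpha$, $\beta$) are made up purely for illustration:

```python
# Monte Carlo check of var(beta_hat) = sigma^2 / sum((x_t - x_bar)^2),
# assuming fixed (non-random) x_t and iid homoskedastic errors.
import numpy as np

rng = np.random.default_rng(0)
n, sigma, alpha, beta = 50, 2.0, 1.0, 3.0   # illustrative values
x = np.linspace(0, 10, n)                   # fixed regressors
Sxx = np.sum((x - x.mean()) ** 2)

reps = 20000
betas = np.empty(reps)
for r in range(reps):
    # fresh error draw each replication; x stays fixed
    y = alpha + beta * x + rng.normal(0.0, sigma, n)
    # OLS slope: sum_t w_t * y_t with w_t = (x_t - x_bar) / Sxx
    betas[r] = np.sum((x - x.mean()) * y) / Sxx

print(np.var(betas))        # empirical variance of beta_hat
print(sigma ** 2 / Sxx)     # theoretical sigma^2 / Sxx (should be close)
```

With enough replications the two printed numbers agree to within Monte Carlo error.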
The formula with the covariance as you give it is just wrong. It is, however, true that $$\text{var}\left(\sum w_t e_t\right)=\sum_t \text{var}(w_t e_t)+\sum_{t\neq s} \text{cov}(w_te_t,w_s e_s)=\sum_t w_t^2\text{var}(e_t)+\sum_{t\neq s} w_t w_s\text{cov}(e_t,e_s),$$ but our assumptions imply that the second term is zero anyway.
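You can see this decomposition numerically: for any weights $w$ and error covariance matrix $\Sigma$, $\text{var}(\sum w_t e_t) = w^\top\Sigma w$ splits into a diagonal (variance) part and an off-diagonal (covariance) part, and the latter vanishes when $\Sigma=\sigma^2 I$. The weights and covariance entries below are made-up illustrative values:

```python
# Illustration: var(sum w_t e_t) = w' Cov(e) w splits into a diagonal
# part (sum w_t^2 var(e_t)) and a cross part (sum_{t!=s} w_t w_s cov(e_t,e_s)).
import numpy as np

w = np.array([0.5, -0.2, 0.7])               # illustrative weights
Sigma = np.array([[1.0, 0.3, 0.0],
                  [0.3, 1.0, 0.3],
                  [0.0, 0.3, 1.0]])          # serially correlated errors

total = w @ Sigma @ w                        # var(sum w_t e_t)
diag_part = np.sum(w ** 2 * np.diag(Sigma))  # sum_t w_t^2 var(e_t)
cross_part = total - diag_part               # the covariance term

print(total, diag_part, cross_part)          # cross part is nonzero here

# Under homoskedasticity and no serial correlation, Sigma = sigma^2 * I,
# so the cross part is exactly zero:
sigma2 = 1.0
total_iid = w @ (sigma2 * np.eye(3)) @ w
print(np.isclose(total_iid, sigma2 * np.sum(w ** 2)))  # True
```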