$\hat{\beta_1}$ is the OLS estimator of the slope $\beta_1$ in the model $Y_i=\beta_0+\beta_1 X_i+\epsilon_i$, with $\operatorname{Var}(\epsilon_i)=\sigma^2$,
so $\hat{\beta_1}=\frac{\sum(X_i-\bar{X})(Y_i-\bar{Y})}{\sum(X_i-\bar{X})^2}$.
$$\operatorname{Var}(\hat{\beta_1})=\operatorname{Var}\left(\frac{\sum(X_i-\bar{X})(Y_i-\bar{Y})}{\sum(X_i-\bar{X})^2}\right)=\left[\frac{\sum(X_i-\bar{X})}{\sum(X_i-\bar{X})^2}\right]^2\operatorname{Var}(Y_i-\bar{Y})=\frac{\sigma^2}{\sum(X_i-\bar{X})^2}$$
I don't understand why the third equality holds.
And isn't $\sum(X_i-\bar{X})=0$, since $\sum X_i=n \bar{X}$?
$\newcommand{\var}{\operatorname{var}} \newcommand{\cov}{\operatorname{cov}}$ Your second equality is incorrect. It treats the bound variable $i$, which should run from $1$ through $n$, as if it were a free variable that must take a single value.
Observe that \begin{align} & \var\left(\frac{\sum_i (X_i-\bar X) (Y_i-\bar Y)}{\sum_i (X_i-\bar X)^2}\right) = \var\left(\frac{\sum_j (X_j-\bar X) (Y_j-\bar Y)}{\sum_i (X_i-\bar X)^2}\right) \\[10pt] = {} & \var\left( \sum_j \frac{(X_j - \bar X)(Y_j-\bar Y)}{\sum_i (X_i-\bar X)^2} \right) \\[10pt] = {} & \frac{\sum_j (X_j-\bar X)^2 \var(Y_j-\bar Y) + 2\sum_{j,k\,:\,j<k} (X_j-\bar X)(X_k-\bar X)\cov(Y_j-\bar Y,Y_k-\bar Y)}{\left( \sum_i (X_i-\bar X)^2 \right)^2}. \end{align}
Working this out is not hard but is a bit cumbersome. You need $\var(Y_j-\bar Y)$ and $\cov(Y_j-\bar Y, Y_k-\bar Y)$, which in turn require $\cov(Y_j,\bar Y)$ and $\var(\bar Y)$, relying on the fact that each of $Y_j$ and $Y_k$ is correlated with $\bar Y$.
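Concretely (a sketch, under the usual assumption that the errors are i.i.d. with variance $\sigma^2$, so $\cov(Y_j,Y_k)=0$ for $j\ne k$):
\begin{align}
\cov(Y_j,\bar Y) &= \tfrac{\sigma^2}{n}, \qquad \var(\bar Y)=\tfrac{\sigma^2}{n}, \\[6pt]
\var(Y_j-\bar Y) &= \sigma^2+\tfrac{\sigma^2}{n}-\tfrac{2\sigma^2}{n}=\sigma^2\left(1-\tfrac1n\right), \\[6pt]
\cov(Y_j-\bar Y,\,Y_k-\bar Y) &= 0-\tfrac{\sigma^2}{n}-\tfrac{\sigma^2}{n}+\tfrac{\sigma^2}{n}=-\tfrac{\sigma^2}{n} \quad (j\ne k).
\end{align}
Writing $S=\sum_i(X_i-\bar X)^2$ and using $2\sum_{j<k}(X_j-\bar X)(X_k-\bar X)=\left(\sum_j(X_j-\bar X)\right)^2-S=-S$, the numerator becomes $\sigma^2\left(1-\tfrac1n\right)S+\tfrac{\sigma^2}{n}S=\sigma^2 S$, and dividing by $S^2$ gives $\sigma^2/S$, as expected.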
I prefer to use matrix methods.
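As a quick sanity check, here is a small NumPy sketch (the design points, parameter values, and replication count are arbitrary illustrative choices, not from the question) comparing the Monte Carlo variance of the slope, the scalar formula $\sigma^2/\sum(X_i-\bar X)^2$, and the matrix form $\sigma^2 (X^\top X)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the post): fixed design, known parameters
n, beta0, beta1, sigma = 50, 2.0, 3.0, 1.5
x = np.linspace(0.0, 10.0, n)
S = np.sum((x - x.mean()) ** 2)  # sum of (X_i - Xbar)^2

# Monte Carlo: re-draw the errors many times and re-fit the slope
reps = 20000
slopes = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=n)
    slopes[r] = np.sum((x - x.mean()) * (y - y.mean())) / S

empirical = slopes.var()
theoretical = sigma**2 / S  # the scalar formula sigma^2 / sum (X_i - Xbar)^2

# Matrix version: Var(beta_hat) = sigma^2 (X'X)^{-1}; the (1,1) entry
# is the slope variance and agrees exactly with the scalar formula.
X = np.column_stack([np.ones(n), x])
matrix_version = sigma**2 * np.linalg.inv(X.T @ X)[1, 1]

print(empirical, theoretical, matrix_version)
```

The matrix and scalar formulas agree to machine precision, and the Monte Carlo estimate agrees up to simulation noise.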