Suppose I have the standard OLS estimators for the model $y=a+bx+\epsilon$, where both $y$ and $x$ are random variables. I was informed that the unconditional and conditional variances of $\hat{b}$ are not the same. I have that $V(\hat{b}|x) = \frac{\sigma^2}{S_{xx}}$, and I get the same thing for the unconditional variance by using the law of total variance: $V(\hat{b}) = E(V(\hat{b}|x)) + V(E(\hat{b}|x))$. All of the standard OLS assumptions hold. Where am I going wrong here?
EDIT: Because the estimator is unbiased conditional on $x$, we have $E(\hat{b}|x)=b$, and since $b$ is just a constant, $V(E(\hat{b}|x))=V(b)=0$. This leaves only the term $E(V(\hat{b}|x)) = E\left(\frac{\sigma^2}{S_{xx}}\right)$. And because we conditioned on $x$, we treat $x$ as "fixed", so this is essentially the expectation of a constant, and we get that same constant back. I think maybe I don't understand how that expectation is being taken? Can someone at least confirm that the conditional variance is correct? (It is essentially straight from the textbook I'm using.)
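To make my confusion concrete, here is a minimal simulation sketch of the setup (the specific values $a=1$, $b=2$, $\sigma=1$, normal $x$ and errors, and $n=20$ are just choices I made for illustration). It draws a fresh random $x$ each replication, so $S_{xx}$, and hence $\frac{\sigma^2}{S_{xx}}$, varies from sample to sample:

```python
import numpy as np

# Illustrative simulation: x is random, so S_xx changes with each sample,
# and sigma^2 / S_xx is a random variable rather than a fixed constant.
rng = np.random.default_rng(0)
a, b, sigma = 1.0, 2.0, 1.0   # made-up parameter values for illustration
n, reps = 20, 20000

b_hats = np.empty(reps)       # OLS slope estimate in each replication
cond_vars = np.empty(reps)    # sigma^2 / S_xx for each draw of x
for i in range(reps):
    x = rng.normal(size=n)                            # x is random here
    y = a + b * x + rng.normal(scale=sigma, size=n)   # model y = a + b x + eps
    s_xx = np.sum((x - x.mean()) ** 2)
    b_hats[i] = np.sum((x - x.mean()) * (y - y.mean())) / s_xx
    cond_vars[i] = sigma**2 / s_xx

print(np.var(b_hats))      # empirical unconditional variance of b-hat
print(np.mean(cond_vars))  # empirical E(sigma^2 / S_xx)
print(np.std(cond_vars))   # spread of sigma^2 / S_xx across samples
```

When I run something like this, `sigma**2 / s_xx` clearly is not the same number across replications, which is part of why I am unsure how the outer expectation is supposed to be taken.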