Is it correct to say that an estimator $\hat\theta_n$ is consistent for a parameter $\theta$ if:
$$\lim_{n\to\infty}E(\hat\theta_n)=\theta$$ $$\lim_{n\to\infty}V(\hat\theta_n)=0$$
where $n$ is the number of samples used to estimate $\theta$? Or are there situations where these two are not sufficient for consistency?
You can slightly rephrase the two conditions and say: if $\lim_{n\to \infty} \mathrm{MSE}(\hat{\theta}_n) = 0$, then $\hat{\theta}_n \xrightarrow{p}\theta$. Since $\mathrm{MSE} = \mathrm{Variance} + \mathrm{bias}^2$, your two conditions together are equivalent to $\mathrm{MSE}(\hat{\theta}_n) \to 0$, i.e. convergence in quadratic mean, which implies convergence in probability, for instance via Chebyshev's inequality: $$P(|\hat{\theta}_n - \theta| > \varepsilon) \le \frac{E[(\hat{\theta}_n - \theta)^2]}{\varepsilon^2} = \frac{\mathrm{MSE}(\hat{\theta}_n)}{\varepsilon^2} \to 0.$$ So the two conditions are always sufficient for consistency. Note they are not necessary: a consistent estimator need not have a finite mean or variance at all. Also note that $\mathrm{MSE} \to 0$ forces the bias to converge to $0$, i.e., $\hat{\theta}_n$ is asymptotically unbiased.
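As a quick illustration of the argument above, here is a minimal Monte Carlo sketch (the estimator, distribution, and sample sizes are my own choices, not from the question): the sample mean of $\mathrm{Uniform}(0,1)$ draws estimates $\theta = 0.5$, and its empirical MSE shrinks roughly like $1/n$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.5  # true mean of Uniform(0, 1)

def mse_of_sample_mean(n, reps=2000):
    # Monte Carlo estimate of MSE(theta_hat_n) = E[(theta_hat_n - theta)^2]:
    # draw `reps` independent samples of size n, compute the sample mean of
    # each, and average the squared errors.
    estimates = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)
    return np.mean((estimates - theta) ** 2)

mses = {n: mse_of_sample_mean(n) for n in (10, 100, 1000)}
print(mses)  # MSE decreases as n grows, consistent with theta_hat_n ->p theta
```

Here $\mathrm{Var}(\hat\theta_n) = 1/(12n)$ and the bias is $0$, so the printed values should track $1/(12n)$ up to Monte Carlo noise.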