Let $X \sim U(0,\theta)$. Using the method of moments to find an estimator of the parameter $\theta$, I obtained $\theta=2 \overline X $, which is unbiased.
I want to prove this estimator is consistent. I know that I shall verify:
$$\lim _{n\to \infty} E(\hat\theta_n) = \theta $$
$$\lim _{n\to \infty} V(\hat\theta_n) = 0$$
How can I prove this? Can it be done properly without using Chebyshev's inequality? I know that $V(\hat \theta)=0$ because it's unbiased, so I assume the limit involving the variance is zero. But what about the limit of the mean of the $n$ estimators?
Thanks.
You've got the situation reversed. The fact that $\hat \theta_n$ is unbiased means that $\operatorname{E}[\hat \theta_n] = \theta$, but it does not say anything about the variance of the estimator.
The simplest thing to do is actually calculate the variance of $\bar X$, using the fact that the variance of the sum of independent random variables equals the sum of the variances of each random variable. That is to say, $$\operatorname{Var}[X_1 + \cdots + X_n] \overset{\text{ind}}{=} \operatorname{Var}[X_1] + \cdots + \operatorname{Var}[X_n].$$ Since each observation is also identically distributed, the RHS is simply $n \operatorname{Var}[X_1]$, or $n$ times the variance of a single observation. Compute this variance, show it is finite, and then show $$\operatorname{Var}[\hat \theta_n] = \frac{4}{n} \operatorname{Var}[X_1]$$ and then the limit as $n \to \infty$ is zero.
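Carrying this out explicitly (a sketch of the computation the answer describes, using the standard variance of a uniform distribution): for $X_1 \sim U(0,\theta)$ we have $\operatorname{Var}[X_1] = \theta^2/12$, so

$$\operatorname{Var}[\hat \theta_n] = \operatorname{Var}[2 \bar X] = \frac{4}{n^2} \sum_{i=1}^n \operatorname{Var}[X_i] = \frac{4}{n} \cdot \frac{\theta^2}{12} = \frac{\theta^2}{3n} \xrightarrow[n \to \infty]{} 0.$$

Together with unbiasedness, this gives mean-square convergence of $\hat \theta_n$ to $\theta$, which implies consistency.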
You also have an error in your notation: you should write $$\hat \theta_n = 2 \bar X,$$ not $\theta = 2\bar X$, which makes no sense since $\theta$ is a parameter and $\hat \theta_n$ is an estimator of that parameter.