We assume that our observations come from a uniform $(0,\theta)$ distribution. Can you please check my work on the following?
We can derive the distribution function of the sample maximum, $Y_n$, as follows:
$$ F_{Y_n} (t)= \begin{cases} 1 \quad t> \theta \\ \left( \frac{t}{\theta} \right)^n \quad 0<t \leq \theta \\ 0 \quad t\leq 0 \end{cases} $$
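(To make the middle case explicit: for $0 < t \leq \theta$, the independence of the observations $X_1, \dots, X_n$ gives

$$ F_{Y_n}(t) = P\left[ Y_n \leq t \right] = P\left[ X_1 \leq t, \dots, X_n \leq t \right] = \prod_{i=1}^{n} P\left[ X_i \leq t \right] = \left( \frac{t}{\theta} \right)^n . $$

)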
Now, the definition of consistency states that for our estimator $Y_n$ to be consistent we require $\lim_{n\to \infty} P\left[|Y_n - \theta| <\epsilon \right]=1$ for all $\epsilon>0$.
Using the above CDF, $P\left[ \theta< Y_n<\epsilon+\theta \right]=1-\left(\frac{\epsilon}{\theta} \right)^n$. The second term goes to zero in the limit, so we get $1$.
Is everything alright in the above? Do I need to do anything else? Thanks.
I think you need to revise your probability from $P\left[ \theta< Y_n<\epsilon+\theta \right]$ to
$P\left[ \theta-\epsilon< Y_n<\theta \right]$, since $\theta$ is the maximum possible value of any $Y$, which means the probability as you wrote it always equals $0$.
Now, since you are using the maximum, the above probability equals the complement of the probability that, out of $n$ observations, not a single one falls in the interval $(\theta-\epsilon, \theta)$. The probability that a single observation falls in that interval is $\frac{\epsilon}{\theta}$, so the probability that it does not is $1-\frac{\epsilon}{\theta}$. Therefore, the probability that the maximum of $n$ observations is at most $\theta - \epsilon$ is $\left(1-\frac{\epsilon}{\theta}\right)^n$: if no observation falls in the interval, the maximum lies below its lower endpoint.
The above quantity does indeed go to zero as $n\rightarrow \infty$ for every $\epsilon >0$, so its complement goes to $1$, as required.
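If you want a quick sanity check, here is a short simulation sketch (the values $\theta = 1$ and $\epsilon = 0.05$ are illustrative choices of mine, not from your problem) comparing the empirical value of $P\left[|Y_n - \theta| < \epsilon\right]$ against the exact expression $1 - \left(1-\frac{\epsilon}{\theta}\right)^n$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, eps = 1.0, 0.05  # illustrative values, chosen for this sketch

for n in (10, 100, 1000):
    # Draw many samples of size n from Uniform(0, theta) and take each sample's maximum.
    samples = rng.uniform(0, theta, size=(20_000, n))
    y_n = samples.max(axis=1)
    # Empirical estimate of P[|Y_n - theta| < eps].
    p_hat = np.mean(np.abs(y_n - theta) < eps)
    # Exact probability: complement of P[Y_n <= theta - eps] = (1 - eps/theta)^n.
    p_exact = 1 - (1 - eps / theta) ** n
    print(f"n={n:5d}  simulated={p_hat:.4f}  exact={p_exact:.4f}")
```

Both columns should climb toward $1$ as $n$ grows, which is exactly the consistency statement.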
You were pretty close; I think there were just a couple of notational errors.