Let $X_1, X_2, \ldots, X_n$ be i.i.d. $U(0, \theta)$ random variables.
I am attempting to prove that $\theta_1=\frac{n+1}{n} Y_n$ is a consistent estimator for $\theta$, where $Y_n=\max(X_1, X_2,\ldots,X_n)$, using the definition of consistency directly.
I am familiar with other techniques, but I'm not sure how to proceed using the definition directly.
I am aware that, for each fixed $\delta > 0$, I have to show that for every $\epsilon > 0$ there is an $N(\epsilon)$ such that $P(|\theta_1 - \theta| > \delta)=P\left(\left|\frac{n+1}{n}Y_n - \theta\right| > \delta\right)< \epsilon$ for all $n>N(\epsilon)$.
I'm unsure how to proceed; any help would be greatly appreciated!
\begin{align} \mathbb{P}(|\theta_1(n) - \theta| > \delta) &= \mathbb{P}\left( \theta - \frac{n+1}{n} Y_n > \delta\right) + \mathbb{P}\left(\frac{n+1}{n} Y_n - \theta > \delta\right) \\ &=\mathbb{P}\left(Y_n < \frac{ n }{ n+1 } ( \theta - \delta)\right) + \left(1 - \mathbb{P}\left(\frac{n+1}{n} Y_n \le \theta + \delta\right)\right)\\ &= F_{Y_n}\left(\frac{n}{ n + 1 } (\theta-\delta)\right) + \left(1 - F_{Y_n}\left(\frac{n}{ n + 1 } (\theta+\delta)\right)\right) . \end{align} Now, for $0 \le y \le \theta$, $$ F_{Y_n}(y)=(F_X(y))^n = ( y/\theta ) ^ n, $$ hence \begin{align} \lim_{n \to \infty} \mathbb{P}(|\theta_1(n) - \theta| > \delta) &= \lim_{n \to \infty} \left( \frac{n}{n+1}\left( 1 - \frac{ \delta }{ \theta } \right) \right) ^ n + \left(1 - \lim_{n \to \infty} F_{Y_n}\left( \frac{n}{n+1}( \theta + \delta ) \right)\right) \\ &= 0 + (1-1) = 0. \end{align} For the first term, $n/(n+1) \to 1$ while $1 - \delta/\theta < 1$ (the term is simply $0$ when $\delta \ge \theta$), so the base is at most the constant $1 - \delta/\theta < 1$ and the power tends to infinity, which forces the term to $0$. For the second term, $\theta + \delta > \theta$ while the support of $Y_n$ is $[0, \theta]$; since $\frac{n}{n+1}(\theta+\delta) > \theta$ exactly when $n > \theta/\delta$, the CDF equals $1$ for all such $n$. In fact, for $n > \theta/\delta$ the whole probability is at most $(1-\delta/\theta)^n$, which gives an explicit $N(\epsilon)$ as required by the definition.
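As a sanity check, the shrinking probability can be seen numerically with a quick Monte Carlo sketch (the values of $\theta$, $\delta$, the sample sizes, and the trial count below are arbitrary choices for illustration, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, delta = 2.0, 0.1   # hypothetical values chosen for illustration
trials = 20_000

probs = []
for n in (10, 100, 1000):
    # `trials` independent samples of size n from U(0, theta)
    x = rng.uniform(0, theta, size=(trials, n))
    theta_hat = (n + 1) / n * x.max(axis=1)   # theta_1 = (n+1)/n * Y_n
    # empirical estimate of P(|theta_1 - theta| > delta)
    probs.append(np.mean(np.abs(theta_hat - theta) > delta))

print(probs)  # decreases toward 0 as n grows
```

The empirical probabilities drop rapidly with $n$, matching the limit computed above.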