Problem. Let $X_{1},\ldots,X_{n}$ denote a random sample from the distribution with common pdf $$ f(x;\theta) = e^{-(x-\theta)} 1_{(\theta,+\infty)}(x), \;\; \theta \in \mathbb{R} $$ Let $Y_{n} = \min \{X_{1},\ldots,X_{n}\}$. Is $Y_{n}$ a consistent estimator of $\theta$?
Attempt. Note that if $ \displaystyle \lim_{n \to \infty} \mathbb{P}[|Y_n - \theta| < \epsilon]=1$ for every $\epsilon > 0$, then $Y_n$ is a consistent estimator of $\theta$. The common CDF of the $X_i$ is
\begin{align*} F(x) &= \int_{0}^{x} e^{-(t-\theta)} \, dt \\ &= e^{\theta-x}(e^x-1) && x \in (\theta, \infty) \end{align*} So $F_{Y_n}(x) = [e^{\theta-x}(e^x-1)]^n$. Moreover,
\begin{align*} \lim_{n \to \infty} \mathbb{P}[|Y_n - \theta| < \epsilon] &= \lim_{n \to \infty} \mathbb{P}[\theta - \epsilon < Y_n < \theta + \epsilon] \\ &= \lim_{n \to \infty} [F_{Y_n}(\theta + \epsilon) - F_{Y_n}(\theta - \epsilon)] \\ &= \lim_{n \to \infty} (e^{-\epsilon}(e^{\theta+\epsilon}-1))^{n} \\ &= \infty \end{align*} So $Y_n$ is not a consistent estimator for $\theta$. Is this correct?
As pointed out by user51547, we cannot have $P[|Y_n - \theta| < \epsilon] \rightarrow \infty$, so there must be a mistake somewhere; in fact, it occurs already at the stage of computing the c.d.f. of $X_1$: with the current expression, we have $\lim_{x\to \infty}F(x)=e^{\theta}$ instead of $1$ (the lower limit of integration should be $\theta$, not $0$).
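With the lower limit fixed, the computation goes through cleanly; as a sketch:
$$ F(x) = \int_{\theta}^{x} e^{-(t-\theta)} \, dt = 1 - e^{-(x-\theta)}, \qquad x \in (\theta, \infty), $$
and $F(x)=0$ for $x\leqslant\theta$, so indeed $F(x)\to 1$ as $x\to\infty$.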
We can proceed as follows: first, since $X_i\geqslant \theta$ almost surely, we derive that $\{\lvert Y_n-\theta\rvert>\varepsilon\}=\{ Y_n>\theta+\varepsilon\}$, hence $$ \mathbb P\{\lvert Y_n-\theta\rvert>\varepsilon\}=\mathbb P\left\{ \min_{1\leqslant i\leqslant n}X_i>\theta+\varepsilon\right\}=\mathbb P\left(\bigcap_{i=1}^n\{X_i>\theta+\varepsilon\}\right). $$ Then use independence and compute $\mathbb P(X_1>\theta+\varepsilon)$. (Note also that in your attempt, $[F(x)]^n$ is the CDF of the *maximum* of the sample; for the minimum, $F_{Y_n}(x)=1-[1-F(x)]^n$.)
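Completing the suggested computation (a short sketch):
$$ \mathbb P(X_1>\theta+\varepsilon)=\int_{\theta+\varepsilon}^{\infty} e^{-(t-\theta)} \, dt = e^{-\varepsilon}, $$
so by independence $\mathbb P\{\lvert Y_n-\theta\rvert>\varepsilon\}=\left(e^{-\varepsilon}\right)^{n}=e^{-n\varepsilon}\to 0$ as $n\to\infty$; hence $Y_n \xrightarrow{\mathbb P} \theta$, and $Y_n$ is a consistent estimator of $\theta$.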