This example is from the book Hogg, *Introduction to Mathematical Statistics*, page 384, Chapter 7.2, Sufficient Statistics. Please let me know whether my argument for the solution is correct, since I used a different one than the one presented in the book.
Problem:
Let $X_{(1)} < X_{(2)} < \dots < X_{(n)}$ denote the order statistics of a random sample of size $n$ from the distribution with pdf
$$ f(x;\theta) = \exp(-(x-\theta))\, \mathbb{I}_{[\theta, \infty)}(x). $$
Find a sufficient statistic for $\theta$.
Solution:
By the factorization theorem, we need a factorization $L(x;\theta) = g(T(x), \theta)\,h(x)$:
\begin{align*} L(x;\theta) &= \prod_{i=1}^n \exp(-(x_i-\theta))\, \mathbb{I}\{\theta \le x_i < \infty\} \\ &= \exp\!\left(-\left(\sum_{i=1}^n x_i - n\theta\right)\right) \cdot \prod_{i=1}^n \mathbb{I}\{\theta \le x_i < \infty\} \\ &= \exp\!\left(-\sum_{i=1}^n x_i\right) \cdot \exp(n\theta)\, \mathbb{I}\{\theta \le \min_i x_i\} \end{align*}
where the product of indicators collapses to a single indicator on the sample minimum (the factor $\mathbb{I}\{\max_i x_i < \infty\}$ is identically $1$ and can be dropped). By the factorization theorem we can take
$$ h(x) = \exp\!\left(-\sum_{i=1}^n x_i\right) $$
and
$$ g(T(x), \theta) = \exp(n\theta)\, \mathbb{I}\{\theta \le \min_i x_i\} \Rightarrow T(X) = \min_i X_i = X_{(1)}. $$
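As a quick numerical sanity check (my own sketch, not from the book), the factorization $L(x;\theta) = h(x)\,g(\min_i x_i, \theta)$ can be verified on simulated shifted-exponential data; the function names `likelihood`, `h`, and `g` below are just illustrative:

```python
import math
import random

def likelihood(xs, theta):
    """Joint density of a sample from f(x; theta) = exp(-(x - theta)) on [theta, inf)."""
    if min(xs) < theta:
        return 0.0  # some observation falls below theta, density vanishes
    return math.exp(-(sum(xs) - len(xs) * theta))

def h(xs):
    """Factor free of theta: exp(-sum x_i)."""
    return math.exp(-sum(xs))

def g(t, theta, n):
    """Factor depending on the data only through t = min(xs)."""
    return math.exp(n * theta) if theta <= t else 0.0

random.seed(0)
theta_true = 2.0
xs = [theta_true + random.expovariate(1.0) for _ in range(5)]

# The identity holds for every theta, including values above min(xs),
# where both sides are zero.
for theta in [0.5, 1.0, 1.9, min(xs), 2.5, 10.0]:
    assert math.isclose(likelihood(xs, theta), h(xs) * g(min(xs), theta, len(xs)))
```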
Yes, your argument is correct, but there is a very simple way to see it. The pdf is
$$f_X(x\mid\theta)=e^{\theta}e^{-x}\cdot\mathbb{1}_{[\theta,\infty)}(x)$$
Now, simply observe that (here $X_1,\dots,X_n$ are not ordered)
$\theta \leq X_1<\infty$
$\theta \leq X_2< \infty$
...
$\theta \leq X_n< \infty$
so it is immediate that
$$\theta\leq X_{(1)}$$
Thus the likelihood is
$$L(\theta)=\underbrace{e^{-\sum_i X_i}}_{h(\mathbf{x})}\cdot\underbrace{e^{n\theta}\,\mathbb{1}_{(-\infty,\,x_{(1)}]}(\theta)}_{g[t(\mathbf{x}),\theta]}$$
thus $T=X_{(1)}$.
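A consequence of this factorization (my own sketch, not part of the original answer) is that for two samples with the same value of $T = x_{(1)}$, the log-likelihood ratio $\log L(\mathbf{x};\theta) - \log L(\mathbf{y};\theta)$ is constant in $\theta$, since the $g$ factors cancel; the helper `log_likelihood` below is illustrative:

```python
import math

def log_likelihood(xs, theta):
    """Log of the joint shifted-exponential density; -inf when theta > min(xs)."""
    if min(xs) < theta:
        return float("-inf")
    return -(sum(xs) - len(xs) * theta)

# Two samples engineered to share the same minimum x_(1) = 2.3.
x = [2.3, 3.1, 4.0]
y = [2.3, 5.5, 2.9]

# For every theta <= 2.3, the log-likelihood ratio equals sum(y) - sum(x),
# independent of theta: the theta-dependent factor g cancels because T agrees.
ratios = [log_likelihood(x, t) - log_likelihood(y, t) for t in [0.0, 1.0, 2.0, 2.3]]
assert all(math.isclose(r, ratios[0]) for r in ratios)
```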
I edited your question because the order statistic is always written as $X_{(i)}$.