I am quite stumped by the following problem. The usual route of differentiating the log-likelihood doesn't work for $\theta_1$. The problem is as follows:
Let $X_1,...,X_n$ be i.i.d. random variables from a distribution with p.d.f.:
$$f(x; \theta_1, \theta_2) = \frac{1}{\theta_2} \exp\!\left(-\frac{x - \theta_1}{\theta_2}\right) \quad \text{for } x \geq \theta_1,$$
with parameters $\theta_1 \in \mathbb{R}$ and $\theta_2 > 0$.
Find the maximum likelihood estimators of $\theta_1$ and $\theta_2$.
Any hints or insights?
What I tried:

I have verified that the density integrates to one:
$$\int_{\theta_1}^\infty f(x; \theta_1, \theta_2) \, dx = 1.$$
To maximize $L(\theta_1, \theta_2; x)$: for each fixed $\theta_2$, the likelihood is increasing in $\theta_1$, so we want $\theta_1$ to be as large as possible. However, the constraint $x_i \ge \theta_1$ must hold for every observation, i.e. $\theta_1 \le \min_i x_i$.
Hence, we choose $\hat{\theta}_1 = \min_i x_i$.
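To convince myself of this step, here is a minimal numerical sketch (with a made-up sample and $\theta_2$ fixed at 1) showing the log-likelihood increases in $\theta_1$ up to the boundary $\min_i x_i$:

```python
import math

def log_lik(x, t1, t2):
    # log-likelihood of the shifted exponential; -inf if the support constraint fails
    if min(x) < t1:
        return float("-inf")
    n = len(x)
    return -n * math.log(t2) - sum(xi - t1 for xi in x) / t2

# hypothetical sample, theta_2 fixed at 1.0
x = [2.3, 1.7, 3.1, 1.9, 2.6]
grid = [1.0, 1.3, 1.6, min(x)]          # theta_1 values up to the boundary min(x)
vals = [log_lik(x, t1, 1.0) for t1 in grid]
assert vals == sorted(vals)             # log-likelihood increases in theta_1
```

Any $\theta_1 > \min_i x_i$ makes the likelihood zero, so the maximum over $\theta_1$ is attained exactly at the boundary.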
To find $\hat{\theta}_2$, differentiate the log-likelihood $\ell(\theta_1, \theta_2) = -n\log\theta_2 - \frac{1}{\theta_2}\sum_{i=1}^n (x_i - \theta_1)$ with respect to $\theta_2$ and set $\frac{\partial \ell}{\partial \theta_2}=0$:
$$-\frac{n}{\hat{\theta}_2}+\frac1{\hat{\theta}_2^2}\sum_{i=1}^n (x_i - \hat{\theta}_1)=0$$
$$\frac1{\hat{\theta}_2^2}\sum_{i=1}^n (x_i - \hat{\theta}_1) = \frac{n}{\hat{\theta}_2}$$
$$\hat{\theta}_2=\bar{x}-\hat{\theta}_1=\bar{x}-\min_i x_i$$
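As a sanity check on the $\theta_2$ step, one can fix $\theta_1 = \min_i x_i$ and maximize the profile log-likelihood over $\theta_2$ numerically (crude grid search on a hypothetical sample); the maximizer should match the mean excess $\bar{x} - \min_i x_i$:

```python
import math

x = [2.3, 1.7, 3.1, 1.9, 2.6]           # hypothetical sample
t1 = min(x)                              # theta_1 at its MLE
n = len(x)

def ell(t2):
    # profile log-likelihood in theta_2 with theta_1 fixed at min(x)
    return -n * math.log(t2) - sum(xi - t1 for xi in x) / t2

grid = [0.01 * k for k in range(1, 500)]  # crude grid search over theta_2
t2_hat = max(grid, key=ell)
assert abs(t2_hat - (sum(x) / n - t1)) < 0.01   # matches x_bar - min(x)
```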