Find MLE estimators of PDF


I am quite stumped by the following problem. The usual route of differentiating the log-likelihood doesn't work here. The problem is as follows:

Let $X_1,...,X_n$ be i.i.d. random variables from a distribution with p.d.f.:

$$f(x; \theta_1, \theta_2) = \frac{1}{\theta_2} \exp\!\left(-\frac{x - \theta_1}{\theta_2}\right) \quad \text{for } x \geq \theta_1$$

with parameters $\theta_1 \in \mathbb{R}$ and $\theta_2 > 0$.

Find maximum likelihood estimators of $\theta_1$ and $\theta_2$.

Any hints or insights?

What I tried:

(image of attempted work omitted)


There are 2 best solutions below

On BEST ANSWER

I have verified that the condition

$$\int_{\theta_1}^\infty f(x; \theta_1, \theta_2) \, dx=1$$

is satisfied.

To maximize $L(\theta_1, \theta_2; x)$, for each fixed $\theta_2$ we want $\theta_1$ to be as large as possible. However, we have the constraint $x_i \ge \theta_1$ for every $i$, i.e. $\theta_1 \le \min_i x_i$.

Hence, we choose $\hat{\theta}_1 = \min_i x_i$.
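This monotonicity is easy to check numerically. Here is a minimal sketch (with a hypothetical simulated sample, not data from the question) showing that for fixed $\theta_2$ the log-likelihood keeps increasing in $\theta_1$ up to the boundary $\min_i x_i$:

```python
import math
import random

# Hypothetical sample from the shifted exponential with theta1 = 1, theta2 = 1.
random.seed(1)
xs = [1.0 + random.expovariate(1.0) for _ in range(50)]
theta2 = 1.0  # hold the scale parameter fixed

def loglik(theta1):
    # log L = -n*log(theta2) - sum(x_i - theta1)/theta2, valid for theta1 <= min(xs)
    return -len(xs) * math.log(theta2) - sum(x - theta1 for x in xs) / theta2

# Evaluate on a grid of theta1 values approaching the boundary min(xs).
grid = [min(xs) * k / 10 for k in range(1, 11)]
vals = [loglik(t) for t in grid]
assert vals == sorted(vals)  # log-likelihood is increasing in theta1
```

Differentiation never finds this maximum because it sits on the boundary of the feasible region, not at a stationary point.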

With $\hat{\theta}_1$ fixed, set the derivative of the log-likelihood $\ell(\theta_1, \theta_2) = -n\log\theta_2 - \frac{1}{\theta_2}\sum_{i=1}^n (x_i - \theta_1)$ with respect to $\theta_2$ equal to zero:

$$-\frac{n}{\hat{\theta}_2}+\frac1{\hat{\theta}_2^2}\sum_{i=1}^n (x_i - \hat{\theta}_1)=0$$

$$\frac1{\hat{\theta}_2^2}\sum_{i=1}^n (x_i - \hat{\theta}_1) = \frac{n}{\hat{\theta}_2}$$

$$\hat{\theta}_2=\bar{x}-\hat{\theta}_1=\bar{x}-\min_i x_i$$
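As a sanity check, here is a short simulation sketch (parameter values are my own choice for illustration) confirming that $\hat{\theta}_1 = \min_i x_i$ and $\hat{\theta}_2 = \bar{x} - \min_i x_i$ recover the true parameters on a large sample:

```python
import random

# Hypothetical simulation: true theta1 = 2, theta2 = 3.
random.seed(0)
theta1, theta2, n = 2.0, 3.0, 100_000
xs = [theta1 + random.expovariate(1.0 / theta2) for _ in range(n)]

theta1_hat = min(xs)                    # MLE of the location parameter
theta2_hat = sum(xs) / n - theta1_hat   # MLE of the scale: sample mean minus minimum

assert abs(theta1_hat - theta1) < 0.01
assert abs(theta2_hat - theta2) < 0.05
```

Note that $\min_i x_i$ overshoots $\theta_1$ by roughly $\theta_2/n$ on average, which is negligible here but makes $\hat{\theta}_1$ a biased estimator.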


This is a nice problem. Let's ignore $\theta_2$ momentarily and consider only $\theta_1$. We want to minimize the exponent $\sum_{i=1}^n (x_i-\theta_1)$, subject to the constraint that each $x_i \geq \theta_1$. This sum decreases as $\theta_1$ increases, but the constraints cap $\theta_1$ at the sample minimum, so the maximum likelihood occurs at $\theta_1=\min_i x_i$. (Conveniently, this is completely independent of $\theta_2$!)

For $\theta_2$, note that $X-\theta_1$ has an exponential distribution with mean $\theta_2$, whose MLE is the sample mean; plugging in $\hat{\theta}_1$ gives $\hat{\theta}_2 = \bar{x} - \min_i x_i$.
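This reduction can be sketched in a few lines (a hypothetical simulation, with parameter values chosen just for illustration): subtract the estimated location and apply the plain exponential MLE, the sample mean, to what remains:

```python
import random

# Hypothetical check that X - theta1 behaves like an Exponential sample with mean theta2.
random.seed(2)
theta1, theta2 = 5.0, 2.0
xs = [theta1 + random.expovariate(1.0 / theta2) for _ in range(200_000)]

shifted = [x - min(xs) for x in xs]       # subtract theta1_hat = min(xs)
theta2_hat = sum(shifted) / len(shifted)  # exponential MLE: the sample mean
assert abs(theta2_hat - theta2) < 0.05
```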