Maximum Likelihood Estimators of multivariable parameter.


We need to find the MLEs of $\alpha$ and $\beta$ for a pdf which is zero everywhere except on $x\geq\alpha$, where $$f(x)=\beta e^{-\beta(x-\alpha)},$$ with $\beta>0$. My approach was the usual one: optimising $L$ with respect to $\alpha$ seemed to give $$\bar{x}=\alpha.$$ But that is blowing up the $\beta$ parameter's MLE. Could someone guide me here?

Edit: Thanks for the suggestions; I realised that $\hat{\alpha}=\min_i\{x_i\}$.



Problems like these are tricky because they go beyond a simple optimization problem: one has to reason directly about one of the parameters.

Assuming you're drawing $N$ i.i.d. samples, the normalised log-likelihood is:

$ \frac{\ell(\alpha,\beta \mid x_1,\cdots,x_N)}{N} = \ln(\beta) -\beta \bar{X} + \beta\alpha $

Where $\bar{X}$ is the empirical mean of the data. The tricky part comes in seeing that to maximize the likelihood you want to maximize $\alpha$. However, it can't be arbitrarily large, since $\alpha > x_i$ for any $i$ is inconsistent with the observations. Therefore, to maximize $\alpha$, and to be consistent, you take $\hat\alpha = \min_i(x_i)$. Finally, $\beta$ can be optimized using standard calculus to give $\hat\beta = 1/(\bar{X} - \hat\alpha)$.
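As a sanity check, here is a small numerical sketch (assuming NumPy; the sample size and the true parameter values are arbitrary choices for illustration) that simulates from the shifted exponential and evaluates the closed-form estimators above:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
alpha_true, beta_true, N = 2.0, 3.0, 100_000

# A shifted exponential is alpha plus an ordinary Exp(rate=beta) draw.
x = alpha_true + rng.exponential(scale=1.0 / beta_true, size=N)

# Closed-form MLEs from the argument above.
alpha_hat = x.min()                      # alpha can be no larger than any observation
beta_hat = 1.0 / (x.mean() - alpha_hat)  # from setting the beta-derivative to zero

print(alpha_hat, beta_hat)  # both should land close to (2.0, 3.0)
```

With a large sample both estimates land close to the true values; note that $\hat\beta$ uses $\bar{X} - \hat\alpha$ in the denominator, not $\bar{X}$ alone.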


Let $\mathbf{x} = (x_1, \ldots, x_n) $ be a random sample with each $X_i \sim f(x_i) = \beta e^{-\beta(x_i-\alpha)}$ on $x_i \geq \alpha$. Then the likelihood function is

\begin{align} L(\alpha, \, \beta \, | \, \mathbf{x} ) &= \prod \beta e^{-\beta(x_i-\alpha)} \\ & = \beta^n \exp\left[-\beta\sum(x_i - \alpha)\right], \end{align}

and so the log likelihood function is $ l(\alpha, \beta \, | \, \mathbf{x}) = n \log \beta - \beta \sum (x_i - \alpha) $.
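This log-likelihood is easy to evaluate numerically. A minimal sketch (assuming NumPy, with a small hypothetical data set) makes it visible that $l$ grows as $\alpha$ increases toward $\min x_i$:

```python
import numpy as np

def log_likelihood(alpha, beta, x):
    """n log(beta) - beta * sum(x_i - alpha), defined only for
    beta > 0 and alpha <= min(x_i); elsewhere the density is zero."""
    x = np.asarray(x, dtype=float)
    if beta <= 0 or alpha > x.min():
        return -np.inf  # outside the parameter space
    return len(x) * np.log(beta) - beta * np.sum(x - alpha)

x = np.array([2.3, 2.9, 4.1, 3.5])  # hypothetical observations, min = 2.3
for a in (1.0, 2.0, 2.3, 2.5):
    print(a, log_likelihood(a, beta=1.0, x=x))
```

The printed values increase with $\alpha$ up to $\min_i x_i = 2.3$, then drop to $-\infty$ as soon as $\alpha$ exceeds an observation.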

The partial derivatives are given by

\begin{align} \frac{ \partial l}{\partial \alpha} &= n \beta \\ \frac{ \partial l}{\partial \beta} & = \frac{n}{\beta} - \sum(x_i - \alpha). \end{align}

Since $\frac{ \partial^2 l}{\partial \beta^2} < 0 $ for all $\beta$, we can find an MLE for $\beta$ by finding where the first derivative is equal to $0$.
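For concreteness, setting $\frac{\partial l}{\partial \beta} = 0$ and solving gives

$$ \hat{\beta} = \frac{n}{\sum (x_i - \alpha)} = \frac{1}{\bar{x} - \alpha}, $$

with $\alpha$ still to be determined.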

However, $\frac{ \partial l}{\partial \alpha} = 0 $ has no solution. In fact, $l$ is strictly increasing with respect to $\alpha$, which suggests you should make $\alpha$ as large as possible. How large can that be? A comment above points out that if you were to set $\alpha = \bar{x}$, you would have some $x_i < \alpha$, which contradicts your requirements. So how high can you set $\alpha$ so that all $x_i \ge \alpha$?

As a final comment: Note that your distribution is a shifted exponential. What does the graph of the usual, unshifted, exponential distribution look like? Most of the probability is piled up at the bottom (or left, if you prefer) end. So if we're trying to see what horizontal shift makes our observations most likely, we want our observations to be as far to the left of our support as possible. In other words, how far to the right can we push our curve? This is the same question as: How large can we make $\alpha$?