Show that the MLE of $\lambda$ is $\frac{n-T_n}{S_n+cT_n}$


$X_i$, $1 \leq i \leq n$, are i.i.d. exponential with mean $\lambda^{-1}$, and the values are recorded censored at $c$: the recorded value is $c$ if $X_i \geq c$ and $X_i$ otherwise. Show that the MLE of $\lambda$ is $\hat\lambda = \frac{n-T_n}{S_n+cT_n}$, where $S_n= \sum_{j=1}^n X_jI(X_j < c)$ and $T_n = \sum_{j=1}^n I(X_j \geq c)$.

Attempt:

The likelihood function is given by $L(\lambda \mid x_1, x_2, \ldots, x_n) = \lambda^n e^{-\lambda \sum x_i} = \lambda^n e^{-\lambda(S_n+cT_n)}$ $\implies$ $\log L = n \log \lambda -\lambda(S_n+cT_n)$

and hence $\hat\lambda=\frac{n}{S_n+cT_n}$, which differs from the expected answer.

Best answer:

Indicator variables to the rescue.

A few observations:

$P(X_i < x) = 1 - e^{-\lambda x}$ if $ 0 \leq x < c$, with density $\lambda e^{-\lambda x}$

$P(X_i = c) = P(X_i \geq c) = e^{-\lambda c}$ (this looks counterintuitive at first glance, but it is exactly what the rule "$X_i=c$ if $X_i \geq c$" produces)
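The atom at $c$ is easy to confirm empirically; here is a minimal simulation sketch (the rate, cutoff, and sample size are illustrative choices, not from the problem):

```python
import math
import random

# Empirical check that P(min(X, c) = c) = P(X >= c) = e^{-lambda*c}
# for X exponential with rate lambda. Parameter values are illustrative.
random.seed(0)
lam, c, n = 2.0, 0.5, 200_000
hits = sum(1 for _ in range(n) if random.expovariate(lam) >= c)
empirical = hits / n
theoretical = math.exp(-lam * c)
print(empirical, theoretical)  # the two should agree to ~2 decimal places
```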

Now,

$\begin{align}L( \lambda| x_i) &= \prod_{i=1}^{n} (\lambda e^{-\lambda x_i})^{I(0 \leq x_i < c)} \times (e^{-\lambda c})^{I( x_i \geq c)} \\&= \lambda^{\sum_{i=1}^n I(0 \leq x_i < c)}e^{-\lambda\sum_{i=1}^n x_iI(0 \leq x_i < c)} \times e^{-\lambda c \sum_{i=1}^n I(x_i \geq c)} \\&= \lambda^{n-T_n}e^{-\lambda S_n} \times e^{-\lambda c T_n}\end{align}$

$\log L= (n-T_n) \log \lambda -\lambda(S_n+cT_n)$

$\frac{\partial \log L}{\partial \lambda} = \frac{n-T_n}{\lambda}-(S_n+cT_n)$

which gives $\hat \lambda = \frac{n-T_n}{S_n+cT_n}$
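The closed-form estimator can be sanity-checked by simulation; a minimal sketch (rate, cutoff, and sample size below are arbitrary illustrative choices):

```python
import random

# Simulate censored exponential data, form S_n and T_n as defined in the
# problem, and verify that (n - T_n)/(S_n + c*T_n) recovers the true rate.
random.seed(1)
lam, c, n = 1.5, 1.0, 500_000
ys = [min(random.expovariate(lam), c) for _ in range(n)]
T = sum(1 for y in ys if y >= c)   # T_n: number of censored observations
S = sum(y for y in ys if y < c)    # S_n: sum of the uncensored values
lam_hat = (n - T) / (S + c * T)
print(lam_hat)  # should be close to lam = 1.5
```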

Second answer:

I tried it a bit, but did not get a neat MLE like yours. In any case, I thought I would post my attempt.

We observe $y_i=\min\{c,X_i\}$, which are again i.i.d. I think since we do not observe $X_i$ we cannot plug its pdf into the likelihood; rather we should use the pdf of $y_i$, which is what we observe. I take the pdf of $y_i$ to be $$f_i(y)=\lambda e^{-\lambda y}(1-e^{-\lambda c})^{-1}, \quad 0\leq y\leq c$$ (note that this is really the density of $X_i$ conditional on $X_i < c$, i.e. a truncated exponential; the censored observation $y_i$ itself has a point mass at $c$, which is why this model differs from the one above). Now we can write the likelihood function \begin{align} L(\lambda)&=\prod_{i=1}^{n}f_i(y_i)\\ &=\lambda^{n}(1-e^{-\lambda c})^{-n}\prod_{i=1}^{n}e^{-\lambda y_i}\\ &=\lambda^{n}(1-e^{-\lambda c})^{-n}\prod_{i=1}^{n}e^{-\lambda X_iI(X_i<c)}\prod_{i=1}^{n}e^{-\lambda cI(X_i\geq c)}\\ \end{align} and the log likelihood \begin{align} l(\lambda)&=n\log(\lambda)-n\log(1-e^{-\lambda c})-\lambda c\sum_{i=1}^{n}I(X_i\geq c)-\lambda \sum_{i=1}^{n}X_iI(X_i<c)\\ \end{align}

\begin{align} \frac{d}{d\lambda}l(\lambda)&=\frac{nc}{1-e^{c\lambda}}-S_n-cT_n+\frac{n}{\lambda}\\ \end{align}

Note that $$\lim_{\lambda \to 0}\frac{d}{d\lambda}l(\lambda)=-S_n+\frac{c}{2}(n-2T_n)$$ and $$\lim_{\lambda \to \infty}\frac{d}{d\lambda}l(\lambda)=-S_n-cT_n.$$

Therefore a root exists, and is unique by monotonicity, precisely when $-S_n+\frac{c}{2}(n-2T_n)>0$, i.e. when $\bar{y}<\frac{c}{2}$; otherwise the equation has no solution on $(0,\infty)$.

Even when a solution of $\frac{d}{d\lambda}l(\lambda)=0$ exists, however, it does not seem to reduce to your neat closed-form MLE!
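The two answers can be compared numerically. The sketch below (all parameter values illustrative) solves this answer's score equation by bisection, which is valid because the log-likelihood is concave so the score is decreasing, and prints the closed-form censored-data MLE alongside it:

```python
import math
import random

# Generate censored exponential data with lam*c large enough that the
# existence condition ybar < c/2 from this answer holds, then compare the
# root of l'(lambda) = 0 with the closed-form MLE (n - T_n)/(S_n + c*T_n).
random.seed(2)
lam, c, n = 3.0, 1.0, 100_000
ys = [min(random.expovariate(lam), c) for _ in range(n)]
T = sum(1 for y in ys if y >= c)
S = sum(y for y in ys if y < c)
ybar = (S + c * T) / n

def score(l):
    # l'(lambda) as computed in this answer
    return n * c / (1 - math.exp(c * l)) - S - c * T + n / l

lam_censored = (n - T) / (S + c * T)  # closed-form MLE from the censored model

root = None
if ybar < c / 2:                      # existence condition derived above
    lo, hi = 1e-4, 100.0              # score(lo) > 0 > score(hi)
    for _ in range(100):              # bisection on the decreasing score
        mid = (lo + hi) / 2
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    root = lo

print(root, lam_censored)  # the two estimates visibly disagree
```

Running this shows the truncated-model root sitting well below the censored-data MLE, consistent with the observation that the two likelihoods are not the same.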