How to find the MLEs of these parameters given the distribution?


Let $X$ and $Y$ be independent exponential random variables, with

$$f(x\mid\lambda)=\frac{1}{\lambda}\exp{\left(-\frac{x}{\lambda}\right)},\,x>0\,, \qquad f(y\mid\mu)=\frac{1}{\mu}\exp{\left(-\frac{y}{\mu}\right)},\,y>0$$

We observe $Z$ and $W$ with $Z=\min(X,Y)$, and $W=\begin{cases} 1 &,\text{if }Z=X\\ 0 &,\text{if }Z=Y \end{cases}$

I have obtained the joint distribution of $Z$ and $W$, i.e., $$P(Z \leq z, W=0)=\frac{\lambda}{\mu+\lambda}\left[1-\exp{\left(-\left(\frac{1}{\mu}+\frac{1}{\lambda}\right)z\right)}\right]$$

$$P(Z \leq z, W=1)=\frac{\mu}{\mu+\lambda}\left[1-\exp{\left(-\left(\frac{1}{\mu}+\frac{1}{\lambda}\right)z\right)}\right]$$
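These joint CDFs can be sanity-checked by simulation. The sketch below (the parameter values `lam`, `mu`, the threshold `z0`, and the sample size are arbitrary choices, not part of the problem) compares the empirical frequency of $\{Z\le z_0, W=0\}$ with the stated formula:

```python
import math
import random

# Monte Carlo check of the stated joint CDF at one point z0.
# lam, mu, n, z0 are arbitrary assumptions for illustration.
random.seed(0)
lam, mu, n, z0 = 2.0, 0.5, 200_000, 1.0

hits = 0  # count of {Z <= z0, W = 0}, i.e. min(X, Y) = Y
for _ in range(n):
    x = random.expovariate(1.0 / lam)  # exponential with mean lam
    y = random.expovariate(1.0 / mu)   # exponential with mean mu
    z, w = (x, 1) if x <= y else (y, 0)
    if z <= z0 and w == 0:
        hits += 1

empirical = hits / n
theory = lam / (mu + lam) * (1.0 - math.exp(-(1.0 / mu + 1.0 / lam) * z0))
print(abs(empirical - theory))  # should be small (Monte Carlo error only)
```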

Now assume that $(Z_i,W_i)$, $i=1,\ldots,n$, are i.i.d. observations. Find the MLEs of $\lambda$ and $\mu$.

(This is Exercise 7.14 of the book Statistical Inference, 2nd edition, for which no solution is given.)

Two solutions are given below.

Solution 1.

The $W=0$ and the $W=1$ cases can be combined by writing the leading factor of the density as $\frac{1}{\lambda^{W}\mu^{1-W}}$.

Differentiating the joint CDF with respect to $z$ gives the density of a single observation; forming the product over the sample, the likelihood can be written as $\prod_{i=1}^n{\frac{1}{\lambda^{w_i}\mu^{1-w_i}}}e^{-(\frac{1}{\lambda}+\frac{1}{\mu})z_i}$. Taking logs gives the log-likelihood $-\sum_{i=1}^n\left(w_i\ln{\lambda}+(1-w_i)\ln{\mu}+\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z_i\right)$.

Maximising with respect to $\lambda$ and $\mu$ gives $\hat\lambda=\frac{\bar{z}}{\bar{w}}$ and $\hat\mu=\frac{\bar{z}}{1-\bar{w}}$.

Note that if $\lambda=\mu$ then $\bar{w}\approx\frac{1}{2}$ in large samples, so the two estimates become approximately equal.
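These closed forms can be checked by simulation. The sketch below (the true values `lam`, `mu` and the sample size are arbitrary assumptions) generates $(Z_i,W_i)$ pairs and confirms that $\hat\lambda=\bar z/\bar w$ and $\hat\mu=\bar z/(1-\bar w)$ land near the truth:

```python
import random

# Consistency check of the MLEs lambda-hat = z_bar/w_bar and
# mu-hat = z_bar/(1 - w_bar). True values are arbitrary assumptions.
random.seed(1)
lam, mu, n = 3.0, 1.5, 100_000

zs, ws = [], []
for _ in range(n):
    x = random.expovariate(1.0 / lam)  # exponential with mean lam
    y = random.expovariate(1.0 / mu)   # exponential with mean mu
    zs.append(min(x, y))
    ws.append(1 if x <= y else 0)      # W = 1 iff Z = X

z_bar = sum(zs) / n
w_bar = sum(ws) / n
lam_hat = z_bar / w_bar        # MLE of lambda
mu_hat = z_bar / (1 - w_bar)   # MLE of mu
print(lam_hat, mu_hat)         # close to the true 3.0 and 1.5
```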

Solution 2.

Note that $Z$ and $W$ are in fact independent, with $W\sim \mathsf{Ber}\left(\frac{\mu}{\lambda+\mu}\right)$ and $Z\sim \mathsf{Exp}$ with rate $(\frac1\lambda+\frac1\mu)$. Therefore for ${z>0\,,\,w\in\{0,1\}}$, we can write the likelihood function based on $(z,w)$ as

\begin{align} L(\lambda,\mu)&=P(W=w)f_Z(z) \\&=\frac{1}{\lambda^w \mu^{1-w}}\exp\left[-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z\right]\quad,\,(\lambda,\mu)\in \mathbb R^+\times\mathbb R^+. \end{align}

So the likelihood given the sample $(z_1,w_1),\ldots,(z_n,w_n)$ is

$$L^*(\lambda,\mu)=\frac{1}{\lambda^{\sum_{i=1}^n w_i}\mu^{n-\sum_{i=1}^n w_i}}\exp\left[-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)\sum_{i=1}^n z_i\right]$$

The log-likelihood is

$$\ell(\lambda,\mu)=-\sum_{i=1}^n w_i\ln\lambda-\left(n-\sum_{i=1}^n w_i\right)\ln\mu-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)\sum_{i=1}^n z_i$$

For $0<\bar w<1$, solving for the stationary points of $\ell(\lambda,\mu)$ yields $$\hat\lambda=\frac{\sum_{i=1}^n z_i}{\sum_{i=1}^n w_i}=\frac{\bar z}{\bar w}\qquad,\qquad \hat\mu=\frac{\sum_{i=1}^n z_i}{n-\sum_{i=1}^n w_i}=\frac{\bar z}{1-\bar w}$$
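The stationary-point calculation can also be verified symbolically. The sketch below uses SymPy, with $S_w=\sum_{i=1}^n w_i$ and $S_z=\sum_{i=1}^n z_i$ as shorthand symbols introduced here for convenience:

```python
import sympy as sp

# Symbolic check: solve the score equations for the log-likelihood
#   ell = -S_w*ln(lambda) - (n - S_w)*ln(mu) - (1/lambda + 1/mu)*S_z,
# where S_w = sum(w_i) and S_z = sum(z_i) are shorthand symbols.
lam, mu, n, Sw, Sz = sp.symbols('lambda mu n S_w S_z', positive=True)

ell = -Sw * sp.log(lam) - (n - Sw) * sp.log(mu) - (1 / lam + 1 / mu) * Sz

sol = sp.solve([sp.diff(ell, lam), sp.diff(ell, mu)], [lam, mu], dict=True)
print(sol)  # expect lambda = S_z/S_w and mu = S_z/(n - S_w)
```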

So assuming $0<\bar w<1$, the unique MLE of $(\lambda,\mu)$ is $(\hat\lambda,\hat\mu)$.

But when $\bar w\in\{0,1\}$, the MLE does not exist: for example, if $\bar w=0$ then $L^*(\lambda,\mu)=\mu^{-n}\exp\left[-\left(\frac1\lambda+\frac1\mu\right)\sum_{i=1}^n z_i\right]$ is strictly increasing in $\lambda$, so its supremum is approached only as $\lambda\to\infty$; the case $\bar w=1$ is symmetric in $\mu$.