Maximum Likelihood Estimation of two unknown parameters


Here is a question:

We have a machine A that functions with probability $\theta_1 \theta_2$, and a machine B that functions with probability $\theta_1 \theta_2^2$. A random sample of $n$ A machines and an independent random sample of $n$ B machines are selected. Of these, $n_1$ and $n_2$ function respectively. Find the MLEs of $\theta_1$ and $\theta_2$.

I have trouble understanding the question when it says $n_1$ and $n_2$. Does that mean each type of machine sample has $2$ out of $n$ working? And is there any difference we need to draw between sample A and sample B, since sample B is independent?

Thanks a lot!

There are 2 solutions below.

Your likelihood function looks like this

$$(\theta_1\theta_2)^{n_1}(1-\theta_1\theta_2)^{n-n_1}(\theta_1\theta_2^2)^{n_2}(1-\theta_1\theta_2^2)^{n-n_2}$$
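As an aside (not part of the original answer): this is just the product of two independent binomial likelihoods, with the binomial coefficients dropped. Here is a minimal Python sketch of that setup, where the values of $\theta_1$, $\theta_2$ and $n$ are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values and sample size, chosen only for illustration.
theta1, theta2 = 0.9, 0.8
n = 200

# n_1 functioning machines out of n of type A, n_2 out of n of type B.
n1 = rng.binomial(n, theta1 * theta2)       # A functions with prob. theta1*theta2
n2 = rng.binomial(n, theta1 * theta2 ** 2)  # B functions with prob. theta1*theta2^2

def log_likelihood(t1, t2):
    # Log of the likelihood above (binomial coefficients dropped,
    # since they do not depend on the parameters).
    pA, pB = t1 * t2, t1 * t2 ** 2
    return (n1 * np.log(pA) + (n - n1) * np.log(1 - pA)
            + n2 * np.log(pB) + (n - n2) * np.log(1 - pB))
```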

Maximizing the log-likelihood, we find that

$$\theta_1\theta_2^2= \frac{n_2} n$$

$$\theta_1\theta_2= \frac{n_1} n$$

From here I think we can infer that $\theta_2 = n_2/n_1$ and $\theta_1 = n_1^2/(n n_2)$.

If someone could check my solution I would appreciate that.
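As one numerical sanity check (with made-up counts, not taken from the problem), the sketch below maximizes the log-likelihood with scipy and compares the result against the closed-form $\hat\theta_2 = n_2/n_1$ and $\hat\theta_1 = n_1^2/(n n_2)$:

```python
import numpy as np
from scipy.optimize import minimize

n, n1, n2 = 200, 144, 115  # made-up counts, for illustration only

def neg_log_likelihood(params):
    t1, t2 = params
    pA, pB = t1 * t2, t1 * t2 ** 2  # success probabilities for A and B
    return -(n1 * np.log(pA) + (n - n1) * np.log(1 - pA)
             + n2 * np.log(pB) + (n - n2) * np.log(1 - pB))

res = minimize(neg_log_likelihood, x0=[0.5, 0.5],
               bounds=[(1e-6, 1 - 1e-6)] * 2)

print(res.x)                      # numerical (theta1_hat, theta2_hat)
print(n1**2 / (n * n2), n2 / n1)  # closed-form values above
```

With these counts both lines should print approximately $(0.9016,\ 0.7986)$, which supports the formulas.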

You have the likelihood function $$ L(\theta_1,\theta_2) = \text{constant} \times (\theta_1\theta_2)^{n_1} (1-\theta_1\theta_2)^{n-n_1} (\theta_1\theta_2^2)^{n_2}(1-\theta_1\theta_2^2)^{n-n_2} \tag 1 $$ where "constant" means not depending on $\theta_1$ or $\theta_2$.

Start by letting $\alpha=\theta_1 \theta_2$ and $\beta = \theta_1 \theta_2^2.$ That transforms $(1)$ into $$ \alpha^{n_1} (1-\alpha)^{n-n_1} \beta^{n_2} (1-\beta)^{n-n_2}. $$ The logarithm of this expression is $$ \ell = n_1 \log \alpha + (n-n_1)\log(1-\alpha) + n_2\log\beta + (n-n_2)\log(1-\beta). $$ So we have $$ \frac{\partial\ell}{\partial\alpha} = \frac{n_1}{\alpha} - \frac{n-n_1}{1-\alpha}, $$ which is $0$ precisely when $n_1(1-\alpha) = (n-n_1)\alpha,$ that is, when $\alpha = \dfrac{n_1}{n}.$ By symmetry, $\partial\ell/\partial\beta=0$ when $\beta = \dfrac{n_2}{n}.$

Given \begin{align} \theta_1\theta_2 & = \alpha = \frac{n_1} n \tag 2 \\[10pt] \text{and } \theta_1\theta_2^2 & = \beta = \frac{n_2} n \tag 3 \end{align} we can divide the left side of $(3)$ by the left side of $(2)$ to get $\theta_2,$ and doing the same with the right sides we get $\theta_2 = n_2/n_1.$ Similarly, we can divide the square of the left side of $(2)$ by the left side of $(3)$ to get $\theta_1,$ since $(\theta_1\theta_2)^2/(\theta_1\theta_2^2) = \theta_1,$ and doing the same with the right sides we get $\theta_1 = n_1^2/(n n_2).$
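If it helps, the whole derivation can be checked symbolically; here is a small sympy sketch (my addition, not part of the original argument) that solves the two score equations in $\alpha$ and $\beta$ and then back-transforms:

```python
import sympy as sp

a, b, n, n1, n2 = sp.symbols('alpha beta n n_1 n_2', positive=True)

# Log-likelihood in the (alpha, beta) parametrisation.
ell = (n1 * sp.log(a) + (n - n1) * sp.log(1 - a)
       + n2 * sp.log(b) + (n - n2) * sp.log(1 - b))

# Solve the score equations d(ell)/d(alpha) = 0 and d(ell)/d(beta) = 0.
a_hat = sp.solve(sp.Eq(sp.diff(ell, a), 0), a)[0]  # -> n_1/n
b_hat = sp.solve(sp.Eq(sp.diff(ell, b), 0), b)[0]  # -> n_2/n

# Back-transform: theta2 = beta/alpha, theta1 = alpha^2/beta.
theta2_hat = sp.simplify(b_hat / a_hat)     # -> n_2/n_1
theta1_hat = sp.simplify(a_hat**2 / b_hat)  # -> n_1**2/(n*n_2)
print(a_hat, b_hat, theta1_hat, theta2_hat)
```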

Here we have used what is often called the "invariance" of maximum-likelihood estimates (the MLE of a function of the parameters is that same function of the MLEs), although it is really equivariance rather than invariance.

Note also that the mere fact that the derivative is $0$ does not prove there is a global maximum. In this case the likelihood, as a function of either $\alpha$ or $\beta$ alone, is $0$ at the two endpoints $0$ and $1$, is positive between them, and is continuous; that proves there is a global maximum somewhere strictly between $0$ and $1$. The function is also everywhere differentiable, so the derivative must be $0$ at any interior maximum. Since there is only one point where the derivative is $0$, we can conclude that is where the global maximum is.
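One can also shortcut the endpoint argument with a second-derivative check (my addition, assuming $0 < n_1 < n$): $$ \frac{\partial^2\ell}{\partial\alpha^2} = -\frac{n_1}{\alpha^2} - \frac{n-n_1}{(1-\alpha)^2} < 0 \quad \text{for all } \alpha \in (0,1), $$ so $\ell$ is strictly concave in $\alpha$ (and, by the same computation, in $\beta$), and the unique stationary point is indeed the global maximum.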