Maximum-likelihood estimator of set of data from Normal Distributions


I have previously found the MLE of the two parameters of a Normal distribution, but I don't have any idea how to proceed in this case.

Problem

A sample of size $n$ is drawn from each of four normal populations, all of which have the same variance $\sigma^2$. The means of the four populations are $a + b + c$, $a + b - c$, $a - b + c$, and $a - b - c$.

What are the maximum-likelihood estimators of $a, b, c$, and $\sigma^2$?

(The sample observations may be denoted by $X_{ij}$, $i = 1, 2, 3,4$ and $j = 1,2, ... , n$.)

Best answer

Here, the likelihood function is $$L(\theta; X_{ij}) = L(\theta; X_{11}, X_{12}, \dots, X_{1n}, ~ X_{21}, X_{22}, \dots, X_{2n}, ~X_{31}, X_{32}, \dots, X_{3n}, ~X_{41}, X_{42}, \dots, X_{4n})$$ where $\theta = (a, b, c, \sigma^2)$ is the vector of parameters with respect to which you want to maximize.

Since the $X_{ij}$'s are independent, and since maximizing the log of the likelihood yields the same result, we want to maximize $$\ln L(\theta; X_{ij}) = \ln \prod_{i,j} p(X_{ij};\theta)$$ where $p(X_{ij};\theta)$ is the density of $X_{ij}$. So we have

\begin{align*} \ln L(\theta; X_{ij}) & = \ln \prod_{i,j} p(X_{ij};\theta) \\ & = \sum_{i,j} \ln{p(X_{ij};\theta)} \\ & = \sum_{i=1}^n \ln\left[\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(X_{1i}-(a+b+c))^2}{2\sigma^2}}\right] +\sum_{i=1}^n \ln\left[\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(X_{2i}-(a+b-c))^2}{2\sigma^2}}\right] +\sum_{i=1}^n \ln\left[\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(X_{3i}-(a-b+c))^2}{2\sigma^2}}\right] +\sum_{i=1}^n \ln\left[\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\frac{(X_{4i}-(a-b-c))^2}{2\sigma^2}}\right] \\ & = \sum_{i=1}^n \left[ \ln\frac{1}{\sqrt{2\pi\sigma^2}} + \ln e^{-\frac{(X_{1i}-(a+b+c))^2}{2\sigma^2}}\right] + \text{similarly for the other 3 sums} \\ & = n \ln\frac{1}{\sqrt{2\pi\sigma^2}} - \sum_{i=1}^n \frac{[X_{1i}-(a+b+c)]^2}{2\sigma^2} + \text{similarly for the other 3 sums} \\ \end{align*}
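As a sanity check on this decomposition, here is a small numerical sketch. The parameter values and sample size are hypothetical, chosen only for illustration; the function computes the log-likelihood in the collapsed form of the last line above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, for illustration only.
a, b, c, sigma2 = 2.0, 1.0, 0.5, 1.5
means = np.array([a + b + c, a + b - c, a - b + c, a - b - c])
n = 50
# X[i, j] plays the role of X_{ij}: row i is the sample from population i.
X = rng.normal(means[:, None], np.sqrt(sigma2), size=(4, n))

def log_likelihood(X, a, b, c, sigma2):
    """Log-likelihood of all 4n observations, using the collapsed form:
    4n * ln(1/sqrt(2*pi*sigma^2)) minus the scaled sum of squared residuals."""
    means = np.array([a + b + c, a + b - c, a - b + c, a - b - c])
    resid = X - means[:, None]
    return (-0.5 * X.size * np.log(2 * np.pi * sigma2)
            - (resid ** 2).sum() / (2 * sigma2))
```

Evaluating this and comparing it to a direct observation-by-observation sum of Normal log-densities confirms the algebra above.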

Lastly, you want to maximize this by taking the partial derivative with respect to each parameter in turn, setting it equal to zero, and solving, to get the respective MLEs. For instance,

\begin{align*} \frac{\partial}{\partial a}\ln L(\theta; X_{ij}) & = \sum_{i=1}^n \frac{[X_{1i}-(a+b+c)]}{\sigma^2} +\sum_{i=1}^n \frac{[X_{2i}-(a+b-c)]}{\sigma ^2} +\sum_{i=1}^n \frac{[X_{3i}-(a-b+c)]}{\sigma ^2} + \sum_{i=1}^n \frac{[X_{4i}-(a-b-c)]}{\sigma ^2}\\ & = \frac{n}{\sigma ^2}(\bar{X}_1-(a+b+c)) + \frac{n}{\sigma ^2}(\bar{X}_2-(a+b-c)) +\frac{n}{\sigma ^2}(\bar{X}_3-(a-b+c)) + \frac{n}{\sigma ^2}(\bar{X}_4-(a-b-c)) \\ & = \frac{n}{\sigma ^2} \left[\bar{X}_1+\bar{X}_2+\bar{X}_3+\bar{X}_4 -4a \right] \end{align*}

So the MLE of $a$ is the value at which this equals zero: $\frac{n}{\sigma ^2} \left[\bar{X}_1+\bar{X}_2+\bar{X}_3+\bar{X}_4 -4a \right] = 0 \implies \hat{a} = \frac {\bar{X}_1+\bar{X}_2+\bar{X}_3+\bar{X}_4}{4}.$
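Setting the analogous derivatives with respect to $b$, $c$, and $\sigma^2$ to zero gives, by the same argument, $\hat{b} = \frac{\bar{X}_1+\bar{X}_2-\bar{X}_3-\bar{X}_4}{4}$, $\hat{c} = \frac{\bar{X}_1-\bar{X}_2+\bar{X}_3-\bar{X}_4}{4}$, and $\hat{\sigma}^2$ equal to the average squared residual over all $4n$ observations. A short simulation sketch (true values and sample size are hypothetical, chosen only for illustration) shows these closed forms recovering the parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical true values, for illustration only.
a, b, c, sigma2 = 2.0, 1.0, 0.5, 1.5
n = 100_000
means = np.array([a + b + c, a + b - c, a - b + c, a - b - c])
X = rng.normal(means[:, None], np.sqrt(sigma2), size=(4, n))

# Group sample means: xbar[i] is \bar X_{i+1}.
xbar = X.mean(axis=1)

# Closed-form MLEs from setting each partial derivative to zero.
a_hat = (xbar[0] + xbar[1] + xbar[2] + xbar[3]) / 4
b_hat = (xbar[0] + xbar[1] - xbar[2] - xbar[3]) / 4
c_hat = (xbar[0] - xbar[1] + xbar[2] - xbar[3]) / 4

# Fitted group means, then sigma^2 as the mean squared residual over all 4n points.
mu_hat = np.array([a_hat + b_hat + c_hat, a_hat + b_hat - c_hat,
                   a_hat - b_hat + c_hat, a_hat - b_hat - c_hat])
sigma2_hat = ((X - mu_hat[:, None]) ** 2).mean()
```

Note that $\hat{\sigma}^2$ divides by $4n$ (the total sample size), as the MLE of a Normal variance always does, rather than by $4n - 3$ as an unbiased estimator would.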

Another answer

Hint: Notice that the four population means are symmetric around the value $a$ (they average to $a$). That suggests the maximum-likelihood estimator of $a$ will simply be the grand mean of all the data.

Can you continue?
