Maximum likelihood and method of moments estimation


[Question, from the image: let $X_1,\ldots,X_6$ be a random sample from the density $f(x;\theta)=1$ for $0<x<\frac{1}{2}$ and $f(x;\theta)=\frac{1}{2\theta-1}$ for $\frac{1}{2}<x<\theta$ (with $\theta>\frac{1}{2}$), observed as $0.2,\,1.2,\,1.4,\,0.3,\,0.9,\,0.7$. Find the maximum likelihood and method of moments estimates of $\theta$.]

For the above question, we can write the log-likelihood function, differentiate it with respect to the parameter, set the derivative equal to zero, and solve.

But when I do that, I get zero as the answer.

What is the correct way to solve this? Is there another way to find the maximum likelihood estimate of the parameter?

3 Answers

Accepted answer:

Maximum likelihood estimator of $\theta$

The likelihood function is $$L(\theta) = \underbrace{1}_{0.2} \cdot \underbrace{\frac{1}{2 \theta - 1}}_{1.2} \cdot \underbrace{\frac{1}{2 \theta - 1}}_{1.4} \cdot \underbrace{1}_{0.3} \cdot \underbrace{\frac{1}{2 \theta - 1}}_{0.9} \cdot \underbrace{1}_{0.7} = \left[ \frac{1}{2 \theta - 1} \right]^3,$$ so $$\log L(\theta) = 3 \log\left( \frac{1}{2 \theta - 1}\right).$$ Differentiating with respect to $\theta$ gives $$\frac{d}{d\theta}\log L(\theta) = -\frac{6}{2\theta-1},$$ which is strictly negative for $\theta > \frac{1}{2}$ and never equals zero. So there is no interior critical point, and we maximize the likelihood directly: since $L(\theta)$ is decreasing in $\theta$, we want the smallest value of $\theta$ that the data allow.

Since the density vanishes for $x > \theta$, every observation must satisfy $X_i \le \theta$. Writing the order statistics $$X_{(1)} < X_{(2)} < X_{(3)} < \cdots < X_{(n)},$$ the constraint is $$X_{(n)} \le \theta.$$ The smallest admissible value of $\theta$ is therefore $X_{(n)}$, so $$\hat\theta_{MLE} = X_{(n)} = \text{Max}(X_1,X_2,\ldots,X_n),$$ and here $$\hat\theta_{MLE} = \text{Max}(0.2,\,1.2,\,1.4,\,0.3,\,0.9,\,0.7) = 1.4.$$

The MLE of $\theta$ is $1.4$.
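The argument above can be checked with a short script. A minimal sketch in Python (the function name `likelihood` and the way the sample is hard-coded are my choices, not part of the original answer):

```python
# Sketch: likelihood for the piecewise density
# f(x) = 1 on (0, 1/2), f(x) = 1/(2*theta - 1) on (1/2, theta),
# evaluated at the sample from the question.

sample = [0.2, 1.2, 1.4, 0.3, 0.9, 0.7]

def likelihood(theta, data):
    """Likelihood L(theta); zero if any observation exceeds theta."""
    if theta < max(data):
        return 0.0  # an observation falls outside the support
    k = sum(1 for x in data if x > 0.5)  # points in the (1/2, theta) piece
    return (2 * theta - 1) ** (-k)

theta_mle = max(sample)  # smallest theta with nonzero likelihood
print(theta_mle)  # 1.4
```

Evaluating `likelihood` at values above $1.4$ confirms it only decreases, so the sample maximum is indeed the maximizer.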

For the method of moments estimator of $\theta$ we equate the sample mean to the expected value of $X$: $$ \frac{\sum_{i=1}^n X_i}{n} = E(X).$$ $$E(X)=\int_0^{1/2} x\,dx+\int_{1/2}^\theta \frac{x}{2\theta-1}\,dx = \frac{1}{8} + \frac{2\theta +1}{8} = \frac{2\theta +2}{8} = \frac{\theta +1}{4}.$$
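The value of $E(X)$ can be sanity-checked numerically. A sketch (the midpoint-rule integrator and the test values of $\theta$ are my choices, assuming the same piecewise density):

```python
# Numeric check of E[X] = (theta + 1)/4 for the piecewise density,
# using a plain midpoint-rule integral (no external libraries).

def midpoint(f, a, b, n=100000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def expected_x(theta):
    # E[X] = int_0^{1/2} x dx + int_{1/2}^{theta} x/(2*theta - 1) dx
    return (midpoint(lambda x: x, 0.0, 0.5)
            + midpoint(lambda x: x / (2 * theta - 1), 0.5, theta))

theta = 2.0
print(expected_x(theta))   # numeric integral
print((theta + 1) / 4)     # closed form: 0.75
```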

The sample mean is $$ \frac{\sum_{i=1}^n X_i}{n} = \frac{0.2+1.2+1.4+0.3+0.9+0.7}{6} = \frac{4.7}{6}.$$

Equating the sample mean and the expected value, we get $$ \frac{4.7}{6} = \frac{\theta +1}{4} \quad\Rightarrow\quad \theta = \frac{4 \cdot 4.7}{6} - 1 \approx 2.133.$$

The method of moments estimate of $\theta$ is approximately $2.133$.
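The method of moments estimate can be reproduced in a few lines of Python (a sketch; the variable names are mine):

```python
# Method of moments: theta_hat = 4 * xbar - 1, from E[X] = (theta + 1)/4.

sample = [0.2, 1.2, 1.4, 0.3, 0.9, 0.7]
xbar = sum(sample) / len(sample)  # = 4.7 / 6
theta_mom = 4 * xbar - 1
print(round(theta_mom, 3))  # 2.133
```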

Another answer:

The reason you usually differentiate the log likelihood function is that it's typically the easiest way to do what you're really trying to do, which is to maximize the likelihood function. So, let's go back to the drawing board and just do what we really want to do -- find $\theta$ that maximizes the likelihood function -- but more directly. The likelihood function, evaluated at the sample data, is $$L(\theta) = \underbrace{1}_{0.2} \cdot \underbrace{\frac{1}{2 \theta - 1}}_{1.2} \cdot \underbrace{\frac{1}{2 \theta - 1}}_{1.4} \cdot \underbrace{1}_{0.3} \cdot \underbrace{\frac{1}{2 \theta - 1}}_{0.9} \cdot \underbrace{1}_{0.7} = \left[ \frac{1}{2 \theta - 1} \right]^3$$ if $\color{blue}{\text{$\theta$ is at least as big as $1.4$}}$. (Note that $L(\theta) = 0$ if $\theta < 1.4$.) So, the question is: for what value of $\theta$ is this as large as possible, subject to the $\color{blue}{\text{constraint}}$? (No derivative necessary -- just think, and pay attention to the constraint.)
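The "just think" step can be confirmed numerically. A quick sketch (the grid and step size are arbitrary choices of mine):

```python
# Check: over a grid of theta >= 1.4, the likelihood (2*theta - 1)**(-3)
# is strictly decreasing, so it is largest at the constraint boundary.

thetas = [1.4 + 0.01 * i for i in range(200)]
values = [(2 * t - 1) ** (-3) for t in thetas]
assert all(a > b for a, b in zip(values, values[1:]))  # strictly decreasing
print(thetas[values.index(max(values))])  # 1.4
```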

Another answer:

The maximum likelihood estimator is $\hat \theta=\max(X_1,X_2,\ldots,X_n)$ as @Aaron Montgomery has correctly shown in his answer.

As for the method of moments estimator, notice that $$E(X)=\int_0^{1/2} x\,dx+\int_{1/2}^\theta \frac{x}{2\theta-1}\,dx=\frac{\theta+1}{4}\ \ \Rightarrow\ \ \theta=4\,E(X)-1.$$ Therefore, we can use $\hat \theta=4 \bar X_n-1$ as the method of moments estimator, which gives $\hat\theta \approx 2.133$ for this sample.