Maximum Likelihood Estimate with different parameters


Suppose that X and Y are independent Poisson distributed values with means $\theta$ and $2\theta$, respectively. Consider the combined estimator of $\theta$ $$ \tilde{\theta} = k_1 X + k_2 Y $$ where $k_1$ and $k_2$ are arbitrary constants.

  1. Find the condition on $k_1$ and $k_2$ such that $\tilde{\theta}$ is an unbiased estimator of $\theta$.

  2. For $\tilde{\theta}$ unbiased, show that the variance of the estimator is minimized by taking $k_1 = 1/3$ and $k_2 = 1/3$.

  3. Given observations $x$ and $y$ find the maximum likelihood estimate of $\theta$ and hence show that $\tilde{\theta}$ is also the maximum likelihood estimator.

I have gotten (1) and (2) okay, but it's (3) I am having trouble with. I'd be fine if $X$ and $Y$ had the same parameter, but I'm struggling with $X$ and $Y$ having different parameters. Any help would be appreciated.

NOTE

For (1) I got $k_1 = 1 - 2k_2$.

For (2) I found the variance of $\tilde{\theta}$, substituted $k_2 = (1-k_1)/2$ from the unbiasedness condition, then differentiated and set the result equal to zero to minimize, which gives $$3k_1-1=0,$$ so $k_1 = 1/3$ (and hence $k_2 = 1/3$) minimises the variance.
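As a quick numerical sanity check of (2) (not part of the derivation), one can evaluate the variance over a grid of $k_1$ values; the value of $\theta$ here is illustrative, since the minimizing $k_1$ does not depend on it.

```python
import numpy as np

# Variance of the unbiased estimator as a function of k1, with k2 = (1 - k1)/2.
# theta is an arbitrary positive value; the minimizer is the same for any theta.
theta = 1.0
k1 = np.linspace(0.0, 1.0, 100001)
k2 = (1.0 - k1) / 2.0

# Var(k1 X + k2 Y) = k1^2 Var(X) + k2^2 Var(Y) = k1^2 theta + k2^2 (2 theta)
var = k1**2 * theta + 2.0 * k2**2 * theta

best = k1[np.argmin(var)]
print(best)  # close to 1/3
```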

Thank you.

2 Answers

BEST ANSWER

Write down the likelihood of observing $x$ and $y$. $$P(X=x, Y=y) = P(X=x) P(Y=y) = e^{-\theta} \frac{\theta^x}{x!} e^{-2\theta} \frac{(2\theta)^y}{y!}.$$ Choose $\theta$ to maximize this quantity; this is your maximum likelihood estimator.

By taking logarithms and ignoring constants, it is equivalent to choose $\theta$ maximizing $-3\theta + (x+y) \log \theta$. Setting the derivative to zero gives $-3 + (x+y)/\theta = 0$, so $\hat\theta = (x+y)/3$, which is the same estimator you found in (2).
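To see this numerically (a sketch only; the observations $x$ and $y$ below are made-up illustrative values), one can minimize the negative log-likelihood and compare with $(x+y)/3$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative observations.
x, y = 4, 7

# Negative log-likelihood, dropping terms that do not depend on theta:
# -log L(theta) = 3*theta - (x + y)*log(theta) + const.
def neg_log_lik(theta):
    return 3.0 * theta - (x + y) * np.log(theta)

res = minimize_scalar(neg_log_lik, bounds=(1e-9, 100.0), method="bounded")
print(res.x, (x + y) / 3)  # the numerical maximizer agrees with (x + y)/3
```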


The likelihood of $\theta$ is given by $$ L(\theta|x,y) = f_{X,Y}(x,y|\theta) = f_X(x|\theta) f_Y(y|\theta) = \frac{\theta^x}{x!} \exp(-\theta)\cdot \frac{(2\theta)^y}{y!} \exp(-2\theta)\propto \theta^{x+y}\cdot\exp(-3\theta) $$ Maximising this quantity is done by taking the logarithm and differentiating: $$ \frac{\partial}{\partial \theta} \log L(\theta|x,y) = \frac{x+y}{\theta} - 3 $$ Setting this equal to $0$ leads to the maximum likelihood estimator $\hat\theta = \frac{1}{3} x + \frac{1}{3} y$. Of course, you need to check that this indeed maximises the likelihood, which follows from the second derivative $-\frac{x+y}{\theta^2}$ being negative.
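A small simulation also confirms that $\hat\theta = (X+Y)/3$ is unbiased for $\theta$; the true parameter value and sample size below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.5       # illustrative true parameter
n = 200_000       # number of simulated (X, Y) pairs

X = rng.poisson(theta, size=n)       # X ~ Poisson(theta)
Y = rng.poisson(2 * theta, size=n)   # Y ~ Poisson(2*theta)

theta_hat = (X + Y) / 3.0  # the MLE / minimum-variance unbiased combination
print(theta_hat.mean())    # close to theta = 2.5
```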