Infinite Bias in a Maximum Likelihood Estimator


I'm having some problems calculating the bias of a ML estimator in the following problem:

Let $\mu$, $x$, $y$ be random variables such that:

$y|x$ is distributed as $\mathrm{Exp}(x)$, so that $p(y|x) = x\exp(-xy)$ for $y > 0$

$x|\mu$ is distributed as $\mathrm{Exp}(\mu)$, so that $p(x|\mu) = \mu\exp(-\mu x)$ for $x > 0$

In this case, I am interested in the ML estimator of $\mu$ given $y$, defined as:

$\hat{\mu}(y) = \underset{\mu}{\operatorname{argmax}}\; p(y|\mu)$

Using Bayes' Rule and some integration, I can find that

$\begin{align} p(y|\mu) &= \int_x p(y|x,\mu)p(x|\mu)\; \mathrm dx \\ &= \int_x p(y|x)p(x|\mu)\; \mathrm dx \\ &= \int_0^\infty x\mu \exp(-x(y+\mu))\; \mathrm dx \\ &= \frac{y}{(y+\mu)^2} \\ \end{align}$

Given that $p(y|\mu)$ achieves its maximum at $\mu = y$, the ML estimator should be $\hat{\mu}(y) = y$.

Unfortunately, however, things start to get weird when I try to calculate the bias in this estimator. Specifically:

$\begin{align}Bias(\hat{\mu}(y)) &= \mathbb{E}_y[\hat{\mu}(y) - \mu] \\ &= \int_y (\hat{\mu}(y) - \mu)p(y|\mu)\; \mathrm dy \\ &= \int_0^\infty (y - \mu)\frac{y}{(y+\mu)^2}\; \mathrm dy \\ &= \infty \\ \end{align}$
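To see the divergence concretely, I can integrate up to increasing truncation points; the truncated integral grows without bound because the integrand tends to $1$ as $y \to \infty$. (A quick numerical sketch with SciPy; the choice $\mu = 1$ is arbitrary.)

```python
# Truncated versions of the bias integral with the density y/(y+mu)^2:
# the value keeps growing (roughly linearly in T), indicating divergence.
from scipy.integrate import quad

mu = 1.0
integrand = lambda y: (y - mu) * y / (y + mu) ** 2

for T in (1e2, 1e4, 1e6):
    val, _ = quad(integrand, 0.0, T)
    print(f"integral over [0, {T:.0e}]: {val:.2f}")
```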

This does not really make sense to me. I've checked my work several times, so I'm wondering whether I've somehow missed something.


Best Answer

things start to get weird when I try to calculate the bias

Things got weird before that: notice that your $p(y|\mu)$ is not a valid density; it decays like $1/y$ for large $y$, so it cannot integrate to one.

The error is in the integration; the result should instead be

$$p(y|\mu) = \frac{\mu}{(y+\mu)^2} \hspace{1cm} (y \ge 0)$$
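A quick simulation supports this corrected density. It integrates to one, with CDF $F(y) = \int_0^y \mu/(t+\mu)^2\, \mathrm dt = y/(y+\mu)$, so we can sample from the hierarchical model and compare empirical and analytic CDFs. (A sketch with NumPy; $\mu = 2$ and the evaluation points are arbitrary choices.)

```python
# Monte Carlo check of the corrected marginal density:
# sample x ~ Exp(rate=mu), then y|x ~ Exp(rate=x), and compare the
# empirical CDF of y with the analytic CDF y/(y+mu) implied by
# p(y|mu) = mu/(y+mu)^2.
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0
n = 200_000

x = rng.exponential(scale=1.0 / mu, size=n)  # p(x|mu) = mu*exp(-mu*x)
y = rng.exponential(scale=1.0 / x)           # p(y|x) = x*exp(-x*y)

for t in (0.5, 2.0, 10.0):
    emp = np.mean(y <= t)
    ana = t / (t + mu)
    print(f"P(y <= {t}): empirical {emp:.4f}, analytic {ana:.4f}")
```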

Perhaps you can go on from here.

BTW: You've assumed that $p(y|x,\mu) = p(y|x)$, which may well be true, but it does not follow from the problem statement.