Maximum likelihood for $(\mu,\sigma)$ and other related questions


$$f(x)=\frac{1}{2\sigma}\exp\left(\frac{-|x-\mu|}{\sigma}\right)$$

$$\mu\in\mathbb{R},\quad\sigma>0$$

  1. When trying to calculate the maximum likelihood for $(\mu,\sigma)$, I got as far as: $\log L(\mu,\sigma)=-n \log(2\sigma)-\frac{\sum_i|x_i-\mu|}{\sigma}$, and I'm not really sure how I am supposed to calculate the derivative with respect to $\mu$ and with respect to $\sigma$.

  2. I also got stuck in the calculation of $E(X)$ and $E(X^2)$ and would really appreciate any assistance. I think I should recognize a PDF inside the integral, but since the limits do not span the entire range (because of the absolute value) I'm not sure how to do it.

Thanks!!

You need the value of $\mu$ that minimizes $\displaystyle g(\mu)=\sum_{i=1}^n |x_i-\mu|$.

I would not start by assuming that finding a derivative is the best way to do that; it is just one approach to consider.

Suppose $\mu>\text{some $x$ values}$ and $\mu<\text{some other $x$ values}$. Then $$ g(\mu)=\sum_{i=1}^n |x_i-\mu| = \left(\sum_{i\in\text{(one set)}} (x_i - \mu)\right)+\left( \sum_{i\in\text{(another set)}} (\mu-x_i) \right). $$ As $\mu$ gets bigger, the first of these sums gets smaller and the second gets bigger. So does the entire sum get bigger or smaller? The one that prevails is just the one that has more terms. If the first sum above has more terms than the second, $g(\mu)$ gets smaller as $\mu$ gets bigger; if the second one has more terms than the first, then $g(\mu)$ gets bigger as $\mu$ gets bigger. Imagine $\mu$ less than all the $x$s, and increasing until it's bigger than all the $x$s: $g(\mu)$ decreases until $\mu$ is bigger than just as many $x$s as it is smaller than, and increases after that.

In other words, $g(\mu)$ reaches its smallest value when $\mu=$ the sample median.

If there is an even number of $x$s, then the median is not unique: the middle two $x$s, and every number between them, are all medians. In that case, the MLE for $\mu$ is not unique. As $\mu$ moves from the smaller of the two middle values toward the bigger, the sum above does not change: the first term gets smaller at the same rate at which the second term gets bigger. If there is an odd number of $x$s, then the MLE for $\mu$ is unique.
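A quick numeric sketch of the argument above (pure Python, with a small made-up sample): evaluate $g(\mu)$ on a fine grid and confirm that its minimizer sits at the sample median.

```python
import statistics

x = [2.0, 7.0, 1.0, 4.0, 10.0]  # odd-sized sample, so the median is unique

def g(mu, xs):
    """Sum of absolute deviations of the sample from mu."""
    return sum(abs(xi - mu) for xi in xs)

median = statistics.median(x)

# Evaluate g on a fine grid spanning the data and find its minimizer.
grid = [min(x) + k * (max(x) - min(x)) / 10000 for k in range(10001)]
best = min(grid, key=lambda mu: g(mu, x))

print(median)                     # 4.0
print(abs(best - median) < 1e-2)  # True: the grid minimizer is at the median
```

With an even-sized sample you can check the flat stretch too: $g$ takes the same value at every point between the two middle order statistics.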

(Differentiating with respect to $\sigma$ is routine.)
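For completeness, here is that routine step. With $\hat\mu$ equal to the sample median, set the $\sigma$-derivative of the log-likelihood to zero:

$$ \frac{\partial}{\partial\sigma}\left( -n\log(2\sigma) - \frac{\sum_{i=1}^n |x_i-\hat\mu|}{\sigma} \right) = -\frac{n}{\sigma} + \frac{\sum_{i=1}^n |x_i-\hat\mu|}{\sigma^2} = 0 \quad\Longrightarrow\quad \hat\sigma = \frac{1}{n}\sum_{i=1}^n |x_i-\hat\mu|. $$

So the MLE of $\sigma$ is the mean absolute deviation from the sample median.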

$$ \int_{-\infty}^\infty x^2 \frac{1}{2\sigma}\exp\left( \frac{-|x-\mu|}{\sigma} \right) \, dx=\int_{-\infty}^\mu\cdots\cdots+\int_\mu^\infty\cdots\cdots. $$ In the first integral you have $x<\mu$ so $|x-\mu|=\mu-x$, and in the second $x>\mu$ so $|x-\mu|=x-\mu$. Then integrate by parts.

Also, some simplification might be achieved in this way: $$ \int_\mu^\infty x^2 h(x-\mu)\,dx = \int_0^\infty (u+\mu)^2 h(u)\,du $$ $$ = \int_0^\infty u^2 h(u)\,du + 2\mu\int_0^\infty u h(u)\,du + \mu^2\int_0^\infty h(u)\,du, $$ and the last integral should be $1/2$, by symmetry.
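If you carry the integration by parts through, you should land on $E(X)=\mu$ and $E(X^2)=\mu^2+2\sigma^2$. A numeric sanity check of those closed forms (pure Python, trapezoid rule on a wide truncated interval, with arbitrary example values of $\mu$ and $\sigma$):

```python
import math

mu, sigma = 1.5, 0.7  # arbitrary example parameters

def f(x):
    """Laplace density f(x) = exp(-|x - mu| / sigma) / (2 * sigma)."""
    return math.exp(-abs(x - mu) / sigma) / (2 * sigma)

def integrate(func, a, b, n=200000):
    """Composite trapezoid rule on [a, b]."""
    h = (b - a) / n
    s = 0.5 * (func(a) + func(b))
    for k in range(1, n):
        s += func(a + k * h)
    return s * h

# Tails beyond 40 standard-scale units contribute ~exp(-40), negligible here.
a, b = mu - 40 * sigma, mu + 40 * sigma
EX = integrate(lambda x: x * f(x), a, b)
EX2 = integrate(lambda x: x * x * f(x), a, b)

print(abs(EX - mu) < 1e-6)                       # True
print(abs(EX2 - (mu**2 + 2 * sigma**2)) < 1e-6)  # True
```

Note that $E(X)=\mu$ also follows immediately from the symmetry of the density about $\mu$, with no integration by parts needed.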