Method of moments estimator for lognormal distribution

Let $X_1,\cdots,X_n$ be independently and identically lognormally distributed. I want to find the method of moments estimators for $\mu$ and $\sigma^2$.

We know that $E[X] = e^{\mu+\frac{\sigma^2}{2}}$, $E[X^2] = e^{2\mu + 2\sigma^2}$. Then we have $\mu + \frac{\sigma^2}{2} = \log\left( \frac{\sum X_i}{n} \right)$. Hence, $\mu = \log(\sum X_i) - \log(n) - \frac{\sigma^2}{2}$.

Further, $e^{2\mu + 2\sigma^2} = \frac{\sum X_i^2}{n}$, which gives $2\mu + 2\sigma^2 = \log \left( \frac{\sum X_i^2}{n} \right)$, and after some algebraic manipulation we have $\mu = \frac{\log(\sum X_i^2)}{2} - \frac{\log(n)}{2} - \sigma^2$.

Equating these gives $\sigma^2_{MM} = \log(\sum{X_i^2}) - 2\log(\sum X_i) + \log(n)$, and in the exact same way $\mu_{MM} = -\frac{\log(\sum X_i^2)}{2} + 2\log(\sum X_i) - \frac{3}{2}\log (n)$.
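As a sanity check (a minimal NumPy sketch; the true parameter values and the sample size are arbitrary), these formulas do recover the parameters on simulated data:

```python
import numpy as np

# Simulate lognormal data with known parameters (arbitrary illustration values)
rng = np.random.default_rng(0)
mu_true, sigma2_true = 1.0, 0.25
x = rng.lognormal(mean=mu_true, sigma=np.sqrt(sigma2_true), size=200_000)
n = len(x)

# The estimators derived above, written with sums and logs
sigma2_mm = np.log(np.sum(x**2)) - 2*np.log(np.sum(x)) + np.log(n)
mu_mm = -0.5*np.log(np.sum(x**2)) + 2*np.log(np.sum(x)) - 1.5*np.log(n)

print(mu_mm, sigma2_mm)  # close to 1.0 and 0.25
```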

However the wikipedia page gives $\mu_{MM} = \log \left( \frac{E(X)^2}{\sqrt{\text{Var}(X) + E(X)^2}} \right), \sigma^2_{MM} = \log \left( \frac{\text{Var}(X)}{E(X)^2} + 1 \right)$.

Have I done it wrong, and if so, where is the mistake? Thanks :)

They are essentially the same:

From $E[X]=e^{\mu+\frac{\sigma^2}{2}}$, $E[X^2]=e^{2\mu+2\sigma^2}$

you can say $\log(E[X])=\mu+\frac{\sigma^2}{2}$ and $\log(E[X^2])=2\mu+2\sigma^2$

and so $\sigma^2=\log(E[X^2])-2\log(E[X])$ and $\mu=2\log(E[X])-\frac12 \log(E[X^2])$.

which is essentially what you wrote but with expectations rather than sums, i.e. with the $n$s inside the logarithms.

Combining the logarithms and using the variance rather than the raw second moment gives $\sigma^2=\log\left(\frac{E[X^2]}{(E[X])^2}\right)= \log\left(\frac{\text{Var}(X)+(E[X])^2}{(E[X])^2}\right) =\log\left(\frac{\text{Var}(X)}{(E[X])^2}+1\right)$ and $\mu=\log\left(\frac{(E[X])^2}{\sqrt{E[X^2]}}\right)=\log\left(\frac{(E[X])^2}{\sqrt{\text{Var}(X)+(E[X])^2}}\right)$

which is what Wikipedia has.
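A quick numerical check (a NumPy sketch with arbitrary parameters) confirms that the two forms coincide when the Wikipedia expressions are evaluated with sample moments, provided the population variance (ddof=0) is used so that $\text{Var}(X)+(E[X])^2$ equals the raw second sample moment:

```python
import numpy as np

rng = np.random.default_rng(1)
# Arbitrary illustration parameters
x = rng.lognormal(mean=0.5, sigma=0.8, size=100_000)
n = len(x)

# Sum-based form from the question
sigma2_sum = np.log(np.sum(x**2)) - 2*np.log(np.sum(x)) + np.log(n)
mu_sum = -0.5*np.log(np.sum(x**2)) + 2*np.log(np.sum(x)) - 1.5*np.log(n)

# Wikipedia form evaluated with sample moments; the population variance
# (ddof=0) makes v + m**2 equal the raw second sample moment
m, v = x.mean(), x.var()
sigma2_wiki = np.log(v/m**2 + 1)
mu_wiki = np.log(m**2 / np.sqrt(v + m**2))

print(abs(sigma2_sum - sigma2_wiki), abs(mu_sum - mu_wiki))  # both ~0
```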

A minor point of clarification: it is important to distinguish between an expectation and an estimate. A method of moments estimator should not be written in terms of expectations but rather in terms of the sample, because the estimator is a statistic. As such, I regard the expression you wrote as the more notationally proper one, because it makes clear that these are estimators of $\mu$ and $\sigma^2$.

So why does the Wikipedia article use expectations? Why does it refer to "arithmetic moments"? The reason is that if you knew these values, you could solve for the parameters; that is to say, that section of the article isn't specifically about the method of moments estimators, just about the relationship between the moments of the distribution and its parameters. It just so happens that if you replace the moments with the corresponding sample moments, you get the method of moments estimators for the parameters--because that is precisely how method of moments estimation works.
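To illustrate the distinction (a small stdlib-Python sketch; the parameter values are arbitrary): if the true moments were known, solving the moment equations would recover the parameters exactly, and the estimators arise by feeding in sample moments instead of these true values:

```python
import math

# Suppose the true moments were known; here we compute them from chosen
# parameters (mu = 0.3, sigma^2 = 0.49 -- arbitrary illustration values)
mu_true, sigma2_true = 0.3, 0.49
m1 = math.exp(mu_true + sigma2_true/2)      # E[X]
m2 = math.exp(2*mu_true + 2*sigma2_true)    # E[X^2]

# Solving the moment equations recovers the parameters exactly
sigma2 = math.log(m2) - 2*math.log(m1)
mu = 2*math.log(m1) - 0.5*math.log(m2)

print(mu, sigma2)  # 0.3 and 0.49 (up to floating-point error)
```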