Estimating a parameter using the maximum likelihood method and the method of moments


Let $X$ be a random variable that has a density function of the form

$f_X(x) = (p + 1) x^p 1_{[0, 1]}(x), x \in \mathbb{R}$

where $p > 0$ is an unknown parameter. I now want to make an educated "guess" for $p$, depending on the observation of $n$ independent copies $X_1, \dots, X_n$ of $X$.

First, I want to construct an estimator $\hat{p}_n$ for $p$ using the maximum likelihood method. Afterwards, I want to find an estimator using the method of moments and compare the consistency of the two approaches.

Now I started with the maximum likelihood method, but I'm not sure if what I did is very reasonable. I think I need to define the likelihood function $L(p)$ as

$f(x_1, \dots, x_n; p) = \prod_{i=1}^n f_{X_i} (x_i; p) =: L(p)$

and now try to maximize $L$ with respect to $p$.

I tried to differentiate $L(p)$, but that's where things got complicated. The expression

$L'(p) = \left(\prod_{i=1}^n f_{X_i}(x_i; p)\right)' = \left( \prod_{i=1}^n (p+1) x_i^p 1_{[0, 1]}(x_i) \right)' = \left( (p+1)^n x_1^p \dots x_n^p 1_{[0, 1]}(x_1) \dots 1_{[0, 1]}(x_n) \right)'$

seems very hard to evaluate explicitly. I let WolframAlpha calculate the derivative in the case $n = 5$, ignoring the indicator functions. Based on that result, I suspect (although I haven't verified it yet; it could probably be done by iteratively applying the product rule) that for any $n$, the derivative looks something like

$L'(p) = n (p+1)^{n-1} x_1^p \dots x_n^p + (p+1)^n x_1^p \dots x_n^p \log(x_1) + \dots + (p+1)^n x_1^p \dots x_n^p \log(x_n) $

(Although I'm not entirely sure where these $\log$'s come from in WolframAlpha's result.)
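One thing I can at least do is check the suspected formula numerically. Here is a small sketch (the values of $n$, $p$, and the sample are illustrative choices, and the indicator factors are ignored by drawing each $x_i$ from $(0, 1)$, as above) that compares the formula against a finite-difference derivative of $L$:

```python
import math
import random

# Illustrative values of n, p, and the sample; indicators are ignored
# by drawing each x_i from (0, 1).
random.seed(1)
n, p = 5, 2.0
xs = [random.random() for _ in range(n)]

def L(q):
    """Likelihood L(q) = prod_i (q+1) x_i^q, indicator factors dropped."""
    out = 1.0
    for x in xs:
        out *= (q + 1.0) * x ** q
    return out

# Suspected closed form:
# L'(p) = n (p+1)^(n-1) prod x_i^p + (p+1)^n prod x_i^p * sum log x_i
prod_xp = math.prod(x ** p for x in xs)
suspected = (n * (p + 1.0) ** (n - 1) * prod_xp
             + (p + 1.0) ** n * prod_xp * sum(math.log(x) for x in xs))

# Independent estimate of L'(p) via a central finite difference.
h = 1e-6
numeric = (L(p + h) - L(p - h)) / (2.0 * h)
print(suspected, numeric)  # the two agree to many digits
```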

Now if that result is true, then the expression can only become $0$ if $p = -1$, can it? (I'm not entirely sure what to do if one of the random variables is $\leq 0$, since $\log(x_i)$ isn't defined then.)

So would $p = -1$ be the result of a maximum likelihood estimation? The result seems questionable to me, though, since I ignored certain things when differentiating, like the indicator functions or the case that one of the $x_i$'s is equal to $0$. Or is there an easier way that I have missed? I have read that applying the logarithm sometimes makes things easier, but I don't really see at which point I could apply it here.

For the second part of the question, i.e. finding an estimator using the method of moments, I must admit that I can't really figure out how to get started.

Best answer:

In any case, if $$L(p)=\prod_{i=1}^n f_{X_i} (x_i; p)$$ then $$\log\left(L(p)\right)=\sum_{i=1}^n\log\left(f_{X_i} (x_i; p)\right)$$ and, differentiating with respect to $p$, $$\frac{L'(p)}{L(p)}=\sum_{i=1}^n \frac{f'_{X_i} (x_i; p)}{f_{X_i} (x_i; p)}$$ and since you want $L'(p)=0$, the right-hand side seems (at least to me) simpler to manipulate.
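As a quick numerical sketch of where this leads (illustrative choices of the true parameter, sample size, and seed; the sampling uses the inverse CDF $F(x)=x^{p+1}$ on $[0,1]$, and the moment equation uses $E[X]=\frac{p+1}{p+2}$, which follows directly from the density):

```python
import math
import random

# Compare the two estimators on simulated data.  The true parameter
# p_true, the sample size n, and the seed are illustrative choices.
random.seed(0)
p_true, n = 2.0, 10_000

# F(x) = x^(p+1) on [0, 1], so inverse-CDF sampling gives X = U^(1/(p+1)).
xs = [random.random() ** (1.0 / (p_true + 1.0)) for _ in range(n)]

# MLE: setting d/dp log L(p) = n/(p+1) + sum(log x_i) = 0
# gives p_hat = -n / sum(log x_i) - 1.
p_mle = -n / sum(math.log(x) for x in xs) - 1.0

# Method of moments: E[X] = (p+1)/(p+2); equating this to the sample
# mean m and solving for p gives p_hat = (2m - 1) / (1 - m).
m = sum(xs) / n
p_mom = (2.0 * m - 1.0) / (1.0 - m)

print(p_mle, p_mom)  # both should be close to p_true = 2
```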