Moment Estimate of theta


Consider a random variable $X$ whose pdf is $f(x;\theta)=\theta x^{\theta-1}$ for $0<x<1$ (with $\theta>0$) and zero otherwise.

i) Show that this is a density function.

ii) Determine the moment estimate of $\theta$ on the basis of a random sample $x_1,\ldots,x_n$.

For part ii), does this mean to use the method of moments to find the estimator?

BEST ANSWER

I think Andre Nicholas adequately answered your question regarding part (i) (that $f \ge 0$ and integrates to one). I will address the moment estimate:
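For completeness, the density check is a one-line integral (assuming $\theta > 0$, which the density implicitly requires; otherwise $f$ is negative or not integrable):

$$\int_0^1 \theta x^{\theta-1}\,dx = \Bigl[x^{\theta}\Bigr]_{x=0}^{x=1} = 1,$$

and $f(x;\theta) = \theta x^{\theta-1} \ge 0$ on $(0,1)$, so $f$ is a valid density.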

The idea of a moment estimate is to relate a moment to the desired parameter, then estimate the parameter by substituting the corresponding sample moment. Let's try the expected value of $X$:

$E[X] = \int\limits_{0}^1 x(\theta x^{\theta-1}) dx = \int\limits_{0}^1 \theta x^{\theta} dx = [\frac{\theta x^{\theta+1}}{\theta+1}]_{x=0}^{x=1} = \frac{\theta}{1+\theta}$

Therefore, you can estimate $\theta$ by setting $\frac{\hat\theta}{1+\hat\theta}= \bar x$ and solving: $\hat\theta = \frac{\bar x}{1-\bar x}$. That is the moment estimate. Contrast this with the maximum likelihood estimate from the same sample:
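As a quick numerical sanity check (my own sketch, not part of the original answer), note that the CDF is $F(x)=x^\theta$, so inverse-transform sampling gives $X = U^{1/\theta}$ for uniform $U$:

```python
import random

def sample(theta, n, seed=0):
    """Draw n values from f(x; theta) = theta * x**(theta - 1) on (0, 1).
    Inverse transform: the CDF is F(x) = x**theta, so X = U**(1/theta)."""
    rng = random.Random(seed)
    return [rng.random() ** (1.0 / theta) for _ in range(n)]

def mom_estimate(xs):
    """Method-of-moments estimate: solve theta/(1+theta) = mean(xs)."""
    xbar = sum(xs) / len(xs)
    return xbar / (1.0 - xbar)

xs = sample(theta=3.0, n=100_000)
print(mom_estimate(xs))  # close to the true theta = 3
```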

The likelihood $L$ of a sample of values $x=\{x_1,\ldots,x_N\}$ is $L(x;\theta)=\theta^N\prod\limits_{i=1}^N x_i^{\theta-1}$. Taking the logarithm, we get the log-likelihood function, which is easier to optimize:

$\mathcal{L}(x;\theta) = N\ln(\theta) + (\theta-1)\sum\limits_{i=1}^N \ln(x_i)$

Taking the derivative with respect to $\theta$ and setting it equal to $0$ gives $\frac{N}{\theta} + \sum\limits_{i=1}^N \ln(x_i) = 0$. Solving for $\theta$:

$\hat\theta_{mle} = \frac{-N}{\sum\limits_{i=1}^N \ln(x_i)},$

which will in general give different estimates from the method of moments.
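In code, the closed-form MLE is a one-liner (again my own sketch; the check uses the same inverse-transform sampling $X=U^{1/\theta}$, based on the CDF $F(x)=x^\theta$):

```python
import math
import random

def mle_estimate(xs):
    """Closed-form MLE: theta_hat = -N / sum(log(x_i))."""
    return -len(xs) / sum(math.log(x) for x in xs)

# Check against samples with a known theta, drawn by inverse transform
# (the CDF is F(x) = x**theta, so X = U**(1/theta)).
rng = random.Random(1)
theta = 3.0
xs = [rng.random() ** (1.0 / theta) for _ in range(100_000)]
print(mle_estimate(xs))  # close to 3
```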

Sometimes the MLE is the better (i.e., more efficient) estimator. I tested this by simulating from your density with $\theta = 3$ and examining the sampling distribution of each estimator for a sample size of 20. See below:

[Figure: histograms of the simulated sampling distributions of the MLE and the method-of-moments (MOM) estimators]

As you can see, the distributions are almost identical, so neither is really preferred here.
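For the curious, that experiment can be reconstructed roughly as follows (this is my own sketch, not the code that produced the figure; true $\theta = 3$, sample size $n = 20$):

```python
import math
import random

def simulate(theta=3.0, n=20, reps=5000, seed=42):
    """Sampling distributions of the MOM and MLE estimators of theta."""
    rng = random.Random(seed)
    mom, mle = [], []
    for _ in range(reps):
        xs = [rng.random() ** (1.0 / theta) for _ in range(n)]
        xbar = sum(xs) / n
        mom.append(xbar / (1.0 - xbar))              # solves theta/(1+theta) = xbar
        mle.append(-n / sum(math.log(x) for x in xs))
    return mom, mle

def sd(vals):
    """Population standard deviation."""
    m = sum(vals) / len(vals)
    return math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))

mom, mle = simulate()
print("MOM sd:", round(sd(mom), 3), "MLE sd:", round(sd(mle), 3))
```

The two standard deviations come out very close at this sample size, consistent with the "almost identical" distributions described above.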


For the second part, the method of moments simply indicates that we equate each population raw moment with the corresponding sample raw moment; that is to say, let $${\rm E}[X^k] = \frac{1}{n} \sum_{i=1}^n X_i^k,$$ for as many positive integers $k$ that will uniquely determine all the parameters of the distribution. In this case, there is only a single parameter, so using the $k = 1$ case, we want to find the value $\theta$ such that ${\rm E}[X] = \bar x$, the sample mean. This requires us to determine the expected value of $X$, which is straightforward: $${\rm E}[X] = \int_{x=0}^1 x f(x;\theta) \, dx.$$ I leave the remainder to you as an exercise.
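If you want to check your answer numerically without the closed form, here is a sketch (the names `pop_mean` and `mom_solve` are my own): approximate ${\rm E}[X]$ for a trial $\theta$ by the midpoint rule, then bisect on $\theta$ until it matches $\bar x$, which is valid because ${\rm E}[X]$ is strictly increasing in $\theta$ here.

```python
def pop_mean(theta, steps=10_000):
    """E[X] = integral_0^1 x * theta * x**(theta-1) dx = integral_0^1 theta * x**theta dx,
    approximated with the midpoint rule."""
    h = 1.0 / steps
    return sum(theta * ((i + 0.5) * h) ** theta * h for i in range(steps))

def mom_solve(xbar, lo=1e-6, hi=100.0, iters=60):
    """Bisect for the theta with pop_mean(theta) == xbar.
    Valid because E[X] is strictly increasing in theta for this density."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if pop_mean(mid) < xbar:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(mom_solve(0.75))  # the theta whose population mean is 0.75
```

Comparing `mom_solve(xbar)` against your closed-form answer for a few values of $\bar x$ is a good way to confirm the algebra.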