UMVUE for a discrete distribution


The problem is: let $X$ be a random variable with $P(X=x)=\left\{\begin{array}{cl}2p(1-p)&\mbox{if }x=-1\\p^x(1-p)^{3-x}&\mbox{if }x\in\{0,1,2,3\}\end{array}\right.$

Find, if they exist, a UMVUE for $p$ and a UMVUE for $p(1-p)$.
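As a sanity check that this really is a probability distribution, the stated masses can be verified to sum to $1$ identically in $p$ (a sketch using `sympy`; the variable names are my own):

```python
from sympy import symbols, simplify

p = symbols('p')

# pmf exactly as stated in the problem: an atom at -1 of mass 2p(1-p),
# plus p^x (1-p)^(3-x) on x = 0, 1, 2, 3
pmf = {-1: 2*p*(1 - p)}
for x in range(4):
    pmf[x] = p**x * (1 - p)**(3 - x)

# the masses should sum to 1 identically in p
total = simplify(sum(pmf.values()))
print(total)  # → 1
```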

My attempt was to find the maximum likelihood estimator of $p$, but this gets very messy: since $P(X=x)$ is defined piecewise, the likelihood $f(p,x_1)\cdots f(p,x_n)$ and its logarithm are both unwieldy.

What do you recommend?


To prove or disprove the existence of a UMVUE, you need a characterization of the best unbiased estimator. A necessary and sufficient condition for an unbiased estimator (with finite variance) to be the UMVUE is that it be uncorrelated with every unbiased estimator of zero.

Here is a very brief outline of tackling the problem:

Let $h(X)$ be an unbiased estimator of zero, i.e.

$$E\left[h(X)\right]=0 \quad, \forall\,p \tag{1}$$

This would give you a condition on the function $h$.
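To make this step concrete for the given pmf (a sketch of the computation): writing out $(1)$ and collecting powers of $p$,

$$E\left[h(X)\right]=2p(1-p)\,h(-1)+\sum_{x=0}^{3}h(x)\,p^x(1-p)^{3-x}$$

$$=h(0)+\bigl(2h(-1)-3h(0)+h(1)\bigr)p+\bigl(-2h(-1)+3h(0)-2h(1)+h(2)\bigr)p^2+\bigl(-h(0)+h(1)-h(2)+h(3)\bigr)p^3.$$

A polynomial vanishes for all $p\in(0,1)$ iff every coefficient vanishes, so $(1)$ forces $h(0)=h(3)=0$ and $h(1)=h(2)=-2h(-1)$: the unbiased estimators of zero form a one-parameter family.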

Then, using the characterization stated at the start, a statistic $T(X)$ is the UMVUE of its expectation iff

$$\operatorname{Cov}(T(X),h(X))=0 \iff E\left[T(X)h(X)\right]=0 \quad, \forall\,p \tag{2}$$

So $T(X)h(X)$ is also an unbiased estimator of zero, which again gives you a condition on the function $T\cdot h$. Compare $(1)$ and $(2)$ to see if you can get a condition on the function $T$.
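Concretely (a sketch): solving $(1)$ for this pmf gives $h(0)=h(3)=0$ and $h(1)=h(2)=-2h(-1)$, so the nontrivial constraints that $(2)$ places on $T\cdot h$ read

$$T(1)h(1)=T(2)h(2)=-2\,T(-1)h(-1)\quad\Longrightarrow\quad T(-1)=T(1)=T(2),$$

while $T(0)$ and $T(3)$ stay unconstrained, since the conditions involving them hold automatically from $h(0)=h(3)=0$.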

Find $E\left[T(X)\right]$ and use what you know about $T$ to see which functions of $p$ admit a UMVUE.
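Comparing $(1)$ and $(2)$ as suggested yields the constraint $T(-1)=T(1)=T(2)$, with $T(0)$ and $T(3)$ free, and hence $E[T(X)]=T(0)(1-p)^3+3T(1)\,p(1-p)+T(3)\,p^3$. The final coefficient matching can be automated (a sketch; `a`, `t`, `b` are my own parametrization of such a statistic):

```python
from sympy import symbols, Poly, simplify, linsolve

p, a, t, b = symbols('p a t b')

# pmf from the problem
pmf = {-1: 2*p*(1 - p), 0: (1 - p)**3, 1: p*(1 - p)**2,
       2: p**2*(1 - p), 3: p**3}

# a statistic obeying the constraint T(-1) = T(1) = T(2) = t,
# with T(0) = a and T(3) = b free
T = {-1: t, 0: a, 1: t, 2: t, 3: b}

ET = simplify(sum(T[x]*pmf[x] for x in pmf))

# which (a, t, b) make T unbiased for p(1-p)?  Match coefficients in p.
sol_pq = linsolve(Poly(ET - p*(1 - p), p).all_coeffs(), (a, t, b))
print(sol_pq)  # one solution: a = b = 0, t = 1/3

# and for p itself?  The linear system is inconsistent: no solution
sol_p = linsolve(Poly(ET - p, p).all_coeffs(), (a, t, b))
print(sol_p)
```

So the statistic with $T(0)=T(3)=0$ and $T\equiv 1/3$ on $\{-1,1,2\}$ is unbiased for $p(1-p)$ and satisfies the constraint, hence is the UMVUE of $p(1-p)$; for $p$, unbiased estimators exist but none satisfies the constraint, which is the obstruction to a UMVUE for $p$.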