Sequentially defined probability density function and its UMVUE.


Let $f_{0}(x)$, $-\infty<x<\infty$, be a probability density function (p.d.f.) and let $$ F_{0}(x)=\int_{-\infty}^{x}f_{0}(t)\, dt $$ be the corresponding distribution function.

Let $X_{1},\dots,X_{n}$ be a random sample from the distribution function $F(x;\theta)=\left(F_{0}(x)\right)^{\theta}$, where $\theta>0$.

I want to find the UMVUE of $1/\theta$.
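As a sanity check on the model, here is a quick Monte Carlo sketch. The choice $F_0 = $ standard logistic CDF is an arbitrary concrete assumption (the question leaves $F_0$ general); inverse-transform sampling gives draws from $F(x;\theta)=F_0(x)^\theta$, whose empirical CDF should match $F_0(x)^\theta$:

```python
import random, math

random.seed(0)
theta = 2.5  # illustrative parameter value (an assumption)

# Assume F0 is the standard logistic CDF -- an arbitrary concrete choice.
def F0(x):
    return 1.0 / (1.0 + math.exp(-x))

def F0_inv(u):
    return math.log(u / (1.0 - u))

# Inverse transform: F(X;theta) = U  <=>  F0(X)^theta = U  <=>  X = F0^{-1}(U^{1/theta})
def sample(n):
    return [F0_inv(random.random() ** (1.0 / theta)) for _ in range(n)]

xs = sample(200_000)

# Empirical CDF should agree with F0(x)^theta at a few test points.
for x in (-1.0, 0.0, 1.0):
    emp = sum(xi <= x for xi in xs) / len(xs)
    print(f"x={x:+.1f}  empirical={emp:.4f}  F0(x)^theta={F0(x) ** theta:.4f}")
```

The same inverse-transform trick works for any continuous $F_0$ with an available quantile function.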

1 Answer

Consider $n=1$. We want an unbiased function of the complete sufficient statistic $\log F_0(X)$, or equivalently of $F_0(X)$: $$ E_\theta \left[ h(F_0(X)) \right] = 1/\theta. $$ This translates to $$ \theta^{-1} = \int h(F_0(x))\, d\left[F_0(x)^\theta\right] = \int_0^1 h(t)\, d t^\theta = h(1) - \int_0^1 t^\theta h'(t)\, dt, $$ where the last step is integration by parts. It seems like Beta integrals might be helpful.

From the integral above, $E\, h(F_0(X)) = \int_0^1 \theta t^{\theta-1} h(t)\, dt$, we see that $F_0(X)$ has density $\theta t^{\theta-1}\, 1\{ t \in (0,1)\}$: a Beta distribution with parameters $\theta$ and $1$.
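The distribution of $F_0(X)$ does not depend on $F_0$: since $F_0(X)^\theta = F(X;\theta) \sim \mathrm{Uniform}(0,1)$, we have $F_0(X) = U^{1/\theta}$ for uniform $U$. A short check that this matches the Beta$(\theta,1)$ moments $E[F_0(X)^k] = \theta/(\theta+k)$ (the parameter value below is illustrative):

```python
import random

random.seed(1)
theta = 2.5  # illustrative parameter value (an assumption)

# F0(X) = U^{1/theta} for U ~ Uniform(0,1), regardless of the choice of F0.
ts = [random.random() ** (1.0 / theta) for _ in range(300_000)]

# Beta(theta, 1) has k-th moment theta / (theta + k).
for k in (1, 2):
    mc = sum(t ** k for t in ts) / len(ts)
    print(f"k={k}: Monte Carlo={mc:.4f}  theta/(theta+k)={theta / (theta + k):.4f}")
```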

Guessing $h(t) = \sum_i a_i t^{\alpha_i} (1-t)^{\beta_i-1}$, we have $E\, h = \sum_i a_i\, \theta\, B(\theta + \alpha_i, \beta_i)$. Something like this might work (no guarantees).
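A numerical check of the Beta-integral formula above, plus one explicit unbiased choice that the answer does not state: $h(t) = -\log t$ works exactly, since $\int_0^1 \theta t^{\theta-1}(-\log t)\, dt = 1/\theta$ (equivalently, $-\log F_0(X) \sim \mathrm{Exp}(\theta)$). The parameter and exponent values below are illustrative:

```python
import math, random

random.seed(2)
theta = 2.5  # illustrative parameter value (an assumption)
ts = [random.random() ** (1.0 / theta) for _ in range(300_000)]  # F0(X) ~ Beta(theta, 1)

def beta_fn(a, b):
    # Beta function B(a, b) computed stably via log-gammas.
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

# E[ t^alpha (1-t)^(beta-1) ] should equal theta * B(theta + alpha, beta).
alpha, beta = 1.0, 2.0  # illustrative exponents
mc = sum(t ** alpha * (1.0 - t) ** (beta - 1.0) for t in ts) / len(ts)
print(f"single Beta term: Monte Carlo={mc:.4f}  theta*B={theta * beta_fn(theta + alpha, beta):.4f}")

# Explicit unbiased choice h(t) = -log t: its expectation is 1/theta.
mc_log = sum(-math.log(t) for t in ts) / len(ts)
print(f"h(t) = -log t: Monte Carlo={mc_log:.4f}  1/theta={1.0 / theta:.4f}")
```

For general $n$, the same fact suggests averaging: $-\frac{1}{n}\sum_i \log F_0(X_i)$ is unbiased for $1/\theta$ and is a function of the complete sufficient statistic.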