Finding the minimum-variance unbiased estimator of a rare distribution (namely $\frac{\log 3}{3^\theta -1} 3^x$)


I am trying to solve the following non-standard exercise:

If $(X_1,\ldots,X_n)$ is a simple random sample of a population with density function

$$f_\theta(x)=\frac{\log 3}{3^\theta -1} 3^x \ \ \ \ ,\ \ 0<x<\theta$$

Find the UMVUE for $3^\theta$

I have tried some standard estimators, but showing that they are unbiased, or that a statistic is complete, is quite hard with this distribution. I am convinced that the Lehmann–Scheffé theorem is the right tool here, but I am clearly missing something.
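Before the answer, note that the exercise lends itself to simulation: the CDF $F_\theta(x) = \frac{3^x-1}{3^\theta-1}$ inverts in closed form, giving a simple inverse-transform sampler. A minimal Python sketch (the choices $\theta=2$ and the sample size are arbitrary), which also checks the sample mean against the analytic mean $E[X] = \frac{\theta\,3^\theta}{3^\theta-1} - \frac{1}{\log 3}$ obtained by integration by parts:

```python
import math
import random

theta = 2.0
c = 3.0 ** theta - 1.0
rng = random.Random(0)

# Inverse-transform sampling: F(x) = (3^x - 1)/(3^theta - 1) on (0, theta),
# so F^{-1}(u) = log_3(1 + u * (3^theta - 1)).
xs = [math.log(1.0 + rng.random() * c, 3) for _ in range(200_000)]

# Analytic mean via integration by parts:
# E[X] = theta * 3^theta / (3^theta - 1) - 1/log(3)
mean_true = theta * 3.0 ** theta / c - 1.0 / math.log(3.0)
mean_hat = sum(xs) / len(xs)
print(mean_hat, mean_true)
```

This sampler is reused below to sanity-check the pieces of the derivation.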

Best answer

Noting that $X \in (0,\theta)$, it is natural to guess that $X_{(n)} = \max\{X_1,...,X_n\}$ is a complete sufficient statistic.

Working under that framework: since $f_\theta(x) = \frac{3^x\log3}{3^\theta - 1}I(x\in [0,\theta])$, the CDF is $F_\theta(x) = \frac{3^x - 1}{3^\theta - 1}$ on $[0,\theta]$, and so the density of the maximum is $f_{X_{(n)}}(x) = nF_\theta(x)^{n-1}f_\theta(x) = \frac{n(3^{x}-1)^{n-1}\,3^{x}\log 3}{(3^\theta -1)^n}I(x\in [0,\theta])$.
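A quick numerical sanity check of this density (a Python sketch, not part of the proof; the constants $\theta = 1.5$, $n = 5$ are arbitrary): the claimed density $\frac{n(3^x-1)^{n-1}3^x\log 3}{(3^\theta-1)^n}$ should integrate to $1$ over $(0,\theta)$, and the empirical CDF of simulated maxima should match $F_\theta(x)^n$.

```python
import math
import random

theta, n = 1.5, 5
c = 3.0 ** theta - 1.0

def pdf_max(x):
    """Claimed density of X_(n): n*(3^x - 1)^(n-1) * 3^x * log(3) / (3^theta - 1)^n."""
    return n * (3.0 ** x - 1.0) ** (n - 1) * 3.0 ** x * math.log(3.0) / c ** n

# (1) The density should integrate to 1 over (0, theta) -- midpoint rule.
m = 100_000
h = theta / m
total = sum(pdf_max((k + 0.5) * h) for k in range(m)) * h
print(total)

# (2) The empirical CDF of simulated maxima should match
#     F(x)^n = ((3^x - 1)/(3^theta - 1))^n.  Inverse-CDF sampling of each X_i:
rng = random.Random(1)
x0, reps = 1.0, 50_000
maxima = [max(math.log(1.0 + rng.random() * c, 3) for _ in range(n))
          for _ in range(reps)]
emp = sum(v <= x0 for v in maxima) / reps
thy = ((3.0 ** x0 - 1.0) / c) ** n
print(emp, thy)
```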

We can show that $X_{(n)}$ is sufficient by using the Fisher–Neyman factorization theorem on the likelihood. The joint likelihood of the sample is:

\begin{equation} \prod_{i=1}^n\frac{3^{x_i}\log3}{3^\theta - 1}I(x_i\in [0,\theta]) = \prod_{i=1}^n3^{x_i}\log3\prod_{i=1}^n\frac{I(x_i\in [0,\theta])}{3^\theta - 1} \end{equation}

Now observe that $\prod_{i=1}^nI(x_i\in [0,\theta]) = I(x_{(n)}\in [0,\theta])$, so the joint likelihood has the form:

\begin{equation} \prod_{i=1}^n3^{x_i}\log3\frac{I(x_{(n)}\in [0,\theta])}{(3^\theta - 1)^n} = h(x)g(x_{(n)},\theta) \end{equation}

This satisfies the requirements of the Fisher–Neyman factorization theorem, so $X_{(n)}$ is sufficient.
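One way to see the factorization at work numerically (an illustrative Python sketch, not part of the proof): since $L(\theta; x) = h(x)\,g(x_{(n)},\theta)$, the likelihood *ratio* between two values of $\theta$ depends on the sample only through its maximum, because $h(x)$ cancels. Two samples with the same maximum (the values below are arbitrary) give the same ratio.

```python
import math

def likelihood(xs, theta):
    """Joint likelihood under f_theta; zero if any observation exceeds theta."""
    if max(xs) > theta or min(xs) < 0:
        return 0.0
    c = 3.0 ** theta - 1.0
    return math.prod(3.0 ** x * math.log(3.0) / c for x in xs)

# Two samples sharing the same maximum but differing elsewhere:
a = [0.3, 0.7, 1.4]
b = [0.9, 0.1, 1.4]

# Factorization L = h(x) * g(x_(n), theta) means h(x) cancels in the ratio,
# so the ratio depends on the data only through the maximum.
r_a = likelihood(a, 2.0) / likelihood(a, 1.6)
r_b = likelihood(b, 2.0) / likelihood(b, 1.6)
print(r_a, r_b)
```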

To show that $X_{(n)}$ is complete, note that for any function $f$ with $E_\theta[|f(X_{(n)})|] < \infty$ we have $E_\theta[f(X_{(n)})] = \int_{0}^\theta f(x)\frac{n(3^{x}-1)^{n-1}3^{x}\log 3}{(3^\theta -1)^n}\,dx$. Suppose this equals $0$ for every $\theta > 0$. Multiplying through by $(3^\theta -1)^n$ and then differentiating with respect to $\theta$ gives:

\begin{equation} \begin{split} \int_{0}^\theta f(x)\,n(3^{x}-1)^{n-1}3^{x}\log 3\,dx& = 0 \quad \text{for all } \theta > 0\\ \Rightarrow f(\theta)\,n(3^{\theta}-1)^{n-1}3^{\theta}\log 3& = 0 \quad \text{for all } \theta > 0 \end{split} \end{equation}

Because $3^\theta > 0$ and $(3^\theta - 1)^{n-1} > 0$ for all $\theta > 0$, this forces $f(\theta) = 0$ for every $\theta > 0$, i.e. $f \equiv 0$ on the support. Hence $f(X_{(n)}) = 0$ almost surely, and $X_{(n)}$ is complete.

Knowing that the sample maximum is a complete sufficient statistic, we can apply the Lehmann–Scheffé theorem: any unbiased estimator of $g(\theta) = 3^\theta$ that is a function of $X_{(n)}$ is the UMVUE. Writing out the unbiasedness condition $E_\theta[\delta(X_{(n)})] = 3^\theta$ and clearing the constant factors yields the following Volterra integral equation for $\delta$:

\begin{equation} \frac{3^\theta (3^\theta -1)^n}{n\log 3} = \int_0^\theta\delta(x)(3^{x}-1)^{n-1}3^{x}\,dx \end{equation}

(the indicator has been absorbed into the limits of integration, and the constant $n\log 3$ from the density of $X_{(n)}$ has been moved to the left-hand side).

Taking the derivative with respect to $\theta$ of both sides gives us:

\begin{equation} \frac{1}{n}\left[3^\theta(3^\theta-1)^n + n\,3^{2\theta}(3^\theta-1)^{n-1}\right] = \delta(\theta)(3^{\theta}-1)^{n-1}3^{\theta} \end{equation}

Dividing both sides by $3^\theta(3^\theta-1)^{n-1}$ gives $\delta(\theta) = \frac{3^\theta-1}{n}+3^\theta$.

Evaluating this expression at $\theta = X_{(n)}$ yields the UMVUE: $\delta(X_{(n)}) = \frac{3^{X_{(n)}}-1}{n}+3^{X_{(n)}}$.
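Finally, a Monte Carlo sanity check of unbiasedness (a Python sketch; the choices $\theta = 1$, $n = 6$ are arbitrary): one can verify directly, via the substitution $u = 3^x - 1$, that $\delta(X_{(n)}) = 3^{X_{(n)}} + \frac{3^{X_{(n)}}-1}{n}$ has expectation exactly $3^\theta$, and simulation agrees — averaging over many replications lands very close to $3^\theta = 3$.

```python
import math
import random

def umvue(sample):
    """delta(X_(n)) = 3^{X_(n)} + (3^{X_(n)} - 1)/n, candidate UMVUE of 3^theta."""
    t = 3.0 ** max(sample)
    return t + (t - 1.0) / len(sample)

theta, n, reps = 1.0, 6, 200_000
c = 3.0 ** theta - 1.0
rng = random.Random(7)

# Each X_i is drawn by inverse transform: F^{-1}(u) = log_3(1 + u*(3^theta - 1)).
total = 0.0
for _ in range(reps):
    total += umvue([math.log(1.0 + rng.random() * c, 3) for _ in range(n)])
print(total / reps, 3.0 ** theta)
```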