Is the estimator $\frac{\overline X}{1-\overline X}$ of $\theta$ consistent?


Let $X_1,X_2,\ldots,X_n$ be a random sample from a population having the probability density function $$ f(x;\theta) = \begin{cases} \theta x^{\theta -1} & \text{if $0 \le x\le$ 1} \\0 &\text{otherwise} \end{cases}$$

Is $\hat\theta =\dfrac{\overline X}{1-\overline X}$ a consistent estimator of $\theta$?

I am trying to find $E(X)$ here to see if it equals $\theta$ asymptotically, but I am not sure how to compute the expectation; I am not familiar with finding the limiting behavior of a ratio like this.


Best answer:

I am trying to find $E(X) $ here to see if it is equal to $\theta$ asymptotically

It's not $E(X)$ that should be equal to $\theta$ asymptotically, it's $\hat{\theta}$.

Let's find $E(X)$ as you suggest: $$E(X) = \int_0^1 xf(x;\theta) \, dx = \int_0^1 \theta x^{\theta} \, dx = \frac{\theta}{\theta+1} $$

Now let's see how $\hat{\theta}$ behaves asymptotically. Since $\bar X \to E(X)$ by the law of large numbers, and $t \mapsto t/(1-t)$ is continuous at $E(X) < 1$, $$\hat{\theta} = \dfrac{\bar X}{1-\bar X} \rightarrow \frac{E(X)}{1-E(X)} = \frac{\frac{\theta}{\theta+1} }{1-\frac{\theta}{\theta+1} } = \frac{\theta}{\theta+1} \cdot \frac{\theta+1}{1} = \theta. $$ So, what can you say about $\hat{\theta}$?
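The limit above is easy to check numerically (not part of the original answer; a quick sketch assuming NumPy is available). Since the CDF is $F(x)=x^{\theta}$ on $[0,1]$, inverse-CDF sampling gives $X = U^{1/\theta}$ for uniform $U$, and $\hat\theta$ should settle near the true $\theta$ as $n$ grows:

```python
import numpy as np

def theta_hat(theta, n, rng):
    # F(x) = x**theta on [0, 1], so inverse-CDF sampling gives X = U**(1/theta).
    x = rng.random(n) ** (1.0 / theta)
    xbar = x.mean()
    # The estimator from the question: theta_hat = xbar / (1 - xbar).
    return xbar / (1.0 - xbar)

rng = np.random.default_rng(0)
theta = 2.0
for n in (100, 10_000, 1_000_000):
    print(n, theta_hat(theta, n, rng))
```

For $\theta = 2$, the printed estimates should drift toward 2 as $n$ increases, as the continuous-mapping argument predicts.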

Another answer:

I suppose the answer is no. Perhaps we can proceed as follows.

Denote \begin{align} S_n&=\frac{1}{n}\sum_{j=1}^nX_j,\\ T_n&=\frac{S_n}{1+S_n}. \end{align} Let us use the strong law of large numbers (SLLN) and the dominated convergence theorem (DCT) to investigate consistency.

For one thing, each $X_j\in\left[0,1\right]$ implies that $S_n\in\left[0,1\right]$. Consequently, $T_n\in\left[0,1/2\right]$, which implies that $\left|T_n\right|\le 1/2$ for all $n\in\mathbb{N}$. This upper bound $1/2$ is obviously integrable under the probability measure, i.e., $\mathbb{E}(1/2)=1/2<\infty$.

For another, by the SLLN, $$ \lim_{n\to\infty}S_n=\mathbb{E}X_1=\frac{\theta}{1+\theta}\quad\text{almost surely}, $$ which thus leads to $$ \lim_{n\to\infty}T_n=\frac{\lim_{n\to\infty}S_n}{1+\lim_{n\to\infty}S_n}=\frac{\theta}{1+2\theta}. $$

Combining the above two facts, the DCT applies and gives $$ \lim_{n\to\infty}\mathbb{E}T_n=\mathbb{E}\left(\lim_{n\to\infty}T_n\right)=\frac{\theta}{1+2\theta}\ne\theta. $$ (Indeed, the almost-sure convergence of $T_n$ to $\theta/(1+2\theta)\ne\theta$ already rules out consistency; the DCT step additionally shows $T_n$ is not even asymptotically unbiased.) Therefore, $T_n$ is not a consistent estimator of $\theta$.
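As a numerical sanity check (not part of the proof; Python with NumPy, variable names purely illustrative), the limit $\theta/(1+2\theta)$ can be reproduced by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0
n = 1_000_000

# Sample from f(x; theta) = theta * x**(theta - 1) via the inverse CDF.
x = rng.random(n) ** (1.0 / theta)
s_n = x.mean()           # S_n, close to E[X] = theta / (1 + theta)
t_n = s_n / (1.0 + s_n)  # T_n = S_n / (1 + S_n)

print(t_n, theta / (1 + 2 * theta))  # both close to 0.4 when theta = 2
```

With $\theta = 2$ the simulated $T_n$ clusters around $2/5$, not around $2$, matching the SLLN computation above.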

However, I suppose there might be a typo in $T_n$. In fact, if we consider another estimator $$ U_n=\frac{S_n}{1-S_n}, $$ the answer would be yes.

The outline of the proof is essentially the same as above, although some additional tricks are necessary. This is because the DCT would fail: $U_n$ is no longer bounded from above (more precisely, it is not trivial to exhibit some $V$ such that $\mathbb{E}V<\infty$ and $\left|U_n\right|\le V$).

To help facilitate the proof, define $$ U_{mn}=U_n\cdot 1_{\left\{U_n\le m\right\}}. $$ Note that $U_n\ge 0$ is guaranteed because $S_n\in\left[0,1\right]$. Besides, note that $U_{mn}$ is monotone in $m$, i.e., $$ U_{mn}\le U_{m+1,n}. $$ Thanks to these two facts, the monotone convergence theorem (MCT) applies. It gives $$ \mathbb{E}\left(\lim_{m\to\infty}U_{mn}\right)=\lim_{m\to\infty}\mathbb{E}U_{mn}. $$ Consequently, $$ \lim_{n\to\infty}\mathbb{E}U_n=\lim_{n\to\infty}\mathbb{E}\left(\lim_{m\to\infty}U_{mn}\right)=\lim_{n\to\infty}\lim_{m\to\infty}\mathbb{E}U_{mn}. $$ Further, for each fixed $m$, $\mathbb{E}U_{mn}$ converges as $n\to\infty$ by the DCT (the truncated variable is bounded by $m$), and for each fixed $n$, $\mathbb{E}U_{mn}$ converges uniformly as $m\to\infty$ due to the uniform cutoff. These facts suggest that $$ \lim_{n\to\infty}\lim_{m\to\infty}\mathbb{E}U_{mn}=\lim_{m\to\infty}\lim_{n\to\infty}\mathbb{E}U_{mn}. $$

Thanks to all these arguments, we may safely conclude that $$ \lim_{n\to\infty}\mathbb{E}U_n=\lim_{m\to\infty}\lim_{n\to\infty}\mathbb{E}U_{mn}=\lim_{m\to\infty}\mathbb{E}\left(\lim_{n\to\infty}U_{mn}\right)=\cdots=\theta. $$
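To complement the argument (again a numerical sketch assuming NumPy, not part of the proof), one can estimate $\mathbb{E}U_n$ by Monte Carlo, averaging $U_n = S_n/(1-S_n)$ over many independent replicates of the sample:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 500, 2000

u = np.empty(reps)
for i in range(reps):
    # Each replicate: n draws from f(x; theta), then U_n = S_n / (1 - S_n).
    x = rng.random(n) ** (1.0 / theta)
    s = x.mean()
    u[i] = s / (1.0 - s)

print(u.mean())  # should be close to theta = 2 for moderately large n
```

The Monte Carlo average of $U_n$ lands near the true $\theta$, consistent with $\lim_{n\to\infty}\mathbb{E}U_n=\theta$ above.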