Consistency of a Bayes Estimator


Let $X_1, \dots, X_n$ be a sample from $\mathrm{Exp}(\theta)$, $\theta \sim \Gamma(\alpha, \beta), \; \alpha, \beta > 0$. It is easy to prove that $\theta | X \sim \Gamma(\alpha, \beta + n \overline{X})$, so the Bayes estimator for $\theta$ is $\hat{\theta} = \mathbb{E}[\theta | X] = \frac{\alpha}{\beta + n\overline{X}}$. But is this estimator consistent? I think the answer is no, but I'm not sure about my argument.

Convergence in probability of $\hat{\theta}$ to $\theta$ implies convergence in distribution. But $n\overline{X} \geq 0$, so $0 \leq \hat{\theta} \leq \frac{\alpha}{\beta}$ for all $n \in \mathbb{N}$, hence $F_{\hat{\theta}}(\frac{\alpha}{\beta}) = 1$ for every $n$, while $F_{\theta}(\frac{\alpha}{\beta}) < 1$ since the Gamma prior puts mass above its mean. So $\hat{\theta}$ cannot converge to $\theta$ in probability. Is this correct?
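A quick Monte Carlo sketch (with made-up values $\alpha = 2$, $\beta = 3$, chosen only for illustration) shows the boundedness phenomenon: the estimator never exceeds $\alpha/\beta$, while draws of $\theta$ from the prior regularly do.

```python
import numpy as np

# Hypothetical prior parameters for illustration: alpha=2, beta=3.
# Draw theta from the Gamma(alpha, beta) prior (rate parameterization),
# simulate Exp(theta) samples, and compare the support of the estimator
# alpha/(beta + n*xbar) with that of theta.
rng = np.random.default_rng(42)
alpha, beta, n, reps = 2.0, 3.0, 1000, 2000

theta = rng.gamma(shape=alpha, scale=1 / beta, size=reps)        # rate beta -> scale 1/beta
xbar = rng.exponential(scale=1 / theta, size=(n, reps)).mean(axis=0)  # Exp(theta) has mean 1/theta
theta_hat = alpha / (beta + n * xbar)

print((theta_hat <= alpha / beta).all())  # the estimator is bounded by alpha/beta
print((theta > alpha / beta).mean())      # but the prior puts positive mass above alpha/beta
```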

Thank you in advance




In one sense the statement is true, and in another sense it is not.

The estimate $\frac{\alpha}{\beta+n \bar{X}}$ is not consistent. Indeed, conditionally on $\theta$, $\bar{X}$ converges a.s. to $\mathbf{E}[X_1 \mid \theta] = \frac{1}{\theta}$, and hence $$\frac{\alpha}{\beta+n \bar{X}} = \frac{\frac{\alpha}n}{\frac{\beta}n + \bar{X}} \to \frac{0}{0+\frac{1}{\theta}} = 0 \neq \theta$$ almost surely.

But the Bayes estimator for $\theta$ is $\frac{\alpha + n}{\beta+n \bar{X}}$, and it is consistent. Indeed, $$p(\theta \mid X) \propto \theta^n e^{-\theta \sum X_i} \cdot \frac{\beta^{\alpha}\theta^{\alpha-1}e^{-\beta \theta}}{\Gamma(\alpha)} \propto \theta^{n+\alpha-1} e^{-(\beta + \sum X_i)\theta},$$ so $\theta \mid X \sim \Gamma(n+\alpha, \beta+\sum X_i)$ and $$\hat{\theta} = \frac{n+\alpha}{\beta+\sum X_i} = \frac{\frac{\alpha}n + 1}{\frac{\beta}n + \bar{X}},$$ so $\hat{\theta}$ converges a.s. to $\frac{0+1}{0+ \frac{1}{\theta}} = \theta$.
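A small simulation (with hypothetical values $\theta = 1.5$, $\alpha = 2$, $\beta = 3$) illustrates both limits: the question's estimator shrinks to $0$, while the posterior mean approaches the true rate.

```python
import numpy as np

# Hypothetical values for illustration: true rate theta=1.5, prior alpha=2, beta=3.
# Compare alpha/(beta + n*xbar), which tends to 0, with the posterior mean
# (n+alpha)/(beta + n*xbar), which tends to theta.
rng = np.random.default_rng(0)
alpha, beta, theta = 2.0, 3.0, 1.5

for n in (10, 1_000, 100_000):
    s = rng.exponential(scale=1 / theta, size=n).sum()  # Exp(theta) has mean 1/theta
    print(n, alpha / (beta + s), (n + alpha) / (beta + s))
```

As `n` grows, the second column collapses toward 0 and the third settles near 1.5.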


Your reasoning is fine; however, I think you have made a mistake in the derivation of the Bayes estimator. In particular, I find that $\theta\mid\mathsf{X}\sim\Gamma(n+\alpha, n\overline{\mathsf{X}}+\beta)$, leading to the estimator $\widehat{\theta}=\frac{n+\alpha}{n\overline{\mathsf{X}}+\beta}$, which is indeed consistent: it is asymptotically equivalent to $\frac{1}{\overline{\mathsf{X}}}$, which converges in probability to $\theta$ (see e.g. this question regarding the law of large numbers for conditional expectations).