Is this Bayes estimator result correct?


I am trying to see where I went wrong in this calculation of the Bayes estimator, or whether there is a hole in my understanding. We have the discrete density $$f(x|\theta)=\frac{\theta^{x}e^{-\theta}}{x!},\qquad x=0,1,2,\dots,$$ i.e. a Poisson distribution, and a random sample $\textbf{X}=(X_1,\dots,X_n)$ from it. The prior $q(\theta)=e^{-\theta}$, $\theta>0$, is used; this is the Exponential(1) density, so it is a proper prior.

I calculated the posterior density as $$\frac{(n+1)^{T_n+1}\theta^{T_n}e^{-\theta(n+1)}}{\Gamma (T_n+1)},$$ where $T_n=\sum_{i=1}^n X_i$, and so obtained the Bayes estimator (the posterior mean under squared-error loss) $$\hat{\theta}_B=\frac{T_n+1}{n+1}.$$

However, I am meant to get that $E[\hat{\theta}_B]=1$, but since $E[T_n]=n\theta$, my Bayes estimator gives $E[\hat{\theta}_B]=\frac{n\theta+1}{n+1}$. I am not sure whether my result is wrong or whether I am missing an assumption that $\theta=1$ here, which would make the expected value equal $1$.
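As a quick sanity check on the algebra, a Monte Carlo simulation can estimate $E[\hat{\theta}_B]$ and compare it with $\frac{n\theta+1}{n+1}$. This is only a sketch (standard-library Python; the values $\theta=2$, $n=10$ are arbitrary illustrative choices, not from the problem):

```python
import math
import random

def poisson(rng, lam):
    """Draw one Poisson(lam) variate via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def bayes_estimate(sample, n):
    """Posterior-mean estimator (T_n + 1) / (n + 1) under the Exp(1) prior."""
    return (sum(sample) + 1) / (n + 1)

rng = random.Random(0)
theta, n, reps = 2.0, 10, 100_000   # illustrative values, not from the problem
estimates = [bayes_estimate([poisson(rng, theta) for _ in range(n)], n)
             for _ in range(reps)]
mc_mean = sum(estimates) / reps
predicted = (n * theta + 1) / (n + 1)   # the claimed E[theta_hat_B]
print(mc_mean, predicted)               # the two agree to about two decimals
```

With $\theta=2$ and $n=10$ the predicted value is $21/11\approx 1.909$, not $1$, which matches the observation that $E[\hat{\theta}_B]=1$ only when $\theta=1$.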

BEST ANSWER

There are some small errors. Moreover, $\frac{n\theta+1}{n+1}$ cannot itself be the estimator: an estimator of $\theta$ cannot depend on $\theta$. It is the sampling expectation $E[\hat{\theta}_B]$, which equals $1$ only when $\theta=1$.

The posterior is

$q(\theta|\mathbf{x}) \propto e^{-(n+1)\theta}\theta^{(\sum_i x_i+1)-1}$

So you will immediately recognize a Gamma distribution:

$\theta|\mathbf{x}\sim \operatorname{Gamma}\left(\textstyle\sum_i x_i+1,\; n+1\right)$, with shape $\sum_i x_i+1$ and rate $n+1$.

So one possible Bayes estimator (not the only one) is the posterior mean, which minimizes the posterior expected squared-error loss:

$\hat{\theta}_{MMSE}=\mathbb{E}[\theta|\mathbf{x}]=\frac{\sum_i x_i+1}{n+1}$

So your estimator $\frac{T_n+1}{n+1}$ is correct.
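The Gamma posterior mean can also be verified numerically, by integrating the unnormalized posterior kernel $\theta^{T_n}e^{-(n+1)\theta}$ on a grid and checking the ratio $\int\theta\,q\,d\theta\,/\int q\,d\theta$ against $\frac{T_n+1}{n+1}$. A minimal sketch, assuming made-up values $T_n=15$, $n=10$:

```python
import math

n, T = 10, 15                      # hypothetical sample size and total sum(x_i)

def posterior_kernel(theta):
    # unnormalized posterior: theta^T * exp(-(n+1)*theta)
    return theta ** T * math.exp(-(n + 1) * theta)

# trapezoidal integration on a fine grid; the kernel is negligible past theta = 10
step = 1e-3
grid = [i * step for i in range(int(10 / step) + 1)]
weights = [posterior_kernel(t) for t in grid]
# the common factor `step` cancels in the ratio, so it is omitted below
norm = sum(weights) - 0.5 * (weights[0] + weights[-1])
mean = (sum(t * w for t, w in zip(grid, weights))
        - 0.5 * (grid[0] * weights[0] + grid[-1] * weights[-1])) / norm
print(mean, (T + 1) / (n + 1))     # both are approximately 1.4545
```

The grid-based mean matches the closed form $(T_n+1)/(n+1)=16/11$, confirming the Gamma$(T_n+1,\,n+1)$ identification.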

Note: MMSE = minimum mean squared error.