Bayesian Statistics: Estimators and Posterior Probability


Let $M \sim \Gamma(\alpha,\beta)$ (where $\alpha, \beta$ are known), and let $X_1,\ldots,X_n$ be discrete random variables such that, given $\theta$, the $X_i$ are i.i.d. Poisson with parameter $\theta$, where $\theta$ is a realization of $M$.

I have two questions...

  1. How do I compute the posterior probability for $θ$?
  2. How can I then compute the Bayesian estimators of $θ$ for the quadratic loss?

Here is the solution I came up with so far...

$Y\sim\Gamma(\alpha,\beta)$ if $f_Y(\theta)=\frac{1}{\Gamma(\alpha)\beta^{\alpha}}\theta^{\alpha-1}e^{-\theta/\beta}$ for $\theta\in(0,\infty)$

$Z\sim P(\theta)$: $P_Z(z)=e^{-\theta}\frac{\theta^z}{z!}$, $z=0,1,2,\ldots$

$f_{Y,X_1,\ldots,X_n}(\theta,x_1,\ldots,x_n)=f_Y(\theta)\cdot(???)$ (by Bayes' rule)

$=\frac{1}{\Gamma(\alpha)\beta^{\alpha}}\theta^{\alpha-1}e^{-\theta/\beta}\cdot(???)$, where $\theta\in(0,\infty)$

The posterior should be $\Gamma(\alpha',\beta')$ with parameters $\alpha'=\alpha+\sum_{i=1}^{n}x_i$, and $\beta'=\ ???$

I have no idea where to go from here.


1 Answer


It seems that your parameter is $\theta\sim\Gamma(\alpha,\beta)$ and that, given $\theta$, the observations $X_1,X_2,\ldots,X_n$ are i.i.d. $P(\theta)$.

So the prior density $\pi(\theta)$ is the $\Gamma(\alpha,\beta)$ density, and the joint p.m.f. of the sample given $\theta$ is $P(x_1,x_2,\ldots,x_n\mid\theta)=\prod_{i=1}^{n}e^{-\theta}\frac{\theta^{x_i}}{x_i!}$.

Therefore, the posterior density is $f(\theta\mid x_1,x_2,\ldots,x_n)=\frac{\pi(\theta)P(x_1,x_2,\ldots,x_n \mid \theta)}{\int_0^\infty \pi(\theta)P(x_1,x_2,\ldots,x_n \mid \theta)\,d\theta}$, where $\theta\in (0,\infty)$.
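Carrying this through for the Gamma prior and Poisson likelihood above shows the Gamma family is conjugate to the Poisson; a sketch of the algebra, using the scale parameterization of the question:

$$
\begin{aligned}
f(\theta \mid x_1,\ldots,x_n)
&\propto \pi(\theta)\,P(x_1,\ldots,x_n \mid \theta)\\
&\propto \theta^{\alpha-1}e^{-\theta/\beta}\prod_{i=1}^{n} e^{-\theta}\theta^{x_i}\\
&= \theta^{\alpha+\sum_{i=1}^{n}x_i-1}\,e^{-\theta\left(n+\frac{1}{\beta}\right)},
\end{aligned}
$$

which is the kernel of a $\Gamma(\alpha',\beta')$ density with $\alpha'=\alpha+\sum_{i=1}^{n}x_i$ and $\beta'=\frac{\beta}{n\beta+1}$.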

The Bayes estimator of $\theta$ with respect to quadratic loss is the posterior mean, that is, $E(\theta\mid x_1,x_2,\ldots,x_n)=\int_0^\infty \theta\, f(\theta\mid x_1,x_2,\ldots,x_n)\,d\theta$.
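As a quick numerical sanity check (a sketch in Python; the values of $\alpha$, $\beta$ and the data are made up for illustration and do not come from the question), one can compare the conjugate closed-form posterior mean $(\alpha+\sum_i x_i)\,\beta/(n\beta+1)$ against direct numerical integration of the posterior-mean integral above:

```python
import math

# Hypothetical example values (not from the original post)
alpha, beta = 2.0, 1.5          # prior Gamma(alpha, beta), scale parameterization
data = [3, 1, 4, 1, 5]          # observed Poisson counts x_1, ..., x_n
n, s = len(data), sum(data)

def unnorm_posterior(theta):
    """Prior density times Poisson likelihood, dropping constants in the x_i!."""
    if theta <= 0:
        return 0.0
    log_p = (alpha - 1) * math.log(theta) - theta / beta   # log prior (up to a constant)
    log_p += s * math.log(theta) - n * theta               # log likelihood (up to a constant)
    return math.exp(log_p)

# Posterior mean by a Riemann sum on a fine grid; the grid width cancels in the ratio
grid = [i * 0.001 for i in range(1, 30001)]
weights = [unnorm_posterior(t) for t in grid]
mean_numeric = sum(t * w for t, w in zip(grid, weights)) / sum(weights)

# Closed form from conjugacy: Gamma(alpha + s, beta / (n*beta + 1)), mean = shape * scale
mean_closed = (alpha + s) * beta / (n * beta + 1)

print(mean_numeric, mean_closed)  # the two values should agree closely
```

The integration is done on the log scale inside `unnorm_posterior` to avoid overflow for larger samples; any quadrature scheme would do, since only the ratio of the two integrals matters.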