I have a problem where I'm not sure how to proceed.
I need to calculate $\Pr(X=0 \text{ and } Y=0)$ using the following information:
- The conditional distributions $f(x|\theta)$ and $f(y|\theta)$ are both Poisson($\theta$).
- The marginal distributions are $f(x) = \frac{\beta^\alpha \Gamma(x+\alpha) }{(\beta+1)^{x+\alpha} \Gamma(\alpha)\Gamma(x+1)}$ and $f(y) = \frac{\beta^\alpha \Gamma(y+\alpha) }{(\beta+1)^{y+\alpha} \Gamma(\alpha)\Gamma(y+1)}$
- The prior distribution is $\pi(\theta) \sim \text{Gamma}(\alpha, \beta)$, with $\beta$ the rate parameter.
- X and Y are independent given $\theta$ (but not unconditionally)
I've tried to apply the formula $P(X=0, Y=0)=P(X=0\mid Y=0)\,P(Y=0)$, but I have no idea how to calculate the $P(X=0\mid Y=0)$ term.
Since we have the conditional independence of $X$ and $Y$ given $\theta$, we can write $$P(X=0, Y=0) = E[P(X=0,Y=0 | \theta)] = E[P(X=0|\theta)P(Y=0|\theta)] = E[e^{-2\theta}]$$
Then, since $\theta \sim \text{Gamma}(\alpha, \beta)$, the quantity $E[e^{-2\theta}]$ is just the Laplace transform of the Gamma distribution evaluated at $2$ (equivalently, its moment generating function at $-2$):
$$E[e^{-2\theta}] = \int_0^\infty e^{-2\theta}\, \frac{\beta^\alpha}{\Gamma(\alpha)}\, \theta^{\alpha-1} e^{-\beta\theta}\, d\theta = \left(\frac{\beta}{\beta+2}\right)^\alpha = \left(1+\frac{2}{\beta}\right)^{-\alpha}.$$
As a consistency check, the same computation for a single variable gives $P(X=0) = (1+\frac{1}{\beta})^{-\alpha} = \frac{\beta^\alpha}{(\beta+1)^\alpha}$, which agrees with $f(0)$ from the marginal you stated.