Joint distribution and conditional distribution


Let

\begin{align} Y&\sim \text{gamma}(\alpha, \lambda)\\ X|Y=y&\sim \text{Poisson}(y). \end{align}

Find the conditional distribution $Y|X=x$.

How should I approach this? I'm confused because the gamma distribution is a continuous distribution while the Poisson distribution is a discrete distribution. Can the problem still be solved?

I can find the joint distribution of $(X,Y)$, but I don't know how to get the marginal distribution of $X$ from it.

Thanks for any help.

There are two answers below.

Answer 1.

By Bayes' rule,

$$P(Y|X) = \frac{P(X|Y)P(Y)}{P(X)}=\frac{P(X|Y)P(Y)}{\int_Y P(X|Y)P(Y)\,\mathrm{d}Y}.$$

Since you're given both $P(X|Y)$ and $P(Y)$ in the problem, all you need to do is compute the integral in the denominator of the expression above,

$$P(X=x)=\int_0^\infty\frac{y^xe^{-y}}{x!}\cdot\frac{1}{\Gamma(\alpha)\lambda^\alpha}y^{\alpha-1}e^{-\frac{y}{\lambda}}\,\mathrm{d}y.$$

(I'm assuming that your gamma distribution is shape-scale parametrized).

This integral can in fact be evaluated in closed form (it is a standard gamma integral), but if you want to avoid the computation, you can simply use the fact that the gamma distribution is the conjugate prior for the Poisson likelihood, so the posterior is again a gamma distribution.
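As a quick sanity check (not part of the original answer), the denominator integral can be evaluated numerically with SciPy and compared against its closed form. The parameter values below are arbitrary, and the gamma distribution is shape-scale parametrized to match this answer's convention:

```python
import math
import numpy as np
from scipy import integrate, stats

# Arbitrary example values; gamma is shape-scale parametrized here.
alpha, lam, x = 2.0, 3.0, 5

# Integrand: Poisson(y) pmf evaluated at x, times the Gamma(alpha, scale=lam) pdf.
def integrand(y):
    return stats.poisson.pmf(x, y) * stats.gamma.pdf(y, a=alpha, scale=lam)

# Numerical value of the marginal P(X = x).
marginal, _ = integrate.quad(integrand, 0, np.inf)

# The same gamma integral in closed form:
# Gamma(x + alpha) / (x! Gamma(alpha)) * lam^x / (lam + 1)^(x + alpha).
closed_form = (math.gamma(x + alpha) / (math.factorial(x) * math.gamma(alpha))
               * lam**x / (lam + 1)**(x + alpha))

print(marginal, closed_form)  # the two values agree
```

Matching the numeric quadrature against the closed form confirms that the "intractable-looking" denominator is really just a gamma integral.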

Answer 2.

I will assume that $\lambda$ is the rate parameter, so that

$$ Y \sim \operatorname{Gamma}(\alpha,\lambda) \qquad\Leftrightarrow\qquad f_Y(y) = \frac{\lambda^{\alpha}y^{\alpha-1}e^{-\lambda y}}{\Gamma(\alpha)} \mathbf{1}_{\{y \geq 0\}}. $$

Step 1. We begin by making a general observation. (You can skip directly to the boxed equation $\text{(1)}$ if you are already aware of this formula.)

Let $X$ have a discrete distribution and $Y$ a continuous distribution. Then, applying the law of iterated expectations,

$$ \begin{aligned} \mathbb{P}(Y \leq y, X = x) &= \mathbb{E}\big[\mathbf{1}_{\{X = x\}} \mathbf{1}_{\{Y \leq y\}}\big] = \mathbb{E}\big[ \mathbb{P}(X = x \mid Y) \mathbf{1}_{\{Y \leq y\}}\big] \\ &= \int_{-\infty}^{y} \mathbb{P}(X = x \mid Y = s) f_Y(s) \, \mathrm{d}s, \end{aligned} $$

where $f_Y$ is the PDF of $Y$. Using this, we compute

$$ \mathbb{P}(Y \leq y \mid X = x) = \frac{\mathbb{P}(Y \leq y, X = x)}{\mathbb{P}(X = x)} = \frac{\int_{-\infty}^{y} \mathbb{P}(X = x \mid Y = s) f_Y(s) \, \mathrm{d}s}{\int_{-\infty}^{+\infty} \mathbb{P}(X = x \mid Y = s) f_Y(s) \, \mathrm{d}s}. $$

Differentiating both sides with respect to $y$, we get the conditional PDF of $Y$ given $X$ as

$$ f_{Y|X=x}(y) = \frac{\mathbb{P}(X = x \mid Y = y) f_Y(y)}{\int_{-\infty}^{+\infty} \mathbb{P}(X = x \mid Y = s) f_Y(s) \, \mathrm{d}s}. \tag{1}$$

Step 2. Now we return to OP's problem. First, notice that for any $x \in \{0,1,2,\cdots\}$, we get

\begin{align*} \int_{-\infty}^{+\infty} \mathbb{P}(X = x \mid Y = s) f_Y(s) \, \mathrm{d}s &= \int_{0}^{\infty} \frac{s^x e^{-s}}{x!} \frac{\lambda^{\alpha}s^{\alpha-1}e^{-\lambda s}}{\Gamma(\alpha)}\, \mathrm{d}s \\ &= \frac{\Gamma(x+\alpha)}{x!\Gamma(\alpha)} \cdot \frac{\lambda^{\alpha}}{(\lambda+1)^{x+\alpha}}. \end{align*}

Plugging all of the known information into $\text{(1)}$, for any $x \in \{0,1,2,\cdots\}$ we get

$$ f_{Y|X=x}(y) = \frac{\bigg(\dfrac{y^x e^{-y}}{x!} \cdot \dfrac{\lambda^{\alpha}y^{\alpha-1}e^{-\lambda y}}{\Gamma(\alpha)}\mathbf{1}_{\{y \geq 0\}} \bigg) }{ \bigg( \dfrac{\Gamma(x+\alpha)}{x!\Gamma(\alpha)} \cdot \dfrac{\lambda^{\alpha}}{(\lambda+1)^{x+\alpha}} \bigg) } = \frac{(\lambda+1)^{x+\alpha} y^{x+\alpha-1}e^{-(\lambda+1) y}}{\Gamma(x+\alpha)}\mathbf{1}_{\{y \geq 0\}}. $$

Therefore $Y$ given $X=x$ is distributed as $\operatorname{Gamma}(x+\alpha,\lambda+1)$, again with the rate parametrization.
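As a numeric check of this conclusion (not part of the original answer), one can normalize the product $\mathbb{P}(X = x \mid Y = y)\,f_Y(y)$ by quadrature and compare it pointwise against the claimed $\operatorname{Gamma}(x+\alpha,\lambda+1)$ density. The parameter values are arbitrary, and the gamma is rate-parametrized to match this answer's convention:

```python
import numpy as np
from scipy import integrate, stats

# Arbitrary example values; gamma is rate-parametrized here (scale = 1/rate).
alpha, lam, x = 2.5, 1.5, 4

# Unnormalized posterior: Poisson(y) pmf at x times the Gamma(alpha, rate=lam) pdf.
def unnormalized(y):
    return stats.poisson.pmf(x, y) * stats.gamma.pdf(y, a=alpha, scale=1.0 / lam)

# Normalizing constant = marginal P(X = x), computed numerically.
Z, _ = integrate.quad(unnormalized, 0, np.inf)

# Claimed posterior: Gamma(x + alpha, rate = lam + 1).
def posterior(y):
    return stats.gamma.pdf(y, a=x + alpha, scale=1.0 / (lam + 1))

# The normalized product should match the claimed posterior pointwise.
ys = np.linspace(0.1, 10.0, 50)
assert np.allclose([unnormalized(y) / Z for y in ys], [posterior(y) for y in ys])
```

The pointwise agreement between the normalized product and the $\operatorname{Gamma}(x+\alpha,\lambda+1)$ pdf is exactly the conjugacy statement derived above.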