Conditional probability for mixed distributions


I am trying to solve a problem involving $Y = X_{1} + \cdots + X_{M}$ and $M$, where the $X_{i}$ are i.i.d. and all independent of $M$. For simplicity, let's assume that $X_{i} \sim \mathrm{Gamma}(k,\theta)$ and $M \sim \mathrm{Poisson}(\lambda)$. I want to calculate $P(Y > \gamma)$. Here is how I was thinking about solving this problem: $$P(Y > \gamma \mid M) = \frac{P(Y > \gamma, M)}{P(M)} = \frac{P(Y > \gamma)P(M)}{P(M)} = P(Y > \gamma) = 1 - F_{\mathrm{Gamma}}(\gamma),$$ where $F_{\mathrm{Gamma}}$ is the Gamma CDF. For some reason I don't feel like I'm doing it right. Any ideas on how I can calculate $P(Y > \gamma)$, or reference texts (nothing too heavy, as I'm still trying to grasp this)?



On BEST ANSWER

$Y \mid M=m$ is the sum of $m$ i.i.d. Gammas with parameters $(k,\theta)$, so $$ Y \mid M=m \sim \text{Gamma}(mk, \theta). $$ From here, we use the law of total probability to write: \begin{align*} P(Y>y) &= \sum_{m=0}^\infty P(Y > y \mid M=m) P(M=m)\\ &= \sum_{m=1}^\infty P(\text{Gamma}(mk, \theta) > y) \frac{e^{-\lambda} \lambda^m}{m!}\\ &= \sum_{m=1}^\infty \left( \int_y^\infty \frac{1}{\Gamma(mk)\,\theta^{mk}} x^{mk-1} e^{-x/\theta}\,dx \right) \frac{e^{-\lambda} \lambda^m}{m!}, \end{align*} where the $m=0$ term drops out because $Y=0$ when $M=0$, so $P(Y > y \mid M=0) = 0$ for $y \ge 0$.
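The series converges quickly because the Poisson pmf decays rapidly, so it is easy to evaluate numerically. A minimal sketch (the parameter values $k=2$, $\theta=1.5$, $\lambda=3$, the threshold $y=10$, and the truncation point `m_max` are illustrative assumptions, not from the question), cross-checked against a Monte Carlo simulation:

```python
import numpy as np
from scipy.stats import gamma, poisson

def tail_prob(y, k, theta, lam, m_max=200):
    """P(Y > y) via the truncated series; the m = 0 term is 0 since Y = 0 when M = 0."""
    return sum(
        gamma.sf(y, a=m * k, scale=theta) * poisson.pmf(m, lam)
        for m in range(1, m_max + 1)
    )

# Monte Carlo cross-check: draw M, then sum M i.i.d. Gamma(k, theta) variates.
rng = np.random.default_rng(0)
k, theta, lam, y = 2.0, 1.5, 3.0, 10.0
M = rng.poisson(lam, size=50_000)
Y = np.array([rng.gamma(k, theta, size=m).sum() for m in M])  # empty sum is 0.0

print(tail_prob(y, k, theta, lam))  # series value
print((Y > y).mean())               # empirical tail; should agree closely
```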

I'm not really sure that this can be simplified further. Note that you can compute other properties in closed form, for example

$$ \mathbb{E} Y = \mathbb{E}(\mathbb{E}[Y \mid M]) = \mathbb{E}[kM\theta] = k\theta\, \mathbb{E}[M] = k\theta \lambda. $$ See also this post, which derives the MGF.
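The identity $\mathbb{E}Y = k\theta\lambda$ is easy to sanity-check by simulation. The sketch below (same illustrative parameter choices as assumptions) draws $Y$ in one shot using the fact that a sum of $m$ i.i.d. $\mathrm{Gamma}(k,\theta)$ variables is $\mathrm{Gamma}(mk,\theta)$:

```python
import numpy as np

rng = np.random.default_rng(1)
k, theta, lam = 2.0, 1.5, 3.0  # illustrative parameter choices
n = 100_000

M = rng.poisson(lam, size=n)
# Sum of m i.i.d. Gamma(k, theta) is Gamma(m*k, theta); use a dummy shape of k
# where m = 0, then zero those draws out, since Y = 0 when M = 0.
Y = np.where(M > 0, rng.gamma(np.maximum(M, 1) * k, theta), 0.0)

print(Y.mean())          # empirical E[Y]
print(k * theta * lam)   # theoretical E[Y] = k * theta * lambda
```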