What is the distribution of $P_M(M_B(t))$


$M_X(t)= P_M(M_B(t))$

$P_M(s)= (1-q+qs)^2 $

$M_B(t)= \frac{\beta}{\beta-t}$

where $P_M(s)$ is the probability generating function of $M$ and $M_B(t)$ is the moment generating function of $B$.

I identified $M \sim \text{Binomial}(2,q)$ and $B \sim \text{Exp}(\beta)$,

but I don't know what the distribution of $X$ is.

The solution says $X \sim \text{BinomComp}(2,q,F_B)$, but I didn't understand what this distribution is.

I know what a mixture distribution is, and I know conditional distributions, but I've never seen something like this. Any explanation is very appreciated. Thank you!

Best answer:

You are asking which distribution corresponds to $M_X(t)$. This is difficult to see unmotivated, but it is in fact known that the composition of the probability generating function of an $\mathbb{N}_0$-valued random variable with an MGF (or CF) is the MGF of a random sum of independent random variables. To see this, suppose

$$X = \sum_{i=1}^M B_i $$

where the $B_i$ are iid random variables distributed the same way $B$ is, and $M$ is an independent $\mathbb{N}_0$-valued random variable (which you identified as $\text{Binomial}(2,q)$). Thus $X$ is a random sum; to find the MGF of $X$ we see that,

$$M_X(t) = \mathbb{E} [e^{tX}] = \sum_{n \in \mathbb{N}_0} \mathbb{E}[e^{tX} \ | \ M = n] \cdot \mathbb{P}[M = n] $$

where the second equality comes from the law of total expectation. Note that $X$ conditional on $M = n$ is just a sum of $n$ independent copies of $B$,

$$X \ | \ M = n \sim \sum_{i=1}^n B_i $$

Therefore we have,

$$\mathbb{E}[e^{tX} \ | \ M = n] = \mathbb{E}\left[e^{t\sum_{i=1}^n B_i} \right] = \mathbb{E} \left[\prod_{i=1}^n e^{tB_i} \right] = (\mathbb{E}\, e^{tB})^n$$

where the last equality uses the independence of the $B_i$.

Note there is a subtle issue here when $n = 0$: this corresponds to the degenerate case of the random sum where $X = 0$, and so the conditional MGF is identically $1$. Putting these facts together we obtain,

$$M_X(t) =\sum_{n \in \mathbb{N}_0} \mathbb{E}[e^{tX} \ | \ M = n] \cdot \mathbb{P}[M = n] = \sum_{n \in \mathbb{N}_0} (\mathbb{E} e^{tB})^n \cdot \mathbb{P}[M = n] = \mathbb{E}[(\mathbb{E} e^{tB})^M] = P_M(M_B(t)) $$

Therefore we've proven that the distribution of $X$ is simply as described,

$$X = \sum_{i=1}^M B_i $$
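As a sanity check, this identity can be verified by Monte Carlo simulation. A minimal sketch (the parameter values $q = 0.3$, $\beta = 2$, $t = 0.5$ are arbitrary choices, not from the problem, and `numpy` is assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
q, beta, t, n_sim = 0.3, 2.0, 0.5, 400_000  # arbitrary values with t < beta

# Simulate X = sum_{i=1}^M B_i with M ~ Binomial(2, q), B_i ~ Exp(beta)
M = rng.binomial(2, q, size=n_sim)
B = rng.exponential(scale=1.0 / beta, size=(n_sim, 2))  # numpy uses scale = 1/rate
X = np.where(M >= 1, B[:, 0], 0.0) + np.where(M == 2, B[:, 1], 0.0)

# Theoretical MGF via the composition P_M(M_B(t)) vs the empirical average of e^{tX}
mgf_theory = (1 - q + q * beta / (beta - t)) ** 2
mgf_empirical = np.exp(t * X).mean()
print(mgf_theory, mgf_empirical)  # the two agree to roughly two decimal places
```

Since $M \leq 2$ here, it suffices to draw two exponentials per trial and keep the first $M$ of them.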

To find this distribution explicitly, we find the CDF of $X$ through the law of total probability, where for $x \geq 0$,

$$\mathbb{P}[X \leq x] = \sum_{n=0}^2 \mathbb{P}[X \leq x \ | \ M = n] \cdot \mathbb{P}[M = n]$$

Expanding this out we have,

$$\mathbb{P}[X \leq x] = (1-q)^2+ 2q(1-q) \mathbb{P}[B_1 \leq x] + q^2 \mathbb{P}[B_1 + B_2 \leq x]$$

$B_1 \sim \text{Exp}(\beta)$, and since $B_1, B_2$ are independent exponentials, their sum is $B_1 + B_2 \sim \text{Gamma}(2, \beta)$ distributed (shape-rate parameterisation). Note also that the constant term $(1-q)^2$ is a point mass at $0$, so $X$ is a mixed distribution; differentiating the continuous part for $x > 0$ gives,

$$f_X(x) = 2q(1-q) \cdot \beta e^{-\beta x} + q^2 \cdot \beta^2 x e^{-\beta x} $$

Finally,

$$f_X(x) = \beta e^{-\beta x} (2q(1-q) + \beta q^2 x) $$
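The mixed CDF (a point mass of $(1-q)^2$ at zero plus the continuous part above) can likewise be checked against simulation. A sketch with the same arbitrary parameter values as before; the $\text{Gamma}(2,\beta)$ CDF $1 - e^{-\beta x}(1+\beta x)$ is hard-coded to avoid a scipy dependency:

```python
import numpy as np

rng = np.random.default_rng(2)
q, beta, n_sim = 0.3, 2.0, 400_000  # arbitrary illustrative values

# Simulate X = sum_{i=1}^M B_i as before
M = rng.binomial(2, q, size=n_sim)
B = rng.exponential(scale=1.0 / beta, size=(n_sim, 2))
X = np.where(M >= 1, B[:, 0], 0.0) + np.where(M == 2, B[:, 1], 0.0)

def cdf_X(x):
    # Mixture CDF: atom (1-q)^2 at 0, Exp(beta) part, Gamma(2, beta) part
    exp_cdf = 1 - np.exp(-beta * x)
    gamma2_cdf = 1 - np.exp(-beta * x) * (1 + beta * x)  # Gamma(2, beta) CDF
    return (1 - q) ** 2 + 2 * q * (1 - q) * exp_cdf + q ** 2 * gamma2_cdf

x0 = 1.0
print(cdf_X(x0), np.mean(X <= x0))  # theoretical vs empirical CDF at x0
```

Evaluating `cdf_X(0.0)` returns exactly $(1-q)^2$, confirming the atom at zero.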