$X:= \sum_{j=1}^Z Z_j$, where $Z \sim$ Po$(\gamma)$ and $Z_j$'s are independent


Let $m\in \mathbb N$ and suppose that $Z_n, n\in \mathbb N$, is a sequence of independent random vectors in $\mathbb R^m$ with common distribution $\mathbb P(Z_1 = e_i) = p_i, i\in \{ 1,...,m\}$, where $e_i$ is the $i-$th unit vector in $\mathbb R^m$ and $p_1+ \cdots+ p_m = 1$. Let $Z \sim$ Po$(\gamma)$, independent of $(Z_1, Z_2,...).$ Show that the components of the random vector $X := \sum_{j=1}^Z Z_j$ are independent and Poisson distributed with parameters $p_1\gamma, ..., p_m \gamma$.

This can be proved by showing that $$\mathbb P(X_1 = k_1, \ldots , X_m = k_m) = \prod_{j=1}^m \mathbb P(X_j = k_j) = \prod_{j=1}^m \text{Po}(k_j; p_j \gamma).$$

How can we prove the above equation? Is there an easier way to prove the statement?

Can we argue that the components of $X$ are independent simply because $Z_n, n\in \mathbb N$, is a sequence of independent random vectors?

There are 2 solutions below.

Best answer:

Conditional on $Z = z$, the random vector $\textbf X$ follows the multinomial distribution with parameters $z$ and $\textbf p$: each of the $z$ summands increments exactly one coordinate, chosen according to the probability vector $\textbf p$. The joint distribution is then found by:

$$\begin{split}f(x_1,...,x_m,z)&=f(x_1,...,x_m|z)\Pr(z)\\ &=\left(\frac{z!}{x_1!...x_m!}p_1^{x_1}...p_m^{x_m}\right)\left(\frac{e^{-\gamma}\gamma^z}{z!}\right)\mathbb 1_{\sum_i x_i=z}\\ &=e^{-\gamma(p_1+...+p_m)}\frac{(p_1\gamma)^{x_1}...(p_m\gamma)^{x_m}}{x_1!...x_m!}\mathbb 1_{\sum_i x_i=z}\\ &=\frac{e^{-\gamma p_1}(p_1\gamma)^{x_1}}{x_1!}...\frac{e^{-\gamma p_m}(p_m\gamma)^{x_m}}{x_m!}\mathbb 1_{\sum_i x_i=z}\end{split}$$

Marginalize $z$ out (only the term $z = \sum_i x_i$ survives the indicator):

$$\begin{split}f(x_1,...,x_m)&=\sum_{z=0}^\infty\frac{e^{-\gamma p_1}(p_1\gamma)^{x_1}}{x_1!}...\frac{e^{-\gamma p_m}(p_m\gamma)^{x_m}}{x_m!}\mathbb 1_{\sum_i x_i=z}\\ &=\frac{e^{-\gamma p_1}(p_1\gamma)^{x_1}}{x_1!}...\frac{e^{-\gamma p_m}(p_m\gamma)^{x_m}}{x_m!}\end{split}$$

The joint pmf factorizes into Poisson factors, so the components are independent with $X_i\sim \text{Poisson}(\gamma p_i)$ for $i=1,\ldots,m$.
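As a sanity check (not a proof), here is a short Monte Carlo sketch in Python; the parameter values $\gamma = 3$ and $\textbf p = (0.5, 0.3, 0.2)$ are arbitrary choices. It simulates $X$ by drawing $Z \sim \text{Po}(\gamma)$ and then a multinomial vector with $Z$ trials, and checks that each component's mean and variance are close to $\gamma p_i$ (a Poisson signature) and that the components are uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 3.0
p = np.array([0.5, 0.3, 0.2])   # p_1 + p_2 + p_3 = 1
n_trials = 200_000

# Z ~ Poisson(gamma): number of summands in each trial
Z = rng.poisson(gamma, size=n_trials)
# Conditional on Z = z, X is multinomial(z, p): each summand
# increments one coordinate chosen according to p
X = np.array([rng.multinomial(z, p) for z in Z])

# Marginal means should be close to gamma * p = [1.5, 0.9, 0.6]
print(X.mean(axis=0))
# For a Poisson variable the variance equals the mean
print(X.var(axis=0))
# Independence: off-diagonal correlations should be near 0
print(np.corrcoef(X.T))
```

Of course uncorrelatedness is weaker than independence; the simulation only illustrates the theorem, it does not replace the factorization argument above.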

Second answer:

Two things need to be proven: first, that the components are Poisson distributed; second, that they are independent.

Concerning the Poisson distribution: looking at the first component alone, each summand contributes a Bernoulli variable with parameter $p_1$, and you take a random sum with a Poisson number of summands. So $X_1 = \sum_{i=1}^Z Z_i^1$, where the $Z_i^1\sim \text{Bernoulli}(p_1)$ are i.i.d. It is well known that such a sum is again Poisson distributed (look up the compound Poisson distribution; it follows easily from characteristic or generating functions).
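A sketch of the generating-function step alluded to above: with $q(s) := \mathbb E\left[s^{Z_1^1}\right] = 1 - p_1 + p_1 s$ the pgf of one Bernoulli summand, conditioning on $Z$ gives

$$\mathbb E\left[s^{X_1}\right] = \mathbb E\left[q(s)^Z\right] = e^{\gamma(q(s)-1)} = e^{\gamma p_1 (s-1)},$$

which is exactly the pgf of the Poisson$(\gamma p_1)$ distribution.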

Concerning independence: consider the case $m=2$. It suffices to show that $P(X_1=k\mid X_2=l)=P(X_1=k)$ for all $k,l$. Since $m=2$ forces $Z = X_1 + X_2$, we have $$P(X_1=k\mid X_2=l) = P(Z=k+l \mid X_2=l) = P(X_2=l\mid Z=k+l)\frac{P(Z=k+l)}{P(X_2=l)}. $$ When you fill in the pmfs (binomial for $X_2$ given $Z$, Poisson for $Z$ and $X_2$) you obtain what you want.
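For completeness, a sketch of that computation, using $X_2\mid Z=k+l \sim \text{Bin}(k+l, p_2)$, the Poisson$(\gamma)$ pmf of $Z$, and the Poisson$(\gamma p_2)$ marginal of $X_2$ from the first part (recall $p_1 + p_2 = 1$):

$$P(X_1=k\mid X_2=l) = \binom{k+l}{l}p_2^l\, p_1^k \cdot \frac{e^{-\gamma}\gamma^{k+l}}{(k+l)!} \cdot \frac{l!}{e^{-\gamma p_2}(\gamma p_2)^l} = \frac{e^{-\gamma p_1}(\gamma p_1)^k}{k!} = P(X_1=k),$$

where the binomial coefficient cancels the $(k+l)!$, the $p_2^l$ and $\gamma^l$ factors cancel, and $e^{-\gamma}e^{\gamma p_2} = e^{-\gamma p_1}$.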