Probability generating functions, Poisson and Bernoulli


I'm struggling a bit with probability generating functions, more specifically with finding the PGF of $Z = \sum_{i=1}^{N} X_i$, where $N \sim \text{Poi}(\lambda)$ and $X_1, X_2, \ldots \sim \text{Bernoulli}(p)$.

I don't really know how to begin solving this problem. My first thought was to just take the expected value of $\text{Poi}(\lambda)$, which is $\lambda$, and use that as my $N$, but that does not feel like the correct approach. I can't really see how I am supposed to use the Bernoulli distribution here.

Would anyone like to help me solve this?

Thanks!

There are 3 answers below.


If $N$ is a nonnegative integer-valued random variable with probability generating function $P_N(s)$, and $X_1, X_2, \ldots$ is a sequence of i.i.d. nonnegative integer-valued random variables, independent of $N$, with common probability generating function $P_{X_1}(s)$, then the random sum $$ S_N := \sum_{i=1}^N X_i $$ has probability generating function given by the composition $P_{S_N}(s) = P_N(P_{X_1}(s))$; it is a good exercise to show this if you have not already.

In this case, $N\sim\mathsf{Pois}(\lambda)$, so $$ \mathbb E[s^N] = \sum_{k=0}^\infty e^{-\lambda}\frac{\lambda^k}{k!}s^k = e^{\lambda(s-1)}, $$ and $X_1\sim \mathsf{Ber}(p)$, so $$ P_{X_1}(s) = 1-p + ps = 1-p(1-s). $$ It follows that $$ P_{S_N}(s) = e^{\lambda((1-p(1-s))-1)} = e^{\lambda p(s-1)}. $$
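A quick Monte Carlo sanity check of this formula (note that $e^{\lambda((1-p(1-s))-1)} = e^{\lambda p(s-1)}$) — a sketch using only the Python standard library, with arbitrary illustrative parameter values:

```python
import math
import random

# Monte Carlo sanity check (sketch): simulate Z = sum_{i=1}^N X_i with
# N ~ Poisson(lam) and X_i ~ Bernoulli(p), and compare the empirical
# PGF E[s^Z] with the closed form exp(lam * p * (s - 1)).
# The parameter values below are arbitrary illustrative choices.
random.seed(0)
lam, p, s, trials = 3.0, 0.4, 0.7, 200_000

def poisson(lam):
    """Sample from Poisson(lam) via Knuth's product-of-uniforms method."""
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod < threshold:
            return k
        k += 1

total = 0.0
for _ in range(trials):
    n = poisson(lam)
    z = sum(random.random() < p for _ in range(n))  # sum of n Bernoulli(p) draws
    total += s ** z

empirical = total / trials
closed_form = math.exp(lam * p * (s - 1))
print(empirical, closed_form)  # these agree to about two decimal places
```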


By definition, the probability generating function of a random variable $Z$ is given by $G_Z(x) := \mathbb E[x^Z]$. We calculate, using that $X_1, X_2, \dots$ are independent and identically distributed and that $N$ is independent of the $X_k$: $$\mathbb E[x^Z] =\mathbb E\left[\sum_{k=0}^\infty 1_{N=k}x^{X_1+ \dots + X_k}\right],$$ where we just multiplied by $1= \sum_{k=0}^\infty 1_{N=k}$; on $\{N=k\}$ the exponent is just $X_1+ \dots + X_k$. By dominated convergence, on $\{\lvert x \rvert \le 1\}$ we may interchange the expectation with the infinite sum to get

$$= \sum_{k=0}^\infty \mathbb E[1_{N=k}x^{X_1+ \dots +X_k}].$$ Then, by the independence of $N$ from the $X_k$, we can split the expectation up and use $\mathbb E[1_{N=k}] = \mathbb P(N=k)$ to arrive at $$\sum_{k=0}^\infty \mathbb P(N=k) \mathbb E[x^{X_1+ \dots +X_k}].$$ Now, since the $X_k$ are independent, $\mathbb E[x^{X_1+ \dots +X_k}] = \mathbb E[x^{X_1}] \cdots \mathbb E[x^{X_k}] = \left ( \mathbb E[x^{X_1}]\right )^k$, because the $X_i$ have the same distribution. Then we get $$=\sum_{k=0}^\infty \mathbb P(N=k) \mathbb E[x^{X_1}]^k = \mathbb E\left [\mathbb E[x^{X_1}]^N\right ] = G_N(\mathbb E[x^{X_1}]) = G_N(G_{X_1}(x)).$$

Then we use that $G_N(y) = e^{\lambda (y-1)}$ for a $\mathsf{Pois}(\lambda)$-distributed random variable and that $G_X(y) = (1-p)+ py$ for a $\mathsf{Ber}(p)$-distributed random variable. We finally arrive at $$\mathbb E[x^Z] = e^{\lambda((1-p)+px -1)}=e^{\lambda p(x-1)}.$$
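The expression $e^{\lambda p(x-1)}$ is the PGF of a $\mathsf{Pois}(\lambda p)$ distribution, so $Z\sim\mathsf{Pois}(\lambda p)$ (the "thinning" property of the Poisson distribution). A short numerical sketch of this, computing $\mathbb P(Z=k)$ directly by conditioning on $N$ (the values $\lambda=3$, $p=0.4$ are arbitrary choices):

```python
import math

# Direct check (sketch) that Z has the Poisson(lam * p) pmf implied by the
# PGF e^{lam p (x - 1)}: compute P(Z = k) by conditioning on N and summing
# P(N = n) * P(Binomial(n, p) = k) over n.
lam, p = 3.0, 0.4

def pmf_Z(k, terms=60):
    # Truncate the series at n = k + terms; the Poisson(lam) tail beyond
    # that point is negligible for these parameters.
    total = 0.0
    for n in range(k, k + terms):
        p_n = math.exp(-lam) * lam ** n / math.factorial(n)  # P(N = n)
        total += p_n * math.comb(n, k) * p ** k * (1 - p) ** (n - k)
    return total

for k in range(6):
    poisson_lp = math.exp(-lam * p) * (lam * p) ** k / math.factorial(k)
    print(k, pmf_Z(k), poisson_lp)  # the two columns agree
```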


Let $B(u)$ denote the PGF of $\text{Bernoulli}(p)$ and let $G(u)$ denote the PGF of $\text{Poisson}(\lambda)$.

Then: $$\mathbb E[u^Z\mid N=n]=\mathbb E\left[u^{\sum_{i=1}^nX_i}\right]=B(u)^n.$$ From this we conclude that $$\mathbb E[u^Z\mid N]=B(u)^N,$$

and consequently $$\mathbb E[u^Z]=\mathbb E[\mathbb E[u^Z\mid N]]=\mathbb E[B(u)^N]=G(B(u)).$$ So things are reduced to finding $B(u)$ and $G(u)$.
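For completeness, substituting the two standard PGFs (both are computed in the other answers): $$B(u)=(1-p)+pu,\qquad G(u)=e^{\lambda(u-1)},$$ so that $$\mathbb E[u^Z]=G(B(u))=e^{\lambda((1-p)+pu-1)}=e^{\lambda p(u-1)},$$ the PGF of a $\text{Poisson}(\lambda p)$ random variable.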