Sum of a Poisson number of Bernoulli random variables


Let $X_1 \sim \mathrm{Poi}(\lambda)$ be a Poisson random variable with parameter $\lambda$ and let $Y_1,Y_2,\cdots$ be $\mathrm{Ber}(p)$ Bernoulli random variables defined on the same probability space such that $X_1,Y_1,Y_2,\cdots$ are independent. Show that $X:=\sum_{i=1}^{X_1}Y_i$ has the law $\mathrm{Poi}(p\lambda)$, that $X_1 - X$ has the law $\mathrm{Poi}((1 - p)\lambda)$, and that $X_1 - X$ is independent of $X$.

The solution I was given is $$\Bbb P[X=k,\,X_1-X=l]=\Bbb P[X_1=k+l,\,X=k] $$ $$ = \Bbb P\Big[X_1=k+l,\,\sum_{i=1}^{k+l}Y_i=k\Big]$$ $$ = \Bbb P[X_1=k+l]\,\Bbb P\Big[\sum_{i=1}^{k+l}Y_i=k\Big]$$ $$ = \frac{\lambda^{k+l}}{(k+l)!}e^{-\lambda}{k+l \choose k}p^k(1-p)^l$$ $$=\frac{(p\lambda)^{k}}{k!}e^{-p\lambda}\,\frac{((1-p)\lambda)^{l}}{l!}e^{-(1-p)\lambda}.$$ Hence the pair $(X, X_1-X)$ has the law of two independent random variables with respective laws $\mathrm{Poi}(p\lambda)$ and $\mathrm{Poi}((1-p)\lambda)$.

I don't understand the conclusion, or the way they are doing this: couldn't we just as well switch the factors in the last equality to affirm that $X$ has the law $\mathrm{Poi}((1-p)\lambda)$? Can someone explain what is being done here?


If $U,V$ are discrete r.v.'s with $P(U=u,V=v)=f(u)g(v)$ for some mass functions $f$ and $g$, then $U$ and $V$ are independent with $P(U=u)=f(u)$ and $P(V=v)=g(v)$. To see this, first sum over all $v$ to get $P(U=u)=f(u)$ (using that $g$ sums to $1$). Then sum over $u$ to get $P(V=v)=g(v)$. Finally, go back to the original equation to get $P(U=u, V=v)=P(U=u)P(V=v)$.
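The lemma can be checked concretely on a toy example. A minimal sketch in Python (the particular mass functions `f` and `g` are made-up numbers for illustration, not from the problem):

```python
from itertools import product

# toy mass functions f and g (hypothetical values, each summing to 1)
f = {0: 0.5, 1: 0.3, 2: 0.2}
g = {0: 0.6, 1: 0.4}

# joint law P(U=u, V=v) = f(u) g(v)
joint = {(u, v): f[u] * g[v] for u, v in product(f, g)}

# summing over v recovers f as the marginal of U; summing over u recovers g
marg_u = {u: sum(joint[(u, v)] for v in g) for u in f}
marg_v = {v: sum(joint[(u, v)] for u in f) for v in g}

for u, v in joint:
    # the joint law factors into the product of the two marginals
    assert abs(joint[(u, v)] - marg_u[u] * marg_v[v]) < 1e-9
```

The point is exactly the one in the lemma: because each factor sums to $1$, the marginals come out equal to $f$ and $g$, and the product form of the joint is then the definition of independence.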


It is because, by the law of total probability and the Taylor expansion $e^x=\sum\limits_{j\in\Bbb N}\dfrac{x^j}{j!}$, we have

$$\begin{align}\mathsf P(X=k)&=\sum_{\ell}\mathsf P(X=k, X_1-X=\ell)\\&=\sum_{\ell}\dfrac{(p\lambda)^k\mathrm e^{-p\lambda}}{k!}\dfrac{((1-p)\lambda)^\ell \mathrm e^{-(1-p)\lambda}}{\ell!}\mathbf 1_{k\in\Bbb N}\mathbf 1_{\ell\in\Bbb N}\\&=\dfrac{(p\lambda)^k\mathrm e^{-p\lambda}}{k!}\mathbf 1_{k\in\Bbb N}\cdot\mathrm e^{-(1-p)\lambda}\sum_{\ell=0}^\infty\dfrac{((1-p)\lambda)^\ell }{\ell!}\\&=\dfrac{(p\lambda)^k\mathrm e^{-p\lambda}}{k!}\mathbf 1_{k\in\Bbb N}\end{align}$$

Therefore $X\sim\mathcal{Pois}(p\lambda)$

And likewise $\mathsf P(X_1-X=\ell) =\dfrac{((1-p)\lambda)^\ell \mathrm e^{-(1-p)\lambda}}{\ell!}\mathbf 1_{\ell\in\Bbb N}$

Therefore $X_1-X\sim\mathcal{Pois}((1-p)\lambda)$, and further $X$ and $X_1-X$ are independently distributed because we have: $$\mathsf P(X=k,X_1-X=\ell)=\mathsf P(X=k)~\mathsf P(X_1-X=\ell)$$
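As a numerical sanity check of the whole statement, here is a small simulation sketch (the parameter values `lam = 4.0`, `p = 0.3` and the helper `sample_poisson` are my own choices, not from the question; the Python stdlib has no Poisson sampler, so one is built via Knuth's multiplication method):

```python
import math
import random

def sample_poisson(lam, rng):
    """Knuth's method: count uniform draws until their product falls below e^(-lam)."""
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

rng = random.Random(0)
lam, p, n = 4.0, 0.3, 100_000   # hypothetical parameters

xs, rest = [], []
for _ in range(n):
    x1 = sample_poisson(lam, rng)                  # X_1 ~ Poi(lam)
    x = sum(rng.random() < p for _ in range(x1))   # X = sum of X_1 Bernoulli(p) draws
    xs.append(x)
    rest.append(x1 - x)

mean_x = sum(xs) / n        # should be near p * lam = 1.2
mean_rest = sum(rest) / n   # should be near (1 - p) * lam = 2.8
cov = sum((a - mean_x) * (b - mean_rest) for a, b in zip(xs, rest)) / n  # near 0
print(round(mean_x, 2), round(mean_rest, 2), round(cov, 3))
```

The empirical means land near $p\lambda$ and $(1-p)\lambda$, and the sample covariance is close to $0$, which is consistent with (though of course does not by itself prove) the independence shown above.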