Chaining Bernoulli and Poisson


I have $x_1, \dots, x_I$ denoting event counts, each coming from an independent Poisson distribution with parameters $\lambda_1, \dots, \lambda_I$ respectively.

I would like to look at a random variable that adds them up. I know that the plain sum follows a Poisson distribution with mean $\sum_{i=1}^I \lambda_i$. The twist: each $x_i$ enters the sum only with probability $p$; with probability $1-p$ we add $0$ instead, no matter the outcome of $x_i$.

I'm trying to find a tractable description of the resulting distribution. I couldn't find anything related under "Related distributions" of Poisson or Bernoulli. Can anyone point me to some starting points? If necessary, I'd be fine with forcing $\lambda_i$ to be a constant $\lambda$.
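For concreteness, the thinned sum is easy to simulate. Below is a minimal stdlib-only sketch (the parameter values are illustrative, not from the question); Poisson draws use Knuth's multiplication method.

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's multiplication method: count uniform draws until their
    # running product drops below e^{-lam}.
    threshold = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k

def simulate_sum(lams, p, rng):
    """One draw of the thinned sum: each Poisson(lam_i) count is
    included with probability p and replaced by 0 otherwise."""
    total = 0
    for lam in lams:
        if rng.random() < p:
            total += poisson_sample(lam, rng)
    return total

# The empirical mean should be close to p * sum(lams) = 0.5 * 10 = 5.
rng = random.Random(0)
draws = [simulate_sum([2.0] * 5, 0.5, rng) for _ in range(20000)]
print(sum(draws) / len(draws))
```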

BEST ANSWER

Under the more convenient assumption that $\lambda_i=\lambda$ for every $i$, we can recognize the sum as $$Y:=Y_1+\cdots+Y_K$$ where the $Y_i$ are i.i.d. with a Poisson$(\lambda)$ distribution and where $K$ has a binomial distribution with parameters $I$ and $p$, independent of the $Y_i$.

Then:$$P(Y=m)=\sum_{k=0}^IP(Y=m\mid K=k)P(K=k)=\sum_{k=0}^IP(Y_1+\cdots+Y_k=m)P(K=k)$$

For fixed $k$ we have an expression for $P(K=k)$, and, as you already remarked, $Y_1+\cdots+Y_k$ has a Poisson distribution with parameter $k\lambda$, so we also have an expression for $P(Y_1+\cdots+Y_k=m)$.

That makes the RHS "ready" to be worked out. Can you do that yourself?
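Working out the RHS numerically is straightforward; here is a minimal sketch under the constant-$\lambda$ assumption, with illustrative parameter values.

```python
import math

def pmf_Y(m, lam, I, p):
    """P(Y = m) as a Binomial(I, p) mixture of Poisson(k*lam) pmfs.
    The k = 0 term is a point mass at 0; since 0.0**0 == 1.0 in Python,
    the generic Poisson formula already covers it."""
    total = 0.0
    for k in range(I + 1):
        p_K = math.comb(I, k) * p ** k * (1 - p) ** (I - k)
        p_pois = math.exp(-k * lam) * (k * lam) ** m / math.factorial(m)
        total += p_pois * p_K
    return total
```

A handy sanity check: $P(Y=0)$ must equal $(pe^{-\lambda}+q)^I$, since $Y=0$ requires every included Poisson count to be zero.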


edit:

With $q:=1-p$ I find:

$$\sum_{k=0}^{I}e^{-\lambda k}\frac{\left(\lambda k\right)^{m}}{m!}\binom{I}{k}p^{k}q^{I-k}=$$$$\frac{\lambda^{m}\left(pe^{-\lambda}+q\right)^{I}}{m!}\sum_{k=0}^{I}\binom{I}{k}k^{m}\left(\frac{pe^{-\lambda}}{pe^{-\lambda}+q}\right)^{k}\left(\frac{q}{pe^{-\lambda}+q}\right)^{I-k}=$$$$\frac{\lambda^{m}\left(pe^{-\lambda}+q\right)^{I}}{m!}\,\mathbb{E}Z^{m}$$ where $Z$ has binomial distribution with parameters $I$ and $\frac{pe^{-\lambda}}{pe^{-\lambda}+q}$.
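As a numerical sanity check of this closed form, the moment expression can be compared against the direct mixture sum (illustrative parameters; $\mathbb{E}Z^{m}$ is computed brute-force from the binomial pmf of $Z$).

```python
import math

def pmf_via_moment(m, lam, I, p):
    """lam^m * (p*e^{-lam} + q)^I / m! * E[Z^m], with
    Z ~ Binomial(I, p*e^{-lam} / (p*e^{-lam} + q))."""
    q = 1 - p
    s = p * math.exp(-lam) + q          # normalizing factor
    r = p * math.exp(-lam) / s          # success probability of Z
    EZm = sum(math.comb(I, k) * r ** k * (1 - r) ** (I - k) * k ** m
              for k in range(I + 1))
    return lam ** m * s ** I / math.factorial(m) * EZm
```

This should agree, term by term, with the mixture sum $\sum_{k=0}^{I}e^{-\lambda k}\frac{(\lambda k)^{m}}{m!}\binom{I}{k}p^{k}q^{I-k}$.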