Is This Continuous Version of the Poisson Distribution Closed Under Addition?


According to its Wikipedia page, the CDF of the Poisson distribution is: $$\operatorname{CDF}_0(k) = \frac{\Gamma(\lfloor k+1\rfloor, \lambda)}{\Gamma(\lfloor k+1\rfloor)},$$ with $\Gamma(a, x)$ the upper incomplete gamma function. This admits an immediate generalization to continuous variables with $k=x\ge0$ by dropping the floor functions in the numerator and denominator, giving (after a shift by $1$): $$\begin{align}\operatorname{CDF}(x) & = \frac{\Gamma(x, \lambda)}{\Gamma( x)},\ \mathrm{and} \\ \operatorname{PDF}(x)& = \frac{\int_\lambda^\infty \ln(t)\, t^{x -1}\operatorname{e}^{-t} \operatorname{d}t}{\Gamma( x)} - \frac{\Gamma(x, \lambda) }{\Gamma( x)}\psi^0(x),\end{align}$$ with $\psi^0(x)$ the digamma (or polygamma of order $0$) function.
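As a quick numerical sanity check (a sketch only; the quadrature cutoff and step count below are ad-hoc choices, not part of the question), this CDF can be evaluated directly with the standard library, and at integer arguments $x=k+1$ it should reduce to the ordinary Poisson CDF at $k$:

```python
import math

def cont_poisson_cdf(x, lam, n=20000):
    """CDF(x) = Gamma(x, lam) / Gamma(x): the regularized upper incomplete
    gamma function, computed by Simpson's rule on [lam, cutoff]."""
    cutoff = lam + x + 40.0  # integrand t^(x-1) e^(-t) is negligible beyond this
    h = (cutoff - lam) / n
    def f(t):
        return math.exp((x - 1.0) * math.log(t) - t) if t > 0 else 0.0
    s = f(lam) + f(cutoff)
    for i in range(1, n):
        s += (4.0 if i % 2 else 2.0) * f(lam + i * h)
    return (s * h / 3.0) / math.gamma(x)

# At x = k + 1 this matches the Poisson CDF at k, e.g. for lam = 2.5, k = 3:
poisson_cdf = sum(math.exp(-2.5) * 2.5**j / math.factorial(j) for j in range(4))
print(cont_poisson_cdf(4.0, 2.5), poisson_cdf)  # the two agree
```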

Is it possible to show that if $x$ and $y$ are random variables distributed according to $\operatorname{CDF}$ with parameters $\lambda_x$ and $\lambda_y$, respectively, then $x+y$ is distributed according to the same $\operatorname{CDF}$ (presumably with parameter $\lambda_x + \lambda_y$)? If so, how? If not, where does it fail?


At the moment, this is just an alternative description of your distribution and a rough idea of why your conjecture is false. I will try to turn it into a complete negative answer later.

It turns out that the distribution you defined is the marginal distribution of the value at $\lambda$ of the so-called inverse $\Gamma$-process $Z$ with parameters $(1,1)$. As the name suggests, the latter is the inverse $$ Z(t) = \inf\{s\ge 0: X(s) \ge t\} $$ of a $\Gamma(1,1)$-process $X$. In turn, $X$ is a subordinator (a non-negative, increasing Lévy process) such that for any $t>0$, $X(t)$ has the Gamma distribution $\Gamma(t,1)$, i.e. $$ \mathsf P(X(t)\in dx) = \frac{x^{t-1}}{\Gamma(t)}e^{-x}\mathbf{1}_{x>0}\,dx. $$

To verify the claim, note that $X$ is right-continuous and increasing, so $$ \mathsf P(Z(\lambda)\le t) = \mathsf P(X(t)\ge \lambda) = \frac{1}{\Gamma(t)}\int_{\lambda}^\infty x^{t-1} e^{-x}\, dx = \frac{\Gamma(t,\lambda)}{\Gamma(t)}. $$
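This identity is easy to spot-check by simulation (a sketch; the level, time, and sample size below are arbitrary choices). For integer $t=2$, $\Gamma(2,\lambda)/\Gamma(2)$ has the closed form $e^{-\lambda}(1+\lambda)$, which the empirical frequency of $\{X(t)\ge\lambda\}$ should reproduce:

```python
import math
import random

random.seed(1)
t, lam = 2.0, 1.3
N = 200_000
# Empirical P(X(t) >= lam) for X(t) ~ Gamma(t, 1)
hits = sum(random.gammavariate(t, 1.0) >= lam for _ in range(N)) / N
# For integer t = 2, Gamma(2, lam)/Gamma(2) = e^(-lam) * (1 + lam)
exact = math.exp(-lam) * (1.0 + lam)
print(hits, exact)  # agree up to Monte Carlo error
```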

It might be tempting to say that $Z$ is itself a Lévy process (which would confirm your conjecture): once we hit level $\lambda_1$, hitting level $\lambda_1 + \lambda_2$ from there would take the same time as hitting $\lambda_2$ starting from zero; moreover, thanks to the Markov property, the future evolution is independent of the past. All of this would be true if $X$ were continuous. But it is not; in fact, it is a pure jump process. This means not only that the above argument fails, but also that the overshoot over $\lambda_1$ is strictly positive with probability one, so strictly less time than supposed is needed to reach $\lambda_1+\lambda_2$. In other words, $Z(\lambda_1+\lambda_2)$ is strictly smaller (in the sense of stochastic dominance) than the sum of independent copies of $Z(\lambda_1)$ and $Z(\lambda_2)$. This already proves that your conjecture cannot be true with the parameters adding up.
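The overshoot effect is visible in a Monte Carlo sketch (stdlib only; the grid step `dt` and sample counts are arbitrary choices). The $\Gamma(1,1)$ subordinator is sampled on a grid via independent $\Gamma(dt,1)$ increments, and the hitting time of a level is read off the grid; the mean of $Z(\lambda_1)+Z'(\lambda_2)$ then exceeds that of $Z(\lambda_1+\lambda_2)$ by a clear margin:

```python
import random

random.seed(0)

def hitting_time(lam, dt=0.02):
    """First grid time s with X(s) >= lam, where X is a Gamma(1,1) subordinator
    sampled on a grid: increments X(s+dt) - X(s) ~ Gamma(dt, 1)."""
    s, x = 0.0, 0.0
    while x < lam:
        x += random.gammavariate(dt, 1.0)
        s += dt
    return s

N = 3000
z_sum = [hitting_time(1.0) + hitting_time(1.0) for _ in range(N)]  # Z(1) + Z'(1)
z_tot = [hitting_time(2.0) for _ in range(N)]                      # Z(2)
# Because of the positive overshoot at the intermediate level, Z(2) is
# stochastically smaller than the sum of independent copies:
print(sum(z_sum) / N, sum(z_tot) / N)  # first mean is visibly larger
```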

However, looking at the tails suggests that if the conjecture were to hold with any parameter at all, that parameter would indeed have to be $\lambda_1+\lambda_2$. Indeed, extreme values of $Z(\lambda_1 + \lambda_2)$ should mean that $X$ had no large jumps until it reached $\lambda_1+\lambda_2$. Therefore, the overshoot over $\lambda_1$ should not be large, so we have to cover a distance of size approximately $\lambda_2$ from there. In other words, the tail of $Z(\lambda_1+\lambda_2)$ should be asymptotically equivalent to that of the sum of independent copies of $Z(\lambda_1)$ and $Z(\lambda_2)$. However, I have no precise argument for this at the moment.


By the way, this description shows another relation between this distribution and the Poisson distribution. Namely, the discretized version of $Z$, $$ Z_d(\lambda) = \inf\{n\in\mathbb{Z}_{\ge 0}: X(n)\ge \lambda\}, $$ is easily seen to have the $\mathrm{Pois}(\lambda)$ distribution **plus 1**. I made this extra term bold to emphasize once more that these hitting times cannot be additive in law: writing $\oplus$ for the sum of independent copies, we don't have $Z_d(\lambda_1 + \lambda_2) = Z_d(\lambda_1) \oplus Z_d(\lambda_2)$, but rather $Z_d(\lambda_1 + \lambda_2) = Z_d(\lambda_1) \oplus Z_d(\lambda_2)-1$ (this is also a good illustration of the above-mentioned subadditivity in law).
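The discrete relation is also easy to check numerically (a sketch; the level and sample size are arbitrary). Since $X(n)$ is a sum of $n$ i.i.d. $\mathrm{Exp}(1)$ variables, $Z_d(\lambda)$ is the number of renewals needed to pass level $\lambda$, and its mean should be close to $E[\mathrm{Pois}(\lambda)] + 1 = \lambda + 1$:

```python
import random

random.seed(2)

def discrete_hit(lam):
    """Smallest integer n with X(n) >= lam, where X(n) is a sum of n
    i.i.d. Exp(1) variables, i.e. a Gamma(n, 1) random walk."""
    n, x = 0, 0.0
    while x < lam:
        x += random.expovariate(1.0)
        n += 1
    return n

lam, N = 3.0, 20_000
mean = sum(discrete_hit(lam) for _ in range(N)) / N
print(mean)  # close to lam + 1 = 4, consistent with Pois(lam) + 1
```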