Splitting of Poisson processes and Bernoulli trials


When a Poisson process is split according to the results of independent Bernoulli trials, two Poisson processes are obtained. It is trivial to prove that these two processes are independent. I was examining the converse. If a Poisson process is split in two Poisson processes, is it necessary that the split takes place according to the results of Bernoulli trials? I was able to prove this on condition that the two processes are independent. Is it possible to prove this without the previous condition?


3 Answers

BEST ANSWER

I think the following serves as a counterexample:

Let $N$ be a rate-2 Poisson process on $\mathbb{R}^+$. Independently within each unit interval $I=[k,k+1)$, split the process into $N=N_1+N_2$ by uniformly randomly selecting $x$ of the $N(I)=n$ jumps to assign to $N_1$, with the remaining $n-x$ assigned to $N_2$, where $x$ is drawn from the conditional pmf $$f(x|n) = \Pr(N_1(I)=x \mid N(I)=n)$$ given below.

Note that, if we were to take $f(x|n)$ to be the binomial distribution with parameters $n$ and $1/2$, we would have the "usual" independent Bernoulli split.

Instead, let's take: \begin{align} f(0|0) &= 1 \\ f(0|1) &= 1/4, f(1|1) = 3/4 \\ f(0|2) &= 1/2, f(1|2) = 1/2, f(2|2) = 0 \\ f(0|3) &= 1/8, f(1|3) = 0, f(2|3) = 3/4, f(3|3) = 1/8 \\ f(x|n) &= Binom(n,1/2) \hbox{ otherwise} \end{align}

It should be clear that the resulting split is not generated by independent Bernoullis.

I claim that $N_1$ and $N_2$ are Poisson processes of rate 1. I think it's clear that we need only establish, for an arbitrary interval $I=[k,k+1)$, that $N_1(I)$ and $N_2(I)$ are Poisson r.v.s of mean 1, given the nature of the rest of the construction (uniform random selection of points from $N(I)$ conditioned on the total $N(I)=n$, and independent construction within each interval $I$).

Write $g(x)$ for the marginal pmf of $N_1(I)$, and $h(n)$ for the pmf of $N(I)$, and note: $$g(0)=\sum_n f(0|n)h(n) = e^{-2} \left(1\cdot 1 + \tfrac14\cdot 2 + \tfrac12\cdot\tfrac{4}{2!} + \tfrac18\cdot\tfrac{8}{3!} + \tfrac1{16}\cdot\tfrac{16}{4!} + \cdots\right) = e^{-1}.$$ It can similarly be shown that $g(1) = e^{-1}$, $g(2)=e^{-1}/2$, etc., so $N_1(I)$ is Poisson with mean 1. Similarly, it can be shown that $N_2(I)$ is Poisson with mean 1. So, that should do it.

If you're curious how I constructed $f(x|n)$ above: I considered the infinite matrix for the joint distribution $f(x,y)$ of $N_1(I)$ and $N_2(I)$ under the usual Bernoulli split and realized that I could perturb the top left corner by adding the $3\times 3$ matrix $$\begin{pmatrix} 0 & \epsilon & -\epsilon \\ -\epsilon & 0 & \epsilon \\ \epsilon & -\epsilon & 0 \end{pmatrix}$$ in such a way that the row and column totals (i.e., the marginal pmfs of $N_1(I)$ and $N_2(I)$) and all the anti-diagonal sums (those running from lower left to upper right, corresponding to the pmf of $N(I)$) were unchanged.
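The construction above is easy to check numerically. A minimal sketch in Python (not part of the original answer; the sampler and trial count are illustrative choices): draw $N(I) \sim \mathrm{Pois}(2)$ for many intervals, split each count using $f(x|n)$, and confirm that $N_1(I)$ looks like a mean-1 Poisson variable.

```python
import math
import random

# Conditional pmf f(x|n) from the construction above: modified at
# n = 1, 2, 3, and Binomial(n, 1/2) for every other n.
def f_given_n(n):
    if n == 1:
        return [1/4, 3/4]
    if n == 2:
        return [1/2, 1/2, 0.0]
    if n == 3:
        return [1/8, 0.0, 3/4, 1/8]
    return [math.comb(n, x) / 2**n for x in range(n + 1)]

def sample_poisson(lam, rng):
    # Knuth's multiplicative method; adequate for small rates.
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

rng = random.Random(0)
trials = 200_000
counts = [0] * 60                      # histogram of N1(I)
for _ in range(trials):
    n = sample_poisson(2.0, rng)       # N(I) ~ Poisson(2)
    x = rng.choices(range(n + 1), weights=f_given_n(n))[0]
    counts[x] += 1

mean_n1 = sum(x * c for x, c in enumerate(counts)) / trials  # should be near 1
p0 = counts[0] / trials                # should be near e^{-1} ~ 0.368
p1 = counts[1] / trials                # should also be near e^{-1}
```

The empirical mean and the probabilities at 0 and 1 match $\mathrm{Pois}(1)$, even though the split is visibly non-Bernoulli at $n = 1, 2, 3$.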


If the two random variables $X_1 \sim \mathrm{Pois}(\lambda_1)$ and $X_2 \sim \mathrm{Pois}(\lambda_2)$ are independent, then $Y = X_1 + X_2 \sim \mathrm{Pois}(\lambda_1 + \lambda_2).$

It matters that $X_1$ and $X_2$ are independent, but not the mechanism by which either arose. The proof using moment generating functions is easy.
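For completeness, a sketch of that MGF argument, using the standard Poisson MGF $M_{X_i}(t) = e^{\lambda_i(e^t-1)}$: $$\begin{align} M_Y(t) &= \mathsf E\left[e^{t(X_1+X_2)}\right] = M_{X_1}(t)\,M_{X_2}(t) && \text{(by independence)} \\ &= e^{\lambda_1(e^t-1)}\, e^{\lambda_2(e^t-1)} = e^{(\lambda_1+\lambda_2)(e^t-1)}, \end{align}$$ which is the MGF of a $\mathrm{Pois}(\lambda_1+\lambda_2)$ random variable; since the MGF determines the distribution, $Y \sim \mathrm{Pois}(\lambda_1+\lambda_2)$.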

Intuitively, suppose $X_1$ describes counts from a radioactive solution into a counter and $X_2$ describes counts from another solution. Pour the two solutions into the same beaker. Counts from that beaker will be described by $Y.$

Here is a demonstration by simulation in R statistical software, using a million realizations of each constituent random variable, with $\lambda_1 = 4$ and $\lambda_2 = 5:$

m = 10^6;  x1 = rpois(m, 4);  x2 = rpois(m, 5);  y = x1 + x2
mean(x1);  mean(x2);  mean(y)
## 4.003022  # aprx 4,  consistent with POIS(4)
## 4.997015  # aprx 5,  POIS(5)
## 9.000037  # aprx 9,  and POIS(9)
var(x1);  var(x2);  var(y)
## 4.004901  # aprx 4
## 4.995319  # aprx 5
## 9.011688  # aprx 9

In the figure below, histograms show simulated distributions, and dots atop histogram bars show exact Poisson probabilities.



Arrivals in a Poisson process occur independently of one another, at a constant average rate, say $\lambda$.   If each arrival is assigned to one of two categories according to the result of an independent Bernoulli trial with constant success probability $p$, then you will have two independent Poisson processes with rates $p\lambda$ and $(1-p)\lambda$.

This can be shown by establishing that, if $X,Y$ are the counts of arrivals in the two categories, then the conditional distribution of the count in one category, given the total count of arrivals, is Binomial.   (Why?)   If that is so, then:

$$\begin{align} \mathsf P(X=k, Y=h) ~& =~ \mathsf P(X+Y=k+h)~\mathsf P(X=k\mid X+Y=k+h) \\[1ex] & =~ \dfrac{\lambda^{k+h}~\mathsf e^{-\lambda}}{(k+h)!}\cdotp\dfrac{(k+h)!~p^k~(1-p)^h}{k!~h!} \\[1ex] & =~ \dfrac{\lambda^{k+h}~\mathsf e^{-\lambda}~p^k~(1-p)^h}{k!~h!} \\[2ex] \mathsf P(X=k) ~&=~ \sum_{n=k}^\infty \mathsf P(X+Y=n)~\mathsf P(X=k\mid X+Y=n) \\[1ex] ~&=~ \sum_{n=k}^\infty \dfrac{\lambda^n~\mathsf e^{-\lambda}}{n!}\cdotp\dfrac{n!~p^k~(1-p)^{n-k}}{k!~(n-k)!} \\[1ex] & =~ \dfrac{(p\lambda)^k~\mathsf e^{-p\lambda}}{k!} \\[2ex]\mathsf P(Y=h) ~&=~ \dfrac{\bigl((1-p)\lambda\bigr)^h~\mathsf e^{-(1-p)\lambda}}{h!} \\[3ex] \therefore~\mathsf P(X=k,Y=h)~&=~ \mathsf P(X=k)~\mathsf P(Y=h) \end{align}$$

We conclude that the two Poisson processes are indeed independent.
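This forward direction is also easy to check numerically. A minimal sketch in Python (not from the answer; the values $\lambda = 3$ and $p = 0.4$ are illustrative choices): thin a Poisson count by independent Bernoulli trials and check that the category means approach $p\lambda$ and $(1-p)\lambda$, with near-zero empirical covariance between the two counts.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplicative method; adequate for small rates.
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

rng = random.Random(1)
lam, p = 3.0, 0.4          # illustrative rate and success probability
trials = 200_000

xs, ys = [], []
for _ in range(trials):
    n = sample_poisson(lam, rng)                  # total arrivals in [0, 1)
    x = sum(rng.random() < p for _ in range(n))   # Bernoulli thinning
    xs.append(x)
    ys.append(n - x)

mean_x = sum(xs) / trials                         # should be near p*lam = 1.2
mean_y = sum(ys) / trials                         # should be near (1-p)*lam = 1.8
cov_xy = sum(a * b for a, b in zip(xs, ys)) / trials - mean_x * mean_y
```

A covariance near zero is of course only consistent with independence, not a proof of it; the derivation above is what establishes the full factorization.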