Doubts about Proof of Durrett Theorem 3.7.4. Thinning of Poisson Process


I am having trouble understanding Durrett's logic in his proof of the thinning of the Poisson process.

Here is the statement of the theorem: $N(t)$ is a Poisson process with rate $\lambda$ (e.g., the number of cars that have arrived at a store by time $t$), each arrival $i$ carries an additional property $Y_i$ (e.g., the number of passengers in the $i$-th car), and $N_j(t)$ is defined as the number of $i \leq N(t)$ with $Y_i = j$. The theorem asserts that the $N_j(t)$ are independent Poisson processes with rates $\lambda P(Y_i = j)$.

In Durrett's proof, he first proved a simple case where this additional property $Y_i$ is binary. He defined $P(Y_i = 1) = p$ and $P(Y_i = 2) = 1 - p$, which is fine.

What I am having problems with are:

(1) He asserted that $N_1(t)$ and $N_2(t)$ are Poisson processes without proving it.

This is what I do not understand - isn't it part of the statement to be proved that $N_1(t)$ and $N_2(t)$ are Poisson processes? Why is this true?

(2) He proved that if $X_i = N_i(t + s) - N_i(s)$, then whenever $X_1 = j$ and $X_2 = k$ there must be $j + k$ arrivals between $s$ and $s + t$, so $$P(X_1 = j, X_2 = k) = e^{-\lambda t} \frac{(\lambda t)^{j+k}}{(j+k)!}\cdot\frac{(j+k)!}{j!\,k!}\,p^j (1-p)^k = e^{-\lambda p t} \frac{(\lambda p t)^j}{j!}\cdot e^{-\lambda(1-p)t} \frac{(\lambda (1-p) t)^k}{k!}.$$ Then he asserted that $X_1 \sim \text{Poisson}(\lambda pt)$ and $X_2 \sim \text{Poisson}(\lambda (1-p)t)$.

My question is: why can you assert, from the fact that $P(X_1 = j, X_2 = k)$ factors into the product of two Poisson probability mass functions, that $X_1$ and $X_2$ are independent, with $X_1 \sim \text{Poisson}(\lambda pt)$ and $X_2 \sim \text{Poisson}(\lambda (1-p)t)$?

Thank you very much.

Here is the statement and the full proof of the theorem (the questions are highlighted in blue): [screenshots of the statement and proof]

There are 2 answers below.

BEST ANSWER

(1) The first highlighted blue part is perhaps worded a little misleadingly, but it does not claim anything without proof. Read it as "there are only two processes to consider, $N_1(t)$ and $N_2(t)$, and we will prove that they are Poisson processes."

(2) This is the part where we prove that $N_1(t)$ and $N_2(t)$ are Poisson, so I'm not sure why you thought that this was ever claimed without proof.

In general, suppose we can factor $\Pr[X=a \land Y=b]$ as $f(a) \cdot g(b)$ where $f$ and $g$ are probability mass functions. Then we have \begin{align} \Pr[X=a] &= \sum_{b \in \text{dom}(Y)} \Pr[X=a \land Y=b] \\ &= \sum_{b\in \text{dom}(Y)} f(a) \cdot g(b) \\ &= f(a) \sum_{b\in \text{dom}(Y)} g(b) \\ &= f(a). \end{align} In the last step, the sum over $b$ simplifies to $1$ precisely because $g$ is a probability mass function.

We conclude that $f(a)$ really is the probability mass function of $X$; similarly, $g(b)$ really is the probability mass function of $Y$. Finally, because $\Pr[X=a \land Y=b] = \Pr[X=a] \cdot \Pr[Y=b]$, $X$ and $Y$ are independent.
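As a sanity check on the theorem itself (this simulation is my own illustration, not part of Durrett's proof), one can simulate a rate-$\lambda$ Poisson process on $[0, t]$, thin each arrival into type 1 with probability $p$, and compare the empirical means of $N_1(t)$ and $N_2(t)$ with the claimed $\lambda p t$ and $\lambda(1-p)t$. All names and parameter values below are illustrative:

```python
import random

def thinned_counts(lam, p, t, rng):
    """Simulate one rate-lam Poisson process on [0, t] and thin it:
    each arrival is type 1 with probability p, otherwise type 2."""
    # Generate arrivals by summing Exp(lam) inter-arrival times
    # until they exceed t; the count N(t) is then Poisson(lam * t).
    n = 0
    s = rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    n1 = sum(1 for _ in range(n) if rng.random() < p)  # arrivals with Y_i = 1
    return n1, n - n1

rng = random.Random(0)
lam, p, t, trials = 2.0, 0.3, 5.0, 20000
samples = [thinned_counts(lam, p, t, rng) for _ in range(trials)]
mean1 = sum(x for x, _ in samples) / trials
mean2 = sum(y for _, y in samples) / trials
# Theory: E[N_1(t)] = lam*p*t = 3.0 and E[N_2(t)] = lam*(1-p)*t = 7.0.
print(mean1, mean2)
```

The empirical means should land close to $3.0$ and $7.0$, consistent with $N_1(t) \sim \text{Poisson}(\lambda p t)$ and $N_2(t) \sim \text{Poisson}(\lambda(1-p)t)$.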

SECOND ANSWER

(1) You are right: it is premature to state that $N_1(t)$ and $N_2(t)$ are Poisson processes before proving it. But he does prove it later.

(2) If $X,Y$ are random variables taking values in $0,1,2,3, \ldots$, and $P(X=j, Y=k)=p(j)q(k)$ for all $k,j$ where $p(\cdot)$ and $q(\cdot)$ are probability distributions, then $$P(X=j)=\sum_k p(j)q(k)=p(j)\sum_k q(k) =p(j)$$ and similarly $$P(Y=k)=q(k)\,.$$