Understanding conditioning on two random variables in the Poisson process


I'm looking at Theorem 2.2.1 in these notes on the Poisson Process.

Here we have a Poisson process with rate $\lambda$, which so far I understand as a sequence of i.i.d. Exp$(\lambda)$ random variables $X_1, X_2, \ldots$, where $X_n$ is the additional waiting time for the $n$th arrival after the $(n-1)$st arrival.

The theorem in question shows that if $Z$ is the waiting time until the next arrival after time $t$, then $Z \sim$ Exp$(\lambda)$ and $Z$ is independent of $N_\tau$ for all $\tau \leq t$, where $N_\tau$ is the number of arrivals by time $\tau$.

The goal is to show $P(Z > z) = e^{-\lambda z}$, so they look at $P(Z > z \mid N_t = n, S_n = \tau)$. This is easily seen to be $e^{-\lambda z}$, and they then conclude that this also shows independence.
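To make the claim concrete for myself, I wrote a small simulation: generate i.i.d. Exp$(\lambda)$ interarrival times until the running sum passes a fixed time $t$, take $Z$ to be the overshoot, and compare the empirical $P(Z > z)$ to $e^{-\lambda z}$. (The particular values of $\lambda$, $t$, and $z$ below are arbitrary choices for the demo.)

```python
import random
import math

random.seed(0)
lam = 2.0   # rate λ (arbitrary value for the demo)
t = 5.0     # fixed observation time
z = 0.5     # check P(Z > z) against e^{-λ z}

n_trials = 100_000
count = 0
for _ in range(n_trials):
    s = 0.0
    # generate arrival times S_1 < S_2 < ... until we pass time t
    while s <= t:
        s += random.expovariate(lam)  # i.i.d. Exp(λ) interarrival time
    Z = s - t                         # waiting time after t for the next arrival
    if Z > z:
        count += 1

estimate = count / n_trials
print(estimate, math.exp(-lam * z))   # the two numbers should be close
```

Running this, the empirical frequency does match $e^{-\lambda z}$ to within Monte Carlo error, so I believe the statement; it is the proof I am stuck on.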

I'm unable to follow this proof completely, so here are my explicit questions.

  1. I have an iffy understanding of conditioning on random variables in general, and I'm unable to understand conditioning on two different random variables like this. I can accept that in general $$P(X \in A) = \int_{-\infty}^{\infty} P(X\in A\mid Y = y)\, f_Y(y)\, dy,$$ but here I have two different conditioning variables, and one of them is discrete. How do I convince myself that, given $P(Z > z \mid N_t = n, S_n = \tau) = e^{-\lambda z}$, we can conclude that $P(Z > z) = e^{-\lambda z}$ as well?
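     My guess is that the proof implicitly uses a mixed-case law of total probability, summing over the discrete variable $N_t$ and integrating over the continuous variable $S_n$, something like $$P(Z > z) = \sum_{n=0}^{\infty} P(N_t = n) \int_0^t P(Z > z \mid N_t = n,\, S_n = \tau)\, f_{S_n \mid N_t = n}(\tau)\, d\tau,$$ with the $n = 0$ term read as $P(Z > z \mid N_t = 0)\, P(N_t = 0)$. If the conditional probability equals $e^{-\lambda z}$ for every $n$ and $\tau$, it factors out of the integral and the sum, leaving $P(Z > z) = e^{-\lambda z}$. Is this the right way to read it?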

  2. Can you give a bit more explanation about exactly why this shows independence of $Z$ from all $N_\tau, \tau \leq t$? I understand $\sigma$-algebras so I'd understand an explanation in terms of $\sigma(N_\tau)$ if necessary.