Step function of Poisson distributed number of uniformly distributed RVs is Poisson process


I want to prove this:

Let $N,X_1,X_2,\dots$ be independent RVs, $N$ Poisson distributed and the $X_k\sim\text{Unif}([0,1])$. Then $$ N_t:=\sum_{k=1}^N 1_{[0,t]}(X_k)\quad (t\in [0,1]) $$

is a Poisson process (restricted to $t\in[0,1]$), i.e. $N_t(\omega)$ is an increasing and right-continuous step-function with jumps of size one and $N_t-N_s\sim\text{Pois}(\lambda(t-s))$ for some parameter $\lambda$.

I don't know how to show that the increments are Poisson distributed. I've tried this, but I don't know how to proceed or if this is even the right approach:

$$ N_t-N_s=\sum_{k=1}^N 1_{[0,t]}(X_k)-\sum_{k=1}^N 1_{[0,s]}(X_k)=\sum_{k=1}^N 1_{(s,t]}(X_k) $$ It's obvious that

$$ P(X_k\in(s,t])=t-s\ \text{ and } P(X_k\in[0,s])=s $$

So to compute $P(N_t-N_s=n)$, $n\in\mathbb{N}$, we need the probability that exactly $n$ of the $X_k$ lie in $(s,t]$ and the remaining $N-n$ lie in $[0,s]$, which is

$$ P(|\{k\leq N : X_k\in (s,t]\}|=n,\ |\{k\leq N : X_k\in[0,s]\}|=N-n) \\ = \binom{N}{n}(t-s)^ns^{N-n} $$

So somehow I ended up with a binomial distribution, and the probability that $N\geq n$ is also missing; I don't know how to combine these and arrive at a Poisson distribution.

Thanks for any help!

Accepted answer

In the understanding that $\binom{k}{n}=0$ if $n$ exceeds $k$ we have:$$P\left(N_{t}-N_{s}=n\mid N=k\right)=\binom{k}{n}\left(t-s\right)^{n}\left(1-t+s\right)^{k-n}$$

Note that you should take $1-t+s$ here (and not $s$).

Also be aware that $N$ is a random variable.

Working this out we find:

$$\begin{aligned}P\left(N_{t}-N_{s}=n\right) & =\sum_{k=n}^{\infty}P\left(N=k\right)P\left(N_{t}-N_{s}=n\mid N=k\right)\\ & =\sum_{k=n}^{\infty}e^{-\lambda}\frac{\lambda^{k}}{k!}\binom{k}{n}\left(t-s\right)^{n}\left(1-t+s\right)^{k-n}\\ & =\sum_{k=n}^{\infty}e^{-\lambda}\frac{\lambda^{k}}{n!\left(k-n\right)!}\left(t-s\right)^{n}\left(1-t+s\right)^{k-n}\\ & =\sum_{k=0}^{\infty}e^{-\lambda}\frac{\lambda^{k+n}}{n!k!}\left(t-s\right)^{n}\left(1-t+s\right)^{k}\\ & =e^{-\lambda}\frac{\lambda^{n}}{n!}\left(t-s\right)^{n}\sum_{k=0}^{\infty}\frac{\lambda^{k}\left(1-t+s\right)^{k}}{k!}\\ & =e^{-\lambda\left(t-s\right)}\frac{\left[\lambda\left(t-s\right)\right]^{n}}{n!} \end{aligned}$$

Another answer

Denote by $\lambda$ the rate of $N$. For each (measurable) subset $S \subseteq [0, 1]$, define

$$ \Pi(S) := \sum_{k=1}^{N} \mathbf{1}_{S}(X_k). $$

Then the original $N_t$ coincides with $\Pi([0, t])$. Now we prove:

Claim. Suppose that $I_1, \cdots, I_m \subseteq [0, 1]$ are disjoint intervals. Then $\Pi(I_1), \cdots, \Pi(I_m)$ are independent and $\Pi(I_k) \sim \operatorname{Poisson}(\lambda|I_k|)$ for each $k$.

Remark. For OP's specific question, it suffices to consider the case $m = 1$. But it causes little harm to prove the general statement, and in fact this kind of statement is needed to establish the independent-increments condition for a Poisson process.


1st Proof. We examine the multivariate MGF. For real $s_1, \cdots, s_m$, define $f(\cdot) = \sum_{k=1}^{m} s_k \mathbb{1}_{I_k}(\cdot)$. Then by the tower property of conditional expectation,

$$ \mathbb{E}\Biggl[ \exp\Biggl\{ \sum_{k=1}^{m} s_k \Pi(I_k) \Biggr\} \Biggr] = \mathbb{E}\Biggl[ \prod_{j=1}^{N} e^{f(X_j)} \Biggr] = \mathbb{E}\Bigl[ \mathbb{E}\bigl[ e^{f(X_1)} \bigr]^{N} \Bigr] = \exp\bigl\{ \lambda(\mathbb{E}[e^{f(X_1)}] - 1) \bigr\}. $$

Moreover, using the fact that $I_k$'s are disjoint, we easily compute that

$$ \mathbb{E}[e^{f(X_1)}] = \int_{0}^{1} e^{f(x)} \, \mathrm{d}x = 1 + \sum_{k=1}^{m} |I_k| (e^{s_k} - 1). $$

Plugging this back, we get

$$ \mathbb{E}\Biggl[ \exp\Biggl\{ \sum_{k=1}^{m} s_k \Pi(I_k) \Biggr\} \Biggr] = \prod_{k=1}^{m} \exp\bigl\{ \lambda|I_k| (e^{s_k} - 1) \bigr\}. $$

By the uniqueness of MGF, this implies the desired claim.
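The two ingredients of this proof can be verified numerically: a midpoint Riemann sum over $[0,1]$ recovers the closed form of $\mathbb{E}[e^{f(X_1)}]$, and the resulting joint MGF factors into the product of Poisson MGFs. A sketch under illustrative choices of $\lambda$, the intervals $I_k$, and the arguments $s_k$ (all values are arbitrary, not from the answer):

```python
import math

# Disjoint intervals I_k in [0, 1] and MGF arguments s_k (illustrative values).
intervals = [(0.0, 0.3), (0.4, 0.7), (0.8, 0.9)]
s_vals = [0.5, -0.2, 1.1]
lam = 3.0

def f(x):
    # f(x) = sum_k s_k * 1_{I_k}(x)
    return sum(s for (a, b), s in zip(intervals, s_vals) if a <= x <= b)

# Midpoint Riemann sum approximating E[e^{f(X_1)}] = integral_0^1 e^{f(x)} dx.
M = 200_000
Ef = sum(math.exp(f((i + 0.5) / M)) for i in range(M)) / M

# Closed form: 1 + sum_k |I_k| (e^{s_k} - 1).
Ef_closed = 1 + sum((b - a) * (math.exp(s) - 1)
                    for (a, b), s in zip(intervals, s_vals))
assert abs(Ef - Ef_closed) < 1e-3

# Joint MGF exp{lam (E[e^f] - 1)} equals the product of Poisson(lam |I_k|) MGFs.
lhs = math.exp(lam * (Ef_closed - 1))
rhs = math.prod(math.exp(lam * (b - a) * (math.exp(s) - 1))
                for (a, b), s in zip(intervals, s_vals))
assert abs(lhs - rhs) < 1e-12
```

The second assertion is just the algebra of the last display; the first confirms the integral computation for a piecewise-constant $f$.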


2nd Proof. By adding some extra intervals into consideration, we may assume that $\cup_{k=1}^{m} I_k = [0, 1]$. Now for each $X_j$, define $Y_j$ by

$$ Y_j = [\text{value of index $k$ such that $X_j \in I_k$}] $$

Then note that $Y_j$'s are i.i.d. and $\mathbb{P}(Y_j = k) = |I_k|$ for each $j$ and $k$. Consequently, for each $n$, counting the number of occurrences of each symbol in the list $(Y_1, \cdots, Y_n)$ yields a multinomial distribution.

Now let $n_1, \cdots, n_m \geq 0$ be integers and write $n = \sum_{k=1}^{m} n_k$. Then

\begin{align*} \mathbb{P}\Biggl( \bigcap_{k=1}^{m} \{ \Pi(I_k) = n_k \} \Biggr) &= \mathbb{P} \Biggl( \{N = n\} \cap \bigcap_{k=1}^{m} \{ \text{the number of $k$'s in $(Y_1, \cdots, Y_n)$ is $n_k$} \} \Biggr) \\ &= \frac{\lambda^n}{n!}e^{-\lambda} \cdot \binom{n}{n_1,\cdots,n_m} |I_1|^{n_1} \cdots |I_m|^{n_m} \\ &= \prod_{k=1}^{m} \frac{(\lambda |I_k|)^{n_k}}{n_k!} e^{-\lambda |I_k|}. \end{align*}

From this, we easily check that the $\Pi(I_k)$'s are independent and $\Pi(I_k) \sim \operatorname{Poisson}(\lambda |I_k|)$.
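The factorization in the last display can also be confirmed numerically: the "Poisson total times multinomial split" expression agrees with the product of independent Poisson pmfs whenever the interval lengths sum to $1$. A sketch with illustrative lengths, counts, and rate (none of these specific numbers appear in the answer):

```python
import math

# A partition of [0, 1] into m = 3 intervals (lengths sum to 1) and target counts.
lengths = [0.2, 0.5, 0.3]
counts = [1, 3, 2]          # n_1, n_2, n_3
lam = 4.0
n = sum(counts)

# Left side: P(N = n) * multinomial coefficient * prod |I_k|^{n_k}.
multinom = math.factorial(n)
for nk in counts:
    multinom //= math.factorial(nk)
joint = (math.exp(-lam) * lam**n / math.factorial(n)
         * multinom * math.prod(p**nk for p, nk in zip(lengths, counts)))

# Right side: product of independent Poisson(lam * |I_k|) pmfs.
product = math.prod(
    math.exp(-lam * p) * (lam * p)**nk / math.factorial(nk)
    for p, nk in zip(lengths, counts)
)
assert abs(joint - product) < 1e-15
```

The equality holds because $e^{-\lambda} = \prod_k e^{-\lambda|I_k|}$ when $\sum_k |I_k| = 1$, exactly as in the derivation above.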