Probability distribution of the 1st event of a Poisson process, given a time $a$ between the 1st and 2nd events


Say events at times $X_1, X_2, X_3, \dots$ occur in succession according to a Poisson process of rate $\lambda$.
This implies that $\Pr(X_{1}\le x)=\Pr(X_{2}-X_{1}\le x~~|~~X_{1}=x_{1})=\Pr(X_{3}-X_{2}\le x~~|~~X_{2}=x_{2})=\dots=1-e^{-\lambda x}$.

I want to know the probability distribution of $X_{1}$ conditional on knowing that a certain time $a$ lies between $X_{1}$ and $X_{2}$, i.e. $\Pr(x<X_{1}~~|~~X_{1}<a<X_{2})$. Could you help me?

My efforts:
I have (incorrectly) called $\mathrm{pmf}_{1~|~X_{1}<a<X_{2}}$ a (sort of) density of $x_{1}=X_{1}$ within $[0,a]$, conditional on $a<X_{2}$. I calculate it this way:
$\mathrm{pmf}_{1~|~X_{1}<a<X_{2}}=\left[ \int_{x_{2}=a}^{+\infty}\mathrm{pmf}_{1}(x_{2}-x_{1}).d(x_{2}-x_{1})\right].\mathrm{pmf}_{1}(x_{1})$,
where $\mathrm{pmf}_{1}(x)=\lambda e^{-\lambda x}$ is the probability density function of the exponential law governing the 1st event of a Poisson process.
This yields: $\mathrm{pmf}_{1~|~X_{1}<a<X_{2}}=\lambda e^{-\lambda a}$.
This is independent of $x$, which (I guess) tells me that, conditional on $X_{1}<a<X_{2}$, $X_{1}$ is uniformly distributed within $[0,a]$ (whereas this is obviously not the case in general, and I think not either when conditioning only on $X_{1}<a$). However, the integral from $0$ to $a$ of $\mathrm{pmf}_{1~|~X_{1}<a<X_{2}}$ does not yield $1$. That makes sense, since it actually generates:
$P(X_{1}<a<X_{2})=\int_{x_{1}=0}^{a}\mathrm{pmf}_{1~|~X_{1}<a<X_{2}}.dx_{1}=\lambda.a.e^{-\lambda.a}$.
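This last identity can be sanity-checked by Monte Carlo simulation; a minimal sketch in Python (the values of $\lambda$ and $a$ are arbitrary illustrations, not from the problem):

```python
import math
import random

# Monte Carlo estimate of P(X1 < a < X2), to be compared
# against the claimed closed form lambda * a * exp(-lambda * a).
random.seed(0)
lam, a = 1.5, 0.8          # illustrative parameter choices
trials = 200_000

hits = 0
for _ in range(trials):
    x1 = random.expovariate(lam)       # first arrival time
    x2 = x1 + random.expovariate(lam)  # second arrival time
    if x1 < a < x2:
        hits += 1

estimate = hits / trials
exact = lam * a * math.exp(-lam * a)
print(estimate, exact)   # the two values should agree to ~1e-2
```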
What is the "real" and correct formulation of the pdf or cdf of $X_{1}~~|~~X_{1}<a<X_{2}$?


For any Lebesgue measurable subset $A\subseteq[0,\infty)$ with finite Lebesgue measure let: $$N_A:=|\{n\mid X_n\in A\}|$$ Then $N_A$ has a Poisson distribution with parameter $\lambda\cdot\mu(A)$, where $\mu$ denotes the Lebesgue measure.

(I used $\mu$ for the Lebesgue measure because $\lambda$ is already in use.)

Also if $A\cap B=\varnothing$ then $N_A$ and $N_B$ are independent.
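Both properties can be checked empirically; a minimal sketch, where the rate, the intervals $A$, $B$, and the horizon $T$ are illustrative assumptions:

```python
import random

# Empirical check that N_A has mean lam * mu(A) and that counts on
# disjoint intervals are uncorrelated (a consequence of independence).
random.seed(1)
lam = 2.0
A = (0.5, 1.5)   # mu(A) = 1.0
B = (2.0, 2.5)   # disjoint from A, mu(B) = 0.5
T = 3.0          # simulate arrivals on [0, T)
runs = 100_000

sum_a = sum_b = sum_ab = 0
for _ in range(runs):
    t, ca, cb = 0.0, 0, 0
    while True:
        t += random.expovariate(lam)   # next interarrival time
        if t >= T:
            break
        if A[0] <= t < A[1]:
            ca += 1
        elif B[0] <= t < B[1]:
            cb += 1
    sum_a += ca
    sum_b += cb
    sum_ab += ca * cb

mean_a = sum_a / runs                  # expect ~ lam * mu(A) = 2.0
mean_b = sum_b / runs                  # expect ~ lam * mu(B) = 1.0
cov = sum_ab / runs - mean_a * mean_b  # expect ~ 0
print(mean_a, mean_b, cov)
```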


By definition: $$\mathsf{P}\left(x<X_{1}\mid X_{1}<a<X_{2}\right)\mathsf{P}\left(X_{1}<a<X_{2}\right)=\mathsf{P}\left(x<X_{1}<a<X_{2}\right)\tag1$$

Here:

$$\begin{aligned}\mathsf{P}\left(X_{1}<a<X_{2}\right) & =\mathsf{P}\left(N_{\left[0,a\right)}=1\right)\\ & =\lambda ae^{-\lambda a} \end{aligned} $$

and:

$$\begin{aligned}\mathsf{P}\left(x<X_{1}<a<X_{2}\right) & =\mathsf{P}\left(N_{\left(0,x\right)}=0\wedge N_{\left[x,a\right)}=1\right)\\ & =\mathsf{P}\left(N_{\left(0,x\right)}=0\right)\mathsf{P}\left(N_{\left[x,a\right)}=1\right)\\ & =e^{-\lambda x}\lambda\left(a-x\right)e^{-\lambda\left(a-x\right)}\\ & =\lambda\left(a-x\right)e^{-\lambda a} \end{aligned} $$

Then $\left(1\right)$ leads to:$$\mathsf{P}\left(x<X_{1}\mid X_{1}<a<X_{2}\right)=\frac{a-x}{a}=1-\frac{x}{a}$$This holds for $x\in[0,a)$, and we conclude that, conditional on $X_1<a<X_2$, the random variable $X_1$ is uniformly distributed on $[0,a)$.
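This conditional uniformity is easy to confirm by rejection sampling; a sketch, with $\lambda$ and $a$ chosen as arbitrary test values:

```python
import random

# Simulate (X1, X2), keep only runs satisfying X1 < a < X2, and check
# that the conditional survival P(x < X1 | X1 < a < X2) matches 1 - x/a.
random.seed(2)
lam, a = 1.0, 2.0

samples = []
while len(samples) < 50_000:
    x1 = random.expovariate(lam)       # first arrival
    x2 = x1 + random.expovariate(lam)  # second arrival
    if x1 < a < x2:                    # condition on the event
        samples.append(x1)

for x in (0.5, 1.0, 1.5):
    empirical = sum(s > x for s in samples) / len(samples)
    print(x, empirical, 1 - x / a)     # empirical vs. theoretical 1 - x/a
```

The empirical conditional mean should also land near $a/2$, as expected for a uniform distribution on $[0,a)$.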