What is the probability that another event will happen?


I'm trying to solve the following problem:

A number of events, $m$, happen within a time $\tau$ with expectation value $\langle m \rangle = \mu$. What, on average, is the probability $P_a$ that another event will happen within $\tau$ if we do not know where in time we are or how many events have already happened (denoted $n$)?

First approach: This is based primarily on intuition. I figure that, on average, we find ourselves at time $\tau/2$ with $\langle n \rangle \sim \mu/2$ events having already taken place. Also, the events are assumed to be Poisson-distributed. In this case

\begin{align} P_{a_1}&=P(m>\langle n \rangle) \\ &=\sum_{m=\langle n \rangle+1}^\infty P(m) \\ &=\sum_{m=\langle n \rangle+1}^\infty \frac{\mu^m}{m!} e^{-\mu} \\ &=1-\frac{\Gamma\left(\langle n \rangle+1,\mu\right)}{\Gamma\left(\langle n \rangle+1\right)}, \end{align} where the gamma function in the numerator is the upper incomplete gamma function. One obvious problem with this is that the lower limit of the sum, $\langle n \rangle+1$, is not necessarily an integer; making the limit, for instance, $\left \lfloor{\langle n \rangle}\right \rfloor+1$ instead seems to introduce other problems. One appealing property of this expression, however, is that $\lim_{\mu \to \infty}P_{a_1}=1$, which matches intuition.
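For concreteness, the first approach's tail sum can be evaluated directly with the floor convention $\left\lfloor\langle n\rangle\right\rfloor+1$ for the lower limit. A minimal sketch using only the standard library (the value $\mu=10$ is just an illustration):

```python
import math

def poisson_tail(k, mu):
    """P(m > k) for m ~ Poisson(mu): one minus the CDF up to k."""
    cdf = math.exp(-mu) * sum(mu**m / math.factorial(m) for m in range(k + 1))
    return 1.0 - cdf

# First approach: <n> = mu/2, sum starts at floor(<n>) + 1
mu = 10.0
n_avg = mu / 2
p_a1 = poisson_tail(math.floor(n_avg), mu)
print(p_a1)  # approaches 1 as mu grows, consistent with the limit above
```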

Second approach: This is my attempt at being more formal. \begin{align} P_{a_2}&=\langle P(m> n ) \rangle \\ &=\sum_{n=0}^\infty P(m> n ) P(n) \\ &=\sum_{n=0}^\infty P(m> n ) \frac{\langle n \rangle^n}{n!} e^{-\langle n \rangle} \\ &=\sum_{n=0}^\infty \left(\frac{\langle n \rangle^n}{n!} e^{-\langle n \rangle} \sum_{m=\langle n \rangle+1}^\infty \frac{\mu^m}{m!} e^{-\mu}\right), \end{align} which unfortunately gives values $\gg 1$.

Both attempts seem to be flawed, but where are my mistakes? Except for the somewhat ill-defined lower limit in the sum, I rather liked the first (intuitive) approach. Should it not give the same result as the second (more formal) approach, if both are carried out correctly? What would be a correct approach to solving this problem?

Thanks!

Accepted answer:

That the events are assumed to be Poisson-distributed should be in the problem statement, not in one of the attempts; the problem is ill-defined as it stands.

In the second attempt, I don't know what you're averaging over in $\langle P(m\gt n)\rangle$ (just as I don't know what you mean by "the probability on average" in the problem statement). In case you actually meant $P(m\gt n)$, the problem is that this is $\sum_nP(m\gt n\mid n)P(n)$, not $\sum_nP(m\gt n)P(n)$.

It's not clear to me why the first attempt should be expected to yield the right result; it's at best a heuristic.

If I understand the problem correctly, you're trying to calculate the probability that after a time $t$ uniformly randomly chosen in the unit interval, at least one more event will occur in a Poisson process with rate $\mu$ before the end of the interval. This is

$$ \int_0^1\left(1-\mathrm e^{-\mu(1-t)}\right)\mathrm dt=1-\int_0^1\mathrm e^{-\mu t}\mathrm dt=1+\frac1\mu\left(\mathrm e^{-\mu}-1\right)=\frac12\mu+O\left(\mu^2\right)\;. $$
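This closed form is easy to check by simulation: pick $T$ uniformly on $[0,1]$, and by memorylessness the wait until the next event is exponential with rate $\mu$. A sketch (the choice $\mu=2$ and the trial count are arbitrary):

```python
import math
import random

def p_another_event_mc(mu, trials=200_000, seed=0):
    """Monte Carlo estimate of P(at least one more event before time 1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        t = rng.random()              # uniform observation time in [0, 1]
        wait = rng.expovariate(mu)    # memoryless wait until the next event
        if wait < 1.0 - t:
            hits += 1
    return hits / trials

mu = 2.0
exact = 1.0 + (math.exp(-mu) - 1.0) / mu   # 1 + (e^{-mu} - 1)/mu
print(p_another_event_mc(mu), exact)       # the two should agree closely
```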

Edit in response to the comment:

I'll switch to more conventional notation since this would become rather confusing in yours. So let $M$ be the total number of events in a Poisson process with rate $\mu$ and $N$ the number of events that occur before a time $T$ that's uniformly randomly distributed on the unit interval and independent of $M$. Then

\begin{align} P(M\gt N\mid N=n)&=P(M\gt n\mid N=n)\\ &=\frac{P(M\gt n\cap N=n)}{P(N=n)}\\ &=\frac{\int_0^1\left(1-\mathrm e^{-\mu(1-t)}\right)(\mu t)^n\mathrm e^{-\mu t}/n!\,\mathrm dt}{\int_0^1(\mu t)^n\mathrm e^{-\mu t}/n!\,\mathrm dt}\\ &=1-\frac{\mathrm e^{-\mu}\mu^n/(n+1)!}{\left(\Gamma(n+1)-\Gamma(n+1,\mu)\right)/(\mu\,n!)}\\ &=1-\frac{\mu^{n+1}\mathrm e^{-\mu}}{(n+1)\left(\Gamma(n+1)-\Gamma(n+1,\mu)\right)}\;, \end{align} using $\mathrm e^{-\mu(1-t)}\mathrm e^{-\mu t}=\mathrm e^{-\mu}$ in the numerator.

Here, as above, $1-\mathrm e^{-\mu(1-t)}$ is the probability that an event occurs in the time remaining after $t$, and we integrate over $t$ from $0$ to $1$ because of the uniform distribution of $T$ over that interval.
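As a cross-check, the two $t$-integrals can be evaluated numerically (Simpson's rule) and compared against the closed form, written with an explicit factor $n+1$ in the denominator; for integer $n$ the upper incomplete gamma function reduces to $\Gamma(n+1,\mu)=n!\,\mathrm e^{-\mu}\sum_{k=0}^{n}\mu^k/k!$. A sketch under those assumptions:

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

def p_more_given_n(mu, n):
    """P(M > N | N = n): ratio of the two t-integrals above."""
    dens = lambda t: (mu * t) ** n * math.exp(-mu * t) / math.factorial(n)
    num = simpson(lambda t: (1.0 - math.exp(-mu * (1.0 - t))) * dens(t), 0.0, 1.0)
    den = simpson(dens, 0.0, 1.0)
    return num / den

def p_more_given_n_closed(mu, n):
    """Closed form 1 - mu^(n+1) e^(-mu) / ((n+1) (Gamma(n+1) - Gamma(n+1, mu)))."""
    # Integer-n identity: Gamma(n+1, mu) = n! e^(-mu) sum_{k<=n} mu^k / k!
    upper = math.factorial(n) * math.exp(-mu) * sum(mu**k / math.factorial(k)
                                                    for k in range(n + 1))
    lower = math.factorial(n) - upper   # lower incomplete gamma at (n+1, mu)
    return 1.0 - mu ** (n + 1) * math.exp(-mu) / ((n + 1) * lower)

mu, n = 1.0, 1
print(p_more_given_n(mu, n), p_more_given_n_closed(mu, n))
```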