Probability equals rate $\times$ time?


Suppose a random event occurs at a rate $r$ (that's the average number of events per unit time).

I have seen a number of books and papers claim that the probability $P$ of one or more events occurring in a time interval $\delta t$ is $$ P=r\,\delta t$$

Superficially this seems wrong, since $r\,\delta t$ is the *expected number* of events rather than a probability. Moreover, taking $\delta t > r^{-1}$ gives $P > 1$. On the other hand, the linearity in $\delta t$ is intuitive.

My question: is this result generally true? If not, then under what circumstances may it be assumed true?

On BEST ANSWER

The statement is true in the limit $\delta t \to 0$. For a Poisson process with rate $r$, the exact probability of one or more events in an interval of length $\delta t$ is $$P = 1 - e^{-r\,\delta t} = r\,\delta t - \tfrac12 (r\,\delta t)^2 + \cdots,$$ so $P = r\,\delta t$ holds to first order: the probability of two or more events scales as $(\delta t)^2$, which is negligible compared with $r\,\delta t$.
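As a numerical check of the first-order claim above, this short sketch (a homogeneous Poisson process with an illustrative rate $r = 3$) compares the exact probability $1 - e^{-r\,\delta t}$ with $r\,\delta t$ as $\delta t$ shrinks, and also shows that the probability of two or more events falls off quadratically:

```python
import math

def p_at_least_one(r, dt):
    """Exact P(>= 1 event in an interval of length dt) for rate r."""
    return 1.0 - math.exp(-r * dt)

def p_at_least_two(r, dt):
    """Exact P(>= 2 events) = 1 - P(0 events) - P(exactly 1 event)."""
    return 1.0 - math.exp(-r * dt) * (1.0 + r * dt)

r = 3.0
for dt in [1.0, 0.1, 0.01, 0.001]:
    p1 = p_at_least_one(r, dt)
    p2 = p_at_least_two(r, dt)
    # As dt -> 0, p1 / (r*dt) -> 1 while p2 shrinks like (r*dt)^2 / 2.
    print(f"dt={dt:7}: P(>=1)={p1:.6f}  r*dt={r*dt:.6f}  P(>=2)={p2:.2e}")
```

For $\delta t = 1$ the approximation fails badly ($r\,\delta t = 3 > 1$), while for $\delta t = 0.001$ the two agree to within about $0.15\%$.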

Strictly speaking, the result holds at a time $t$ only if the rate per unit time does not go to infinity at $t$. Thus if $$r(t) = \frac{1}{(t-4)^2}$$ then the expected number of events, $\int r(t)\,dt$, diverges on any interval containing $t=4$, so you really can't say that the probability of one or more events occurring in a tiny interval $\pm \frac12 \delta t$ around $t=4$ is the same as the probability of exactly one event occurring in that interval.

Other than that caveat, the result can be assumed true.
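The approximation can also be verified by simulation. In a minimal Monte Carlo sketch (the rate $r = 2$ and interval $\delta t = 0.05$ are illustrative choices), the waiting time to the first event of a rate-$r$ Poisson process is exponential with mean $1/r$, so $P(\text{at least one event in } \delta t)$ is just the chance that this waiting time lands inside the interval:

```python
import random

def estimate_p(r, dt, trials=200_000, seed=0):
    """Monte Carlo estimate of P(>= 1 event in dt): the first arrival
    time of a rate-r Poisson process is Exp(r), so we count how often
    it falls inside [0, dt)."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(r) < dt for _ in range(trials))
    return hits / trials

r, dt = 2.0, 0.05
est = estimate_p(r, dt)
# The estimate should sit near 1 - exp(-0.1) ~ 0.0952, slightly
# below the first-order value r*dt = 0.1.
print(f"estimate={est:.4f}  r*dt={r*dt:.4f}")
```

The small gap between the estimate and $r\,\delta t$ is exactly the second-order term $\tfrac12 (r\,\delta t)^2$ that the limit $\delta t \to 0$ discards.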