Random truncation with a Poisson process


Suppose a Poisson process is observed in fixed time intervals, e.g. we record the output of a Geiger counter during a time interval of length $T$ that starts at a random time. The mean time between decays, $\lambda$, is much larger than this interval, so $$T \ll \lambda$$ Now, I know that for non-truncated observations, the time difference between two consecutive events follows an exponential distribution $$p(\delta t) = \frac1\lambda e^{-\delta t/\lambda}$$
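As a quick numerical sanity check of this convention (a sketch; note that NumPy's np.random.exponential, used in the code below, is parameterized by the scale, i.e. the mean time between events, not the rate):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 5.0                              # mean time between events (NumPy's "scale")
gaps = rng.exponential(lam, 1000000)   # inter-event times of the process

# Mean and standard deviation of an exponential both equal the scale:
print(gaps.mean(), gaps.std())  # both close to 5
```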

For a fixed truncation, I also know that the probability density keeps the same shape, just cut off at $T$ and renormalized.
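For illustration, the fixed-truncation case can be checked by conditioning exponential samples on $\delta t < T$ and comparing against the analytic mean of the renormalized density (a sketch with assumed values $\lambda = 5$, $T = 1$):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, T = 5.0, 1.0
x = rng.exponential(lam, 2000000)
trunc = x[x < T]  # keep only gaps that fit below the fixed truncation point

# Mean of the truncated, renormalized density
# p(x) = (1/lam) * exp(-x/lam) / (1 - exp(-T/lam)) on [0, T]:
mean_theory = lam - T * np.exp(-T / lam) / (1 - np.exp(-T / lam))
print(trunc.mean(), mean_theory)  # both close to 0.483
```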

Now, the problem I can't figure out is: the truncation is random. If the first event lies at the beginning of our measurement period, its successor is truncated at $\delta t \lesssim T$. But if the first event of a period lies near the end, it is truncated much more strongly, since there is less time left in which a second event can occur.
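This can be made concrete: by memorylessness, the offset of the first event inside a window (given the window contains any event) follows a truncated exponential, so the room $T - t_1$ left for a second event is itself random (a sketch, assuming unit windows and $\lambda = 5$ as in the simulation below):

```python
import numpy as np

rng = np.random.default_rng(2)
lam, T = 5.0, 1.0
ts = np.cumsum(rng.exponential(lam, 1000000))

win = np.floor(ts).astype(int)                 # index of the window each event falls in
first = np.unique(win, return_index=True)[1]   # first event of every non-empty window
pos = ts[first] - win[first]                   # offset of that event within its window

# The room T - pos left for a second event varies from window to window:
print(pos.mean(), (T - pos).std())  # mean offset close to 0.48
```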

I know I could obtain $\lambda$ from the number of events per interval $T$, or by concatenating several independent measurements, even though they are not contiguous. But what distribution does $\delta t$ follow in this case, and can one determine $\lambda$ from such a measurement?
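The count-based estimate mentioned here is straightforward (a sketch; the name lam_hat is illustrative, and unit windows with $\lambda = 5$ are assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, T = 5.0, 1.0
ts = np.cumsum(rng.exponential(lam, 1000000))

counts = np.bincount(np.floor(ts).astype(int))  # events per unit window
lam_hat = T / counts.mean()                     # mean count per window is T/lambda
print(lam_hat)  # close to 5
```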

Edit: My intuition was that the distribution does not change, yet neither real-life measurements nor the result from a random number generator agree with that. To test this, I ran the following code:

Generate time stamps with differences sampled from an exponential distribution with mean $\lambda$, i.e. density $\frac1\lambda e^{-x/\lambda}$, for $\lambda \in (1, 5, 10)$ (np.random.exponential takes the scale $\lambda$, not the rate):

import numpy as np

ts1 = np.cumsum(np.random.exponential(1, 1000000))
ts5 = np.cumsum(np.random.exponential(5, 1000000))
ts10 = np.cumsum(np.random.exponential(10, 1000000))

Loop over windows of length 1, counting the number of events in each window and collecting the time differences between consecutive events whenever a window contains more than one:

deltaT1 = []
deltaT5 = []
deltaT10 = []
nevents1 = []
nevents5 = []
nevents10 = []

for i in range(len(ts1)):
    # events falling into the window (i, i+1)
    w1 = ts1[(ts1 > i) & (ts1 < i + 1)]
    w5 = ts5[(ts5 > i) & (ts5 < i + 1)]
    w10 = ts10[(ts10 > i) & (ts10 < i + 1)]
    nevents1.append(len(w1))
    nevents5.append(len(w5))
    nevents10.append(len(w10))
    # time differences between consecutive events within the same window
    deltaT1.extend(np.diff(w1))
    deltaT5.extend(np.diff(w5))
    deltaT10.extend(np.diff(w10))
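Incidentally, the same deltaT values can be collected without an explicit window loop: the differences between consecutive events that share a unit window are exactly the pairs gathered above (a vectorized sketch for the $\lambda=5$ case, covering the whole simulated span rather than only the first len(ts) windows; variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
ts = np.cumsum(rng.exponential(5, 1000000))

gaps = np.diff(ts)                                   # all consecutive differences
same_window = np.floor(ts[:-1]) == np.floor(ts[1:])  # both events in one unit window
deltaT = gaps[same_window]
print(deltaT.mean())  # close to the ~0.32 reported below for lambda = 5
```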

The number of events per window does follow a Poisson distribution, with mean $T/\lambda = 1/\lambda$ for $T = 1$:

print(np.mean(nevents1), np.mean(nevents5), np.mean(nevents10))
Output: 1.0 0.200107 0.099953
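One more check in the same spirit (a sketch, again with assumed $\lambda = 5$): for a Poisson distribution the variance equals the mean, so the index of dispersion of the per-window counts should be close to 1:

```python
import numpy as np

rng = np.random.default_rng(5)
ts = np.cumsum(rng.exponential(5, 1000000))
counts = np.bincount(np.floor(ts).astype(int))  # events per unit window

# Poisson: variance equals mean, so their ratio should be ~1:
print(counts.mean(), counts.var(), counts.var() / counts.mean())
```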

However, the mean values of the different deltaT surprise me:

print(np.mean(deltaT1), np.mean(deltaT5), np.mean(deltaT10))
Output: 0.28116357748602905 0.32065439473099616 0.32102902250398857

And as a histogram it does not seem to follow an exponential pdf. The code for the histograms:

import matplotlib.pyplot as plt

plt.hist(deltaT1, bins=10, range=(0, 1), label=r"$\lambda=1$", log=True, histtype="step")
plt.hist(deltaT5, bins=10, range=(0, 1), label=r"$\lambda=5$", histtype="step", log=True)
plt.hist(deltaT10, bins=10, range=(0, 1), histtype="step", log=True, label=r"$\lambda=10$")
plt.xlabel(r"$\delta t$ between consecutive events within intervals of 1")
plt.legend()
plt.show()