Consider a Poisson process. We start observing it at time $u_1$ on its timeline and stop at time $u_2 = u_1 + u$. There are $m$ such processes operating independently of one another. This is shown in the figure below.
I want to estimate the event rate $\lambda$, the sole parameter of this Poisson process, from the data for these $m$ processes.
One way to do this is to count the number of events falling inside each interval and form the likelihood of these counts using the Poisson distribution. It isn't hard to show that the maximum likelihood estimator for $\lambda$ is:
$$\hat{\lambda} = \frac{n}{m u} \tag{1}$$
where $n$ is the total number of events falling inside the observation interval, summed over all $m$ processes. This turns out to be an unbiased estimator, but it isn't the only unbiased estimator (see below for another).
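Unbiasedness of (1) follows directly: since the $m$ windows are independent, the pooled count $n$ is Poisson-distributed with mean $m \lambda u$, so

$$E\left[\frac{n}{mu}\right] = \frac{E[n]}{mu} = \frac{m \lambda u}{mu} = \lambda.$$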
I conjecture that the estimator given by equation (1) is the uniformly minimum variance unbiased estimator (UMVUE) of the Poisson process's rate parameter $\lambda$. Is there a way to prove or refute this conjecture?
Another unbiased estimator can be motivated by writing the likelihood of the inter-event intervals using the exponential distribution. Taking into account that the first interval is always censored, we get:
$$\hat{\lambda} = \frac{\sum\limits_{i=1}^m \max(n_i-1,0)}{\sum\limits_{i=1}^m (t_i-u_1)}\tag{2}$$
Here, $n_i$ is the number of events from process $i$ that fall inside the observation interval, and $t_i$ is the time of the last event in that interval for process $i$ (with $t_i = u_1$ if no events occurred in the interval).
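For concreteness, both formulas can be computed from the raw event times. Here is a minimal sketch (the function name `estimate_rates` and the NumPy representation of the data are my own choices, not part of any library):

```python
import numpy as np

def estimate_rates(event_times, u1, u2):
    """Compute estimators (1) and (2) from per-process event times.
    `event_times` is a list of arrays; each array holds the event times
    of one process that fall inside the window [u1, u2]."""
    m = len(event_times)
    u = u2 - u1
    n = np.array([len(t) for t in event_times])
    lam1 = n.sum() / (m * u)                                 # equation (1)
    # last event time per process, or u1 when the window is empty
    t_last = np.array([t.max() if len(t) else u1 for t in event_times])
    lam2 = np.maximum(n - 1, 0).sum() / (t_last - u1).sum()  # equation (2)
    return lam1, lam2

# Example: three processes observed on [1, 5]
times = [np.array([1.2, 2.7, 4.0]), np.array([]), np.array([3.3])]
print(estimate_rates(times, u1=1.0, u2=5.0))  # lam1 = 4/12, lam2 = 2/5.3
```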
The simulation here demonstrates that both of these are unbiased estimators of $\lambda$; however, the variance of the first estimator is considerably lower.
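A minimal Monte Carlo sketch of such a simulation (assuming NumPy; the parameter values are arbitrary). It uses the fact that, conditional on the count $n_i$, the event times in the window are iid uniform, so the last event time is distributed as $u \cdot \mathrm{Beta}(n_i, 1)$ when $n_i > 0$:

```python
import numpy as np

def simulate(lam, m, u, reps, seed=0):
    """Monte Carlo comparison of estimators (1) and (2): m Poisson
    processes of rate `lam`, each observed over a window of length u.
    Returns (est1, est2), one estimate per replication."""
    rng = np.random.default_rng(seed)
    est1 = np.empty(reps)
    est2 = np.empty(reps)
    for r in range(reps):
        # number of events in the window for each process: Poisson(lam * u)
        n = rng.poisson(lam * u, size=m)
        est1[r] = n.sum() / (m * u)                          # equation (1)
        # last event time: u * Beta(n_i, 1), taken as 0 (i.e. u_1) if n_i = 0
        t_last = np.where(n > 0, u * rng.beta(np.maximum(n, 1), 1), 0.0)
        est2[r] = np.maximum(n - 1, 0).sum() / t_last.sum()  # equation (2)
    return est1, est2

est1, est2 = simulate(lam=2.0, m=10, u=3.0, reps=20000)
print(est1.mean(), est2.mean())  # compare both with lam = 2
print(est1.var(), est2.var())    # variance of (1) comes out smaller
```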
