I am interested in estimating the parameter $\lambda$ of an exponential distribution based on the smallest $n$ out of a total of $N$ observations.
In mathematical terms: let $X \sim \text{Exp}(\lambda)$ and let $x_1, \dots, x_N$ be a random sample from $X$, ordered (without loss of generality) so that $x_1 < x_2 < \dots < x_N$. How can I estimate the parameter $\lambda$ if I know only the first $n$ observations $x_1, \dots, x_n$ and the total number of observations $N$?
I thought I could write down a likelihood function by noting that the first observation is the minimum of $N$ i.i.d. random variables, the second is the minimum of the remaining $N-1$, and so on. However, this approach seems cumbersome and impractical when $n$ and $N$ are large. Is there another way to approach this problem?
Suppose $Y_1,Y_2,\ldots,Y_N$ is a random sample from an $\text{Exp}(\lambda)$ distribution with common density
$$f(y)=\frac1{\lambda}e^{-y/\lambda}\mathbf1_{y>0},\qquad \lambda>0$$
Let $Y_{(i)}$ denote the $i$th order statistic, and let $n < N$ be fixed.
Your likelihood function is the joint density of $(Y_{(1)},Y_{(2)},\ldots,Y_{(n)})$.
Given $(Y_{(1)},Y_{(2)},\ldots,Y_{(n)})=(x_1,x_2,\ldots,x_n)$, this is just
\begin{align} L(\lambda)&=\frac{N!}{(N-n)!}\prod_{i=1}^n f(x_i)\cdot(P(Y_1>x_n))^{N-n} \\&=\frac{N!}{(N-n)!}\prod_{i=1}^n \left\{\frac1{\lambda}e^{-x_i/\lambda}\right\}(e^{-x_n/\lambda})^{N-n}\mathbf1_{0<x_1<\ldots<x_n} \end{align}
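As a quick sanity check, one can simulate Type II censored data and maximize this log-likelihood numerically. The following is a sketch assuming NumPy and SciPy are available; the seed, the true $\lambda$, and the sample sizes are arbitrary choices for illustration:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)   # arbitrary seed for reproducibility
lam_true, N, n = 2.0, 100, 30    # hypothetical values

# Full sample of size N, of which we observe only the n smallest values
x = np.sort(rng.exponential(scale=lam_true, size=N))[:n]

# Negative log-likelihood; the constant N!/(N-n)! is dropped since it
# does not affect the location of the maximizer
def negloglik(lam):
    return n * np.log(lam) + (x.sum() + (N - n) * x[-1]) / lam

res = minimize_scalar(negloglik, bounds=(1e-6, 50.0), method="bounded")
print(res.x)  # numerical MLE, roughly near lam_true
```

The maximizer found numerically should land in a neighborhood of the true $\lambda$, with sampling noise on the order of $\lambda/\sqrt{n}$.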
One way to verify this is to consider the joint density of the full set of order statistics and then obtain the density of the first $n$ order statistics as a marginal distribution.
Differentiating the log-likelihood with respect to $\lambda$ and solving for the stationary point yields the MLE
$$\hat\lambda=\frac1n\left(\sum_{i=1}^n Y_{(i)}+(N-n)Y_{(n)}\right)$$
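In code, the closed form is a one-liner. The sketch below (assuming NumPy; the seed and parameter values are arbitrary) computes $\hat\lambda$ from simulated censored data and checks that it is indeed a stationary point by evaluating the negative log-likelihood on either side of it:

```python
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed
lam_true, N, n = 2.0, 100, 30    # hypothetical values

# Observe only the n smallest of N exponential draws
x = np.sort(rng.exponential(scale=lam_true, size=N))[:n]

# Closed-form MLE from the censored likelihood
lam_hat = (x.sum() + (N - n) * x[-1]) / n

# Negative log-likelihood (constants dropped); lam_hat should minimize it
def negloglik(lam):
    return n * np.log(lam) + (x.sum() + (N - n) * x[-1]) / lam

print(lam_hat)
```

Perturbing `lam_hat` by a percent in either direction should strictly increase `negloglik`, confirming it is a local minimum of the negative log-likelihood.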
Interestingly, $\hat\lambda$ is also the minimum-variance unbiased estimator of $\lambda$ based on $Y_{(1)},Y_{(2)},\ldots,Y_{(n)}$: it is an unbiased function of the complete sufficient statistic $\sum_{i=1}^n Y_{(i)}+(N-n)Y_{(n)}$, so the claim follows from the Lehmann–Scheffé theorem.
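Unbiasedness is easy to check by Monte Carlo. A sketch assuming NumPy, with arbitrary choices of seed, true $\lambda$, sample sizes, and replication count:

```python
import numpy as np

rng = np.random.default_rng(7)              # arbitrary seed
lam_true, N, n, reps = 2.0, 50, 10, 20000   # hypothetical settings

# reps independent samples of size N; keep the n smallest in each row
y = np.sort(rng.exponential(scale=lam_true, size=(reps, N)), axis=1)[:, :n]

# Vectorized closed-form MLE for every replication
lam_hat = (y.sum(axis=1) + (N - n) * y[:, -1]) / n

print(lam_hat.mean())  # close to lam_true if the estimator is unbiased
```

With 20{,}000 replications the Monte Carlo standard error of the mean is about $\lambda/\sqrt{n \cdot \text{reps}} \approx 0.004$, so the printed average should match $\lambda = 2$ to roughly two decimal places.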