$(X_1,\ldots,X_n)$ is a random sample drawn from an exponential distribution with parameter $\lambda$.
Compute the maximum likelihood estimator $\nu$ of $\lambda$.
Then, for $n=2$, determine whether $\nu$ is an unbiased estimator.
$$L(\lambda: X_1,\ldots,X_n)=\prod_{i=1}^n \lambda \ e^{-\lambda \ x_i} \ \ 1_{(0,+\infty)} \ (x_i)=$$
$$=\prod_{i=1}^n (1_{(0,+\infty)} \ (x_i)) \ \ \lambda^n \ e^{-\lambda \sum_{i=1}^n x_i} \ \ $$
$$\frac{\partial}{\partial \lambda} L(\lambda: X_1,\ldots,X_n)=\prod_{i=1}^n (1_{(0,+\infty)} \ (x_i)) \ \left( n \lambda^{n-1} \ e^{-\lambda \sum_{i=1}^n x_i}-\sum_{i=1}^n x_i \ \lambda^n \ e^{-\lambda \sum_{i=1}^n x_i}\right)= \ \ $$
$$=\prod_{i=1}^n (1_{(0,+\infty)} \ (x_i)) \ \ \lambda^{n-1} \ e^{-\lambda \sum_{i=1}^n x_i} \ \ (n- \lambda \sum_{i=1}^n x_i) $$
$$\frac{\partial}{\partial \lambda} L(\lambda: X_1,\ldots,X_n) \ge 0 \Longleftrightarrow \lambda \le \frac{1}{\overline{X}}$$
The maximum likelihood estimator of $\lambda$ is therefore $\nu=\frac{1}{\overline{X}}$.
If $n=2$, I think that:
$$\nu=\frac{2}{\sum_{i=1}^2 X_i}$$
and
$$\sum_{i=1}^2 X_i \sim \Gamma(2, \lambda)$$
How can I establish whether $\nu$ is an unbiased estimator?
Thanks!
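As a numerical sanity check of the gamma claim above, here is a small Monte Carlo sketch of my own (the seed, $\lambda$, and sample size are arbitrary choices):

```python
import numpy as np

# Sanity check (my own sketch): the sum of two i.i.d. Exp(lam) draws
# should follow Gamma(shape=2, rate=lam), with mean 2/lam and
# variance 2/lam**2.
rng = np.random.default_rng(0)
lam = 1.5
sums = rng.exponential(scale=1 / lam, size=(200_000, 2)).sum(axis=1)

print(sums.mean())  # should be close to 2/lam ~ 1.333
print(sums.var())   # should be close to 2/lam**2 ~ 0.889
```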
First, your MLE calculation can be made much simpler: $$\mathcal L(\lambda \mid \boldsymbol x) = \prod_{i=1}^n \lambda e^{-\lambda x_i} \mathbb 1 (x_i > 0) = \lambda^n e^{-\lambda n \bar x} \mathbb 1 (x_{(1)} > 0),$$ where $\boldsymbol x = (x_1, \ldots, x_n)$ is the sample, $\bar x$ is the sample mean, and $x_{(1)} = \min_i x_i$ is the first order statistic. Then the log-likelihood is $$\ell(\lambda \mid x) = n \log \lambda - \lambda n \bar x + \log \mathbb 1 (x_{(1)} > 0) \propto \log \lambda - \lambda \bar x + \log \mathbb 1 (x_{(1)} > 0).$$ If $x_{(1)} > 0$ is satisfied, then $$\frac{\partial \ell}{\partial \lambda} \propto \frac{1}{\lambda} - \bar x$$ and we have a critical point $$\hat \lambda = (\bar x)^{-1},$$ which is a global maximum.
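If you want to double-check the closed form $\hat\lambda = (\bar x)^{-1}$ numerically, here is a minimal sketch (my own; the seed, true $\lambda$, and sample size are arbitrary):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Maximize the exponential log-likelihood numerically and compare
# with the closed form 1 / x_bar.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=50)  # true lambda = 0.5

def neg_loglik(lam):
    # -l(lam | x) = -(n log(lam) - lam * sum(x)); the indicator term
    # is satisfied since every x_i > 0
    return -(len(x) * np.log(lam) - lam * x.sum())

res = minimize_scalar(neg_loglik, bounds=(1e-6, 10.0), method="bounded")
print(res.x, 1 / x.mean())  # the two values agree
```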
Now consider $S = n \bar X \sim \operatorname{Gamma}(n,\lambda)$; i.e., $$f_S(s) = \frac{\lambda^n s^{n-1} e^{-\lambda s}}{\Gamma(n)}, \quad s > 0.$$ Then the transformation $Y = g(S) = S^{-1}$ is monotone and therefore $$f_Y(y) = f_S(g^{-1}(y)) \left|\frac{dg^{-1}}{dy}\right| = f_S(1/y) \cdot \frac{1}{y^2} = \frac{\lambda^n e^{-\lambda/y}}{y^{n+1} \Gamma(n)}, \quad y > 0.$$ This is an inverse gamma distribution. It is trivial to see that $$\operatorname{E}[Y] = \int_{y=0}^\infty y f_Y(y) \, dy = \int_{y=0}^\infty \frac{\lambda^n e^{-\lambda/y}}{y^n \Gamma(n)} \, dy = \frac{\lambda}{n-1} \int_{y=0}^\infty \frac{\lambda^{n-1} e^{-\lambda/y}}{y^{(n-1)+1} \Gamma(n-1)} \, dy,$$ and the integrand is now the density of an inverse gamma distribution with parameters $n - 1$ and $\lambda$, thus its integral is equal to $1$. It follows that $$\operatorname{E}[Y] = \frac{\lambda}{n-1}, \quad n > 1,$$ therefore $$\operatorname{E}[\hat \lambda] = \operatorname{E}[n Y] = \frac{n \lambda}{n-1},$$ so $\hat \lambda$ is a biased estimator.
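The bias formula $\operatorname{E}[\hat\lambda] = n\lambda/(n-1)$ is easy to confirm by simulation; here is a sketch of my own for the asker's case $n = 2$ (seed and replication count are arbitrary):

```python
import numpy as np

# Monte Carlo check of the bias (my own sketch): for n = 2,
# E[lambda_hat] = n * lambda / (n - 1) = 2 * lambda, not lambda.
rng = np.random.default_rng(2)
lam, n, reps = 1.0, 2, 500_000
samples = rng.exponential(scale=1 / lam, size=(reps, n))
lam_hat = 1 / samples.mean(axis=1)

print(lam_hat.mean())  # close to 2 * lam = 2, far from lam = 1
```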
(Note. I chose to explicitly state the distribution of the reciprocal of the sample total for pedagogical reasons, but it is just as easy to compute its expectation without doing so. We can use this result to compute the variance of such an estimator and consider its asymptotic properties.)