Given a time $t$ in minutes that is exponentially distributed with density $f(t)=\phi e^{-\phi t}$ for $t>0$, where $\phi$ is an unknown parameter, and given $n$ observations $t_1,t_2,\dots,t_n$, find the maximum likelihood estimate $\hat \phi$ of the unknown parameter $\phi$.
(A)
Find a numeric value for $\hat \phi$ when we have the following 10 observations:
\begin{align} t_i = 2.0,\ 1.4,\ 2.0,\ 0.5,\ 0.7,\ 2.0,\ 1.3,\ 1.1,\ 1.8,\ 0.2 \end{align}
For this I have found that:
For $n=10$ we have:
\begin{align} \hat \phi = \frac{n}{\sum_{j=1}^{n} t_j} = \frac{10}{13} \approx 0.77 \end{align}
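As a quick numeric check of part (A), a minimal Python sketch (not part of the original solution) computes $\hat\phi = n/\sum_j t_j$ from the ten observations:

```python
# MLE for the rate of an exponential distribution: phi_hat = n / sum(t_i).
observations = [2.0, 1.4, 2.0, 0.5, 0.7, 2.0, 1.3, 1.1, 1.8, 0.2]

n = len(observations)          # n = 10
phi_hat = n / sum(observations)  # sum of observations is 13.0

print(round(phi_hat, 4))       # 10/13, i.e. 0.7692
```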
(B)
Is $\hat \phi$ an unbiased estimator of $\phi$?
I have:
$E(\hat \phi) = E\!\left(\frac{n}{\sum_{j=1}^{n} t_j}\right) = \frac{n}{E(X_1)} + \frac{n}{E(X_2)} + \dots + \frac{n}{E(X_{10})}$
and
\begin{align} E(X) &= \frac{1}{\lambda} \\ V(X) &= \frac{1}{\lambda^2} \end{align}
for exponentially distributed values. But how do I find $\lambda$?
Part 1: Finding the MLE
Using your notation, let $\widehat{\phi}$ be the maximum likelihood estimate. The maximum likelihood estimator maximizes the likelihood of the observed sample $t_1,\dots,t_n$: \begin{align*} L(\phi) &= \prod\limits_{i=1}^n \phi e^{-\phi t_i} = \phi^n e^{-\phi \sum_i t_i} \end{align*}
It is easier to maximize the log-likelihood, \begin{align*} \ell(\phi) &= \ln L(\phi) = n\ln\phi - \phi\sum\limits_{i=1}^n t_i \end{align*}
To get $\widehat\phi$, we set the derivative of the above expression equal to 0, giving us: \begin{align*} \frac{n}{\widehat\phi} - \sum\limits_i t_i &= 0\\ \widehat\phi &= \frac{n}{\sum\limits_i t_i} \end{align*} (The second derivative $-n/\phi^2$ is negative, so this critical point is indeed a maximum.)
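The closed-form MLE can be verified numerically by brute-force maximizing the log-likelihood over a grid (a hedged Python sketch; the grid bounds here are arbitrary assumptions):

```python
import numpy as np

# Maximize l(phi) = n*log(phi) - phi*sum(t_i) over a dense grid of
# candidate phi values, and compare with the closed form n / sum(t_i).
t = np.array([2.0, 1.4, 2.0, 0.5, 0.7, 2.0, 1.3, 1.1, 1.8, 0.2])
n = len(t)

phis = np.linspace(0.01, 5.0, 500_000)       # candidate phi values (assumed range)
log_lik = n * np.log(phis) - phis * t.sum()  # log-likelihood at each candidate
phi_grid = phis[np.argmax(log_lik)]          # numeric maximizer

phi_closed = n / t.sum()                     # closed-form MLE
print(phi_grid, phi_closed)                  # both near 10/13
```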
Part 2: Is the MLE Biased? Yes, it is biased. To see this we need to invoke the gamma distribution, since we want to sum $n$ exponential random variables (note that the exponential is the special case of the gamma distribution with shape parameter $1$).
Our steps will be as follows: we will first assume a true parameter $\phi$ and generate $n$ samples. The sum $\sum_i X_i$ of these follows a gamma distribution. Since the MLE is $\frac{n}{\sum_i X_i}$, we need the distribution of $\frac{1}{\sum_i X_i}$, which follows the inverse gamma distribution. We take the expectation of this inverse gamma distribution and compare it with the $\phi$ we started with, for a fixed number of samples $n$.
Let us first assume a parameter $\phi$ to generate the samples, and see whether the MLE recovers $\phi$ in expectation after $n$ samples. The distributions are: \begin{array}{c|c} \text{Random variable} & \text{Distribution}\\ \hline X_i & \text{Exp}(\phi) = \text{Gamma}(1,\phi)\\ \hline \sum\limits_{i=1}^n X_i & \text{Gamma}(n,\phi)\\ \hline \frac{1}{\sum\limits_{i=1}^n X_i} & \text{InvGamma}(n,\phi) \end{array} Now the expectation of the $\text{InvGamma}(n,\phi)$ distribution is $\frac{\phi}{n-1}$ (for $n>1$). Therefore, taking expectations on both sides, \begin{align*} \mathbb E \left[\frac{1}{\sum\limits_{i=1}^n X_i}\right] &= \frac{\phi}{n-1}\\ \frac{1}{n} \mathbb E \left[\frac{n}{\sum\limits_{i=1}^n X_i}\right] &= \frac{\phi}{n-1}\\ \end{align*} But $\mathbb E \left[\frac{n}{\sum\limits_{i=1}^n X_i}\right] = \mathbb E [\widehat{\phi}]$. Therefore, \begin{align*} \frac{1}{n} \mathbb E [\widehat{\phi}] &= \frac{\phi}{n-1}\\ \mathbb E [\widehat{\phi}] &= \frac{n\phi}{n-1}\\&\neq \phi\\ \end{align*}
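The bias can also be seen by simulation (a Monte Carlo sketch in Python; the true rate $2.0$ and trial count are assumed for illustration):

```python
import numpy as np

# Monte Carlo check of the bias: with true rate phi and n samples,
# the average of phi_hat should be near n*phi/(n-1), not phi itself.
rng = np.random.default_rng(0)

phi = 2.0        # assumed true rate
n = 10           # sample size, as in the question
trials = 200_000

# NumPy parameterizes the exponential by its mean, i.e. scale = 1/phi.
samples = rng.exponential(scale=1.0 / phi, size=(trials, n))
phi_hats = n / samples.sum(axis=1)

print(phi_hats.mean())   # close to n*phi/(n-1) = 20/9, not 2.0
```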
Therefore $\widehat{\phi}$ is not an unbiased estimator of $\phi$. (Note, however, that the rescaled estimator $\frac{n-1}{n}\widehat{\phi}$ is unbiased.)