Biased MLE estimate of mean (expectation)


Please give an example of a p.m.f. or p.d.f. for which the maximum likelihood estimate of the mean (expectation) is a biased estimator. Thanks.


There are 2 answers below.


Hint: consider $$f_u(x) = \begin{cases} \frac 1u & \text{ if } 0< x < u \\ 0 & \text{ otherwise. } \end{cases} $$

$$ \hat u_{ml} = \max_k X_k, \qquad \max_k X_k < u \text{ a.s.} \implies E\hat u_{ml} < u. $$

Formal proof: since $P(X \le x) = x/u$ for $0 \le x \le u$, $$ E\hat u_{ml} = \int_0^u P(\max_k X_k > x)\, dx = \int_0^u \left[1 - P(\max_k X_k \le x)\right] dx = \int_0^u \left[1 - \left(\frac xu\right)^n\right] dx = u - \frac{u}{n+1} = \frac{n}{n+1}\, u < u. $$
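As a quick numerical sanity check (not part of the original answer), the bias $E\hat u_{ml} = \frac{n}{n+1}u$ can be verified by simulation; the sample size $n=5$, bound $u=1$, and trial count below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
u, n, trials = 1.0, 5, 200_000

# Each row is one sample of size n from Uniform(0, u)
samples = rng.uniform(0.0, u, size=(trials, n))

# MLE of u for each sample is the sample maximum
mle = samples.max(axis=1)

# Empirical mean of the MLE vs. the theoretical value u * n/(n+1)
print(mle.mean())        # close to 5/6 ≈ 0.8333, strictly below u = 1
```

With 200,000 trials the Monte Carlo error is on the order of $10^{-3}$, so the downward bias of about $u/(n+1)$ is clearly visible.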


One of the most classical examples might be the following: assume that $(X_i)_{1\leqslant i\leqslant n}$ are i.i.d. exponential with parameter $\lambda$; then the likelihood is $$\ell(x_1,\ldots,x_n)=\lambda^n\mathrm e^{-\lambda(x_1+\cdots+x_n)},$$ hence the MLE for $\lambda$ is $$\hat\lambda=\frac{n}{X_1+\cdots+X_n}.$$ For every $i$, $E(X_i)=1/\lambda$, hence, by Jensen's inequality applied to the strictly convex function $x\mapsto 1/x$, $$E(\hat\lambda)\gt\frac{n}{E(X_1)+\cdots+E(X_n)}=\lambda.$$ An explicit computation yields, for every $n\geqslant2$, $$E(\hat\lambda)=\frac{n}{n-1}\lambda.$$
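This upward bias $E(\hat\lambda)=\frac{n}{n-1}\lambda$ can likewise be checked by simulation (again not part of the original answer; $\lambda=2$, $n=5$, and the trial count are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, trials = 2.0, 5, 200_000

# Each row is one i.i.d. sample of size n from Exponential(rate=lam);
# NumPy parameterizes the exponential by its scale 1/lam.
samples = rng.exponential(scale=1.0 / lam, size=(trials, n))

# MLE of lam for each sample: n divided by the sample sum
mle = n / samples.sum(axis=1)

# Empirical mean of the MLE vs. the theoretical value lam * n/(n-1)
print(mle.mean())        # close to 2 * 5/4 = 2.5, strictly above lam = 2
```

The overshoot matches the factor $n/(n-1) = 5/4$ from the explicit computation; multiplying $\hat\lambda$ by $(n-1)/n$ removes the bias.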