Estimator of a Random Variable


Given a random variable $Y$ where

$$ f_Y(y) = \begin{cases}e^{-(y-k)} \quad y>k\\0\quad \text{otherwise}\end{cases} $$ and given $n$ observations of $Y$: is the sample mean $\bar{Y}$ an unbiased estimator of $k$? Is it a consistent estimator of $k$?

I have thought about it this way:

$$
\begin{align}
\mathbb E(\bar{Y}) &= \mathbb E\left[\frac 1n \sum_i y_i\right]\\
&=\frac 1n \sum_i \mathbb E (y_i)
\end{align}
$$

Consider

$$
\begin{align}
\mathbb E(y_i) &= \int_k^\infty y e^{-(y-k)}\, dy\\
&=e^k\int_k^\infty y e^{-y}\, dy\\
&=-e^k\left[ye^{-y}+e^{-y}\right]^\infty_k\\
&=e^k(ke^k+e^{-k})\\
&= e^{2k}(k+1)
\end{align}
$$

and continuing

$$
\begin{align}
\mathbb E(\bar{Y}) &=\frac 1n \sum_i \mathbb E (y_i)\\
&=\frac 1n \cdot n \cdot e^{2k}(k+1)\\
&= e^{2k}(k+1) \neq k
\end{align}
$$

and hence conclude that it is not an unbiased estimator of $k$. However, since the value converges to a constant as $n \rightarrow \infty$, I thought we could say it is a consistent estimator.

Update

As pointed out in the answer, $\mathbb E(y_i)=k+1$, and so $\mathbb E (\bar Y)=k+1$.
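A quick Monte Carlo sketch supports the corrected value $\mathbb E(\bar Y)=k+1$. This uses NumPy; the shift $k = 2.0$ and sample size are arbitrary choices, and the shifted exponential is simulated as a standard $\text{Exp}(1)$ draw plus $k$:

```python
import numpy as np

# Monte Carlo check that E[Y-bar] = k + 1 for the shifted exponential
# f_Y(y) = exp(-(y - k)) for y > k.  The shift k = 2.0 is arbitrary.
rng = np.random.default_rng(0)
k = 2.0
n = 100_000

# A shifted exponential is a standard Exponential(1) plus the shift k
y = rng.exponential(scale=1.0, size=n) + k

y_bar = y.mean()
print(f"sample mean      : {y_bar:.4f}")
print(f"expected (k + 1) : {k + 1:.4f}")
```

The sample mean lands near $k+1 = 3$, not near $k = 2$, illustrating the bias of $\bar Y$ as an estimator of $k$.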

Best answer

You made a sign error calculating the expectation of $y_i$; it should come out to $k+1$. Notice that this is an exponential distribution shifted to the right by $k$.

It is obviously not consistent, as $\bar Y$ converges to $k+1$, not to $k$.

You can always subtract $1$ from the sample mean to get an unbiased estimator, but the result may be bigger than the minimum of your data, making it useless as an estimate of $k$. The maximum likelihood estimator is the minimum of the $y_i$, if you are interested.
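A simulation sketch comparing the bias-corrected mean $\bar Y - 1$ with the MLE $\min_i y_i$; again the choices $k = 2.0$, $n = 50$, and the number of trials are arbitrary illustrations:

```python
import numpy as np

# Compare two estimators of the shift k of a shifted exponential:
#   est_mean = Y-bar - 1     (unbiased, but can exceed min(y_i))
#   est_mle  = min(y_i)      (MLE: biased high by 1/n, but consistent)
rng = np.random.default_rng(1)
k, n, trials = 2.0, 50, 10_000

# Each row is one sample of n observations
samples = rng.exponential(scale=1.0, size=(trials, n)) + k

est_mean = samples.mean(axis=1) - 1.0
est_mle = samples.min(axis=1)

print(f"mean of (Y-bar - 1): {est_mean.mean():.4f}  (target k = {k})")
print(f"mean of min(y_i)   : {est_mle.mean():.4f}  (approx k + 1/n = {k + 1/n:.4f})")
```

The bias-corrected mean averages to $k$, while the MLE averages to roughly $k + 1/n$ (the minimum of $n$ shifted exponentials is itself $k$ plus an $\text{Exp}(n)$ variable), so its bias vanishes as $n$ grows.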

The MLE is consistent but biased. Write down the likelihood function and try it. Note the likelihood contribution of a single observation should be written as

$e^{k-y}1_{y>k}$
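The claim that the MLE is $\min_i y_i$ follows directly: for $n$ observations the likelihood is

$$ L(k) = \prod_{i=1}^n e^{k-y_i}\,1_{y_i>k} = e^{nk-\sum_i y_i}\,1_{k<\min_i y_i}, $$

which is increasing in $k$ on $(-\infty,\min_i y_i)$ and zero beyond it, so it is maximized at $\hat k = \min_i y_i$.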
