How to show consistency of an estimator


PDF of a random variable $X$ is, $$ \begin{equation} f\left(x|\gamma\right)= \begin{cases} e^{-(x-\gamma)} & x\geq\gamma\\ 0 & \text{otherwise.} \end{cases} \end{equation} $$

Given a sample of $n$ i.i.d. observations $\mathbf{x}=$ $\left(x_{1}, \ldots, x_{n}\right)$ from this distribution,

(a) Show that the estimator $\hat{\gamma} = \overline{X} - 1$ (where $\overline{X} = \frac{1}{n}\sum_{i=1}^n X_i$) is an unbiased and consistent estimator for the given distribution.

Try:

\begin{align*} E[X_1] &= \int_{\gamma}^{\infty} x\, e^{-(x-\gamma)}\,dx \stackrel{u = x-\gamma}{=} \int_{0}^{\infty} (u+\gamma)\, e^{-u}\,du = \gamma + 1 \end{align*}

\begin{align*} \mathbb E[\hat{\gamma}] &= \mathbb E[\overline{X} - 1]\\ &= \mathbb E\left[\frac{1}{n}\sum_{i=1}^n X_i\right]-1\\ &= \frac{1}{n} \sum_{i=1}^n \mathbb E[X_i] -1 \\ &= \frac{1}{n} (n\gamma+ n) -1\\ &= \gamma \end{align*}
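The unbiasedness claim above can be sanity-checked by simulation. This is a minimal sketch (not part of the proof): it draws samples as $X = \gamma + \mathrm{Exp}(1)$, which has exactly the pdf $e^{-(x-\gamma)}$ for $x \ge \gamma$, and averages many replications of $\hat{\gamma} = \overline{X} - 1$; the choices $\gamma = 2$, $n = 10$ are arbitrary.

```python
import numpy as np

# Monte Carlo check of unbiasedness: average many draws of
# gamma_hat = X_bar - 1 and compare with the true gamma.
rng = np.random.default_rng(0)
gamma, n, reps = 2.0, 10, 20_000

# X = gamma + Exp(1) has pdf e^{-(x - gamma)} for x >= gamma
x = gamma + rng.exponential(1.0, size=(reps, n))
gamma_hat = x.mean(axis=1) - 1.0
print(gamma_hat.mean())  # should be close to gamma = 2.0
```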

Is it enough to say the estimator is unbiased? I couldn't understand how to show consistency. The definition is,

An estimator $T_n$ of parameter $\theta$ is said to be consistent if it converges in probability to the true value of the parameter: $$T_{n}\stackrel{p}{\longrightarrow}\theta \quad \text{as } n \to \infty,$$ i.e. if, for all $\epsilon>0$, $$\lim_{n\to \infty }P(|T_n-\theta|>\epsilon)=0.$$

The definition (convergence in probability) is quite involved. I guess Chebyshev's inequality can play a role here, but I couldn't make the link.

Any help will be appreciated. TIA.


Your computations for the unbiasedness are correct. To show consistency we use Chebyshev's inequality.

First we observe that $\text{Var}(X_i)=1$ (the shifted exponential has the same variance as an $\text{Exp}(1)$ variable), so $\text{Var}(\hat{\gamma})=\text{Var}(\bar{X}_n-1)=\text{Var}(\bar{X}_n)=1/n$. Then (here $T_n = \hat{\gamma}$ and $\theta = \gamma$)

$$ \mathbb{P}(\mid \hat{\gamma} - \gamma \mid > \epsilon) = \mathbb{P}(\mid \hat{\gamma} - \mathbb{E}(\hat{\gamma}) \mid > \epsilon) \le \frac{\text{Var}(\hat{\gamma})}{\epsilon^2} = \frac{1}{n\epsilon^2} $$

which indeed tends to zero when $n \to \infty$.
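The shrinking probability can also be seen numerically. The sketch below (illustrative, with arbitrary choices $\gamma = 2$, $\epsilon = 0.5$) estimates $P(|\hat{\gamma} - \gamma| > \epsilon)$ by Monte Carlo for increasing $n$ and compares it with the Chebyshev bound $1/(n\epsilon^2)$:

```python
import numpy as np

# Monte Carlo check of consistency: the empirical probability
# P(|gamma_hat - gamma| > eps) should stay below the Chebyshev
# bound 1/(n * eps^2) and shrink as n grows.
rng = np.random.default_rng(1)
gamma, eps, reps = 2.0, 0.5, 10_000

results = {}
for n in (10, 100, 1000):
    # reps independent samples of size n, X = gamma + Exp(1)
    x = gamma + rng.exponential(1.0, size=(reps, n))
    gamma_hat = x.mean(axis=1) - 1.0
    p = np.mean(np.abs(gamma_hat - gamma) > eps)
    bound = 1.0 / (n * eps**2)
    results[n] = (p, bound)
    print(f"n={n}: empirical prob={p:.4f}, Chebyshev bound={bound:.4f}")
```

The empirical probabilities decay much faster than the bound, which is expected: Chebyshev's inequality is loose, but a vanishing upper bound is all the consistency proof requires.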

Proof of Chebyshev's Inequality

Let $Y:=(X - \mathbb{E}(X))^2$ and $\epsilon > 0$. Since $Y$ is a nonnegative random variable we can apply Markov's inequality to it, so

$$ \mathbb{P}(Y \ge \epsilon^2) \le \frac{\mathbb{E}(Y)}{\epsilon^2} = \frac{\mathbb{E}[(X - \mathbb{E}(X))^2]}{\epsilon^2} = \frac{\text{Var}(X)}{\epsilon^2} $$

Finally we obtain the desired inequality noticing that

$$ \mathbb{P}(Y \ge \epsilon^2) = \mathbb{P}((X - \mathbb{E}(X))^2 \ge \epsilon^2) = \mathbb{P}(|X - \mathbb{E}(X)| \ge \epsilon) $$