Estimator problem: can Chebyshev's inequality be applied to a sequence of estimators?


Let $g: \Theta \to \mathbb R$ be a function of an unknown parameter $\theta$. We want to estimate $g$. Let $\hat g_n$ be a sequence of estimators for $g(\theta)$ such that for all $\theta \in \Theta$: $$\lim_{n\to\infty} E[\hat g_n]=g(\theta) \hspace{1cm} \text{ and} \hspace{1cm} \lim_{n\to\infty} \text{Var}(\hat g_n)=0$$

Show that the sequence $\hat g_n$ is consistent.

My attempt: I was thinking of using Chebyshev's inequality here, but I was not sure whether it can be applied to sequences like this. For $\hat g_n$ to be consistent, I need to show that for every $\varepsilon > 0$:

$$\lim_{n \to \infty} \Pr(\vert \hat g_n-g \vert > \varepsilon )=0$$

I am given that $\hat g_n$ has expected value $g$, so I can use Chebyshev's inequality:

$$ \begin{split}&\phantom{\implies} \, \, \Pr(\vert X-E[X]\vert > \varepsilon)\le\frac{\text{Var}(X)}{\varepsilon^2} \\[10pt] & \implies \Pr(\vert \hat g_n-g\vert > \varepsilon) \le \frac{\text{Var}(\hat g_n)}{\varepsilon^2} \\[10pt] & \implies \lim_{n \to \infty}\Pr(\vert \hat g_n-g\vert > \varepsilon) \le \lim_{n \to \infty} \frac{\text{Var}(\hat g_n)}{\varepsilon^2}=\frac{1}{\varepsilon^2}\lim_{n\to \infty}\text{Var}(\hat g_n)=0\end{split}$$

Is this the correct way to do this, or am I "not allowed" to use Chebyshev's inequality here?
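As a numerical sanity check of the bound in my attempt, here is a small sketch using a concrete example of my own choosing (not part of the problem): $X_i \sim \text{Uniform}(0,1)$, $g(\theta) = E[X_1] = 0.5$, and $\hat g_n$ the sample mean, so $\text{Var}(\hat g_n) = \frac{1}{12n}$ exactly. The empirical frequency of $\vert \hat g_n - g \vert > \varepsilon$ should sit below the Chebyshev bound $\text{Var}(\hat g_n)/\varepsilon^2$:

```python
import random

random.seed(0)

# Assumed toy example: X_i ~ Uniform(0, 1), g(theta) = 0.5,
# g_hat_n = sample mean, so Var(g_hat_n) = (1/12)/n exactly.
def g_hat(n):
    return sum(random.random() for _ in range(n)) / n

n, eps, trials = 100, 0.05, 10000
var_gn = (1 / 12) / n            # exact variance of the sample mean
cheb_bound = var_gn / eps ** 2   # Chebyshev: P(|g_hat_n - 0.5| > eps) <= this

# Monte Carlo estimate of the left-hand side of the bound
empirical = sum(abs(g_hat(n) - 0.5) > eps for _ in range(trials)) / trials
print(f"Chebyshev bound: {cheb_bound:.4f}  empirical frequency: {empirical:.4f}")
```

The Chebyshev bound is typically loose; the empirical frequency comes in well under it, which is consistent with the inequality but does not by itself settle whether the proof step is valid.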

BEST ANSWER

You can use Chebyshev's inequality, but I suspect not quite like that, as you do not know that $E[\hat g_n]=g$ for any finite $n$.

Instead, perhaps you could say something like this: for any $\varepsilon>0$ and any $\delta >0$ there is

  • an $N_1$ such that $|E[\hat g_n]-g| \lt \frac12{\varepsilon} $ for all $n> N_1$
  • an $N_2$ such that $\text{Var}(\hat g_n) \lt \frac14\varepsilon^2 \delta $ for all $n> N_2$
  • an $N_0 = \max(N_1,N_2)$ such that both are true for all $n> N_0$
  • $\Pr(\vert \hat g_n-E[\hat g_n]\vert > \frac12\varepsilon) \le \frac{\text{Var}(\hat g_n)}{\frac14\varepsilon^2} \lt \frac{\frac14{\varepsilon^2 \delta}}{\frac14\varepsilon^2} = \delta$ for all $n> N_0$, by Chebyshev's inequality
  • $\Pr(\vert \hat g_n-g\vert > \varepsilon) \lt \delta$ for all $n> N_0$, since $\vert \hat g_n-g\vert \le \vert \hat g_n-E[\hat g_n]\vert +\vert E[\hat g_n]-g\vert$, so for $n > N_0$ the event $\vert \hat g_n-g\vert > \varepsilon$ forces $\vert \hat g_n-E[\hat g_n]\vert > \frac12\varepsilon$

and since $\delta>0$ was arbitrary, this gives you $\lim\limits_{n \to \infty} \Pr(\vert \hat g_n-g \vert > \varepsilon )=0$, i.e. convergence of $\hat g_n$ to $g(\theta)$ in probability, which is consistency.
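To illustrate the situation the answer is addressing, here is a simulation sketch with an assumed toy estimator (my invention, not from the question): for $X_i \sim \text{Uniform}(0,1)$, take $\hat g_n = \bar X_n + \frac1n$. Then $E[\hat g_n] = g + \frac1n \to g$ and $\text{Var}(\hat g_n) = \frac{1}{12n} \to 0$, yet $E[\hat g_n] \ne g$ for every finite $n$, so the naive centering at $g$ fails but the $\varepsilon/2$ argument above still applies:

```python
import random

random.seed(1)

# Assumed toy estimator: g_hat_n = (sample mean of Uniform(0,1) draws) + 1/n.
# E[g_hat_n] = 0.5 + 1/n -> 0.5 = g and Var(g_hat_n) = (1/12)/n -> 0,
# but E[g_hat_n] != g for every finite n.
g = 0.5

def g_hat(n):
    return sum(random.random() for _ in range(n)) / n + 1 / n

eps, trials = 0.05, 2000
probs = []
for n in (10, 100, 1000):
    # Monte Carlo estimate of P(|g_hat_n - g| > eps)
    p = sum(abs(g_hat(n) - g) > eps for _ in range(trials)) / trials
    probs.append(p)
    print(f"n = {n:4d}   P(|g_hat_n - g| > {eps}) ~ {p:.3f}")
```

The estimated probabilities shrink toward zero as $n$ grows, matching the convergence-in-probability conclusion.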