Consistent estimator of function of moments up to given order


Assume the moments of a distribution exist up to order $n$. Consider the estimator $$t(x) = f(m_1(x), \ldots, m_n(x))$$ where $m_k(x) := \frac{1}{r}\sum_{i=1}^r x_i^k$. We assume we have $r$ samples from the distribution and that $f$ is a continuous function mapping to $\mathbb{R}$.

I want to show that this estimator $t(x)$ is consistent for $f(e_1, \ldots, e_n)$, where $e_k := E[X^k]$.

I thought about using the definition directly. But there is an alternative: if an estimator is unbiased and its variance tends to zero, then it is consistent.

But how can I see that? I know nothing about the function $f$. Can anybody give me a hint?

Best answer:

Your approach will not work, since you aren't even guaranteed that $f(m_1, \ldots, m_n)$ has finite variance. Instead, consider the following.

By the weak law of large numbers, each of your sample moments is consistent, i.e. $$m_k = \frac{1}{r}\sum_{j=1}^r x_j^k \to E(X^k)$$ in probability as $r \to \infty$, for each $k = 1, \ldots, n$. Thus, by the continuous mapping theorem we conclude that $$f(m_1, \ldots, m_n) \to f(e_1, \ldots, e_n)$$ in probability as $r \to \infty$.
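To see this numerically, here is a small simulation (not part of the original argument) using the concrete choice $f(m_1, m_2) = m_2 - m_1^2$, a continuous function of the first two sample moments that estimates $\mathrm{Var}(X)$. The exponential distribution with rate 1 is chosen arbitrarily as an example; it has $e_1 = 1$ and $e_2 = 2$, so $f(e_1, e_2) = 1$.

```python
import numpy as np

rng = np.random.default_rng(0)

def t(x):
    """Plug the sample moments m1, m2 into f(m1, m2) = m2 - m1^2."""
    m1 = np.mean(x)       # first sample moment
    m2 = np.mean(x**2)    # second sample moment
    return m2 - m1**2

# Exponential(1): E[X] = 1, E[X^2] = 2, so f(e1, e2) = 2 - 1 = 1.
for r in (10**2, 10**4, 10**6):
    x = rng.exponential(scale=1.0, size=r)
    print(r, t(x))  # should approach 1 as r grows
```

As $r$ increases, the printed values cluster ever more tightly around $f(e_1, e_2) = 1$, illustrating convergence in probability.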