Precise meaning/definition of "rate of convergence"


In mathematical statistics, people often say the MLE's rate of convergence is $\sqrt{n}$, because $\sqrt{n}(\hat{\theta}_n-\theta_0) \to N(0,V)$ in distribution. But is there an official definition of "rate of convergence"? Is it simply the sequence $f(n)$ by which you multiply a vanishing sequence of random variables (one that goes to $0$ in probability) so that the product neither degenerates to $0$ nor blows up to infinity? And is there a proof that such an $f(n)$ is unique, up to multiplication or addition of a constant?
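As a quick, non-authoritative sanity check of this reading (a sketch, assuming NumPy), one can take the sample mean — the MLE of the mean in a $N(\theta_0, 1)$ model — and look at the empirical spread of $n^a(\hat{\theta}_n - \theta_0)$ for several exponents $a$. Only $a = 1/2$ keeps the spread stable as $n$ grows; smaller exponents let it shrink to $0$, larger ones let it blow up:

```python
import numpy as np

rng = np.random.default_rng(0)
theta0, reps = 2.0, 5_000

def scaled_spread(n, a):
    """Empirical std. dev. of n**a * (theta_hat_n - theta0), where
    theta_hat_n is the MLE (the sample mean) for N(theta0, 1) data."""
    theta_hat = rng.normal(theta0, 1.0, size=(reps, n)).mean(axis=1)
    return np.std(n**a * (theta_hat - theta0))

for a in (0.25, 0.5, 0.75):
    # Spread at n = 100 vs. n = 2500: shrinks for a < 1/2,
    # stays near 1 for a = 1/2, grows for a > 1/2.
    print(f"a={a}:", scaled_spread(100, a), scaled_spread(2_500, a))
```

Here the exact distribution is $n^a(\hat{\theta}_n-\theta_0) \sim N(0, n^{2a-1})$, so the simulation just makes the $a = 1/2$ boundary visible.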

Best Answer

Suppose $f(n) X_n$ converges in distribution.

If $g(n)/f(n) \to 0$, then $$g(n) X_n = \frac{g(n)}{f(n)}\, f(n) X_n$$ converges in distribution to $0$ by Slutsky's theorem. Similarly, if $g(n)/f(n) \to \infty$ and the limit of $f(n) X_n$ is not degenerate at $0$, then $g(n) X_n$ is not tight — its mass escapes to infinity — so it cannot converge in distribution. (Non-degeneracy is needed: if $f(n) X_n \to 0$ in probability, a faster-growing $g(n)$ could still yield a convergent $g(n) X_n$.)
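A small simulation illustrating the two regimes (a sketch, assuming NumPy; here $X_n = \hat{\theta}_n - \theta_0$ is drawn directly as $N(0, 1/n)$, which is exact when $\hat{\theta}_n$ is the sample mean of $N(\theta_0, 1)$ data, so $f(n) = \sqrt{n}$ is the stabilizing rate):

```python
import numpy as np

rng = np.random.default_rng(1)

def Xn(n, reps=5_000):
    # Draws of X_n with sqrt(n) * X_n ~ N(0, 1) exactly,
    # i.e. X_n ~ N(0, 1/n).
    return rng.normal(0.0, 1.0, reps) / np.sqrt(n)

for n in (100, 10_000, 1_000_000):
    slow = n**0.25 * Xn(n)   # g(n)/f(n) = n**-0.25 -> 0: collapses to 0
    fast = n**0.75 * Xn(n)   # g(n)/f(n) = n**0.25 -> inf: not tight
    # Fraction of |slow| above 0.1 shrinks toward 0;
    # the 90% quantile of |fast| grows without bound.
    print(n, np.mean(np.abs(slow) > 0.1), np.quantile(np.abs(fast), 0.9))
```

The shrinking exceedance probability is the Slutsky direction ($g(n)X_n \to 0$), and the diverging quantile is the loss of tightness in the $g(n)/f(n) \to \infty$ direction.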

I do not know whether there is a formally agreed-upon definition of "rate of convergence."