Difference between "large deviation estimate" and "moderate deviation estimate" in probability theory


I am from a physics background. Recently I have been reading a book on limit theorems in probability theory.

My question is,

What are the fundamental differences between "large deviation estimate" and "moderate deviation estimate" in probability theory?

Or, in other words, I want to know why a "large deviation estimate" is called large and a "moderate deviation estimate" is called moderate.

Accepted answer:

Let $S_n$ be the sum of $n$ i.i.d. quantities, with mean $n\mu$ and variance $n\sigma^2$, and let $T_n = (S_n-n\mu)/(\sigma \sqrt n)$. In the classical central limit theorem we study the probability of events of the form $[T_n > a]$ for constant $a$. In large deviations we study events of the form $[T_n > a \sqrt n]$. In moderate deviations we study events of the form $[T_n > a_n]$ where $a_n\to\infty$ but $a_n = o(\sqrt n)$.
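The three regimes can be seen numerically. Below is a Monte Carlo sketch of my own (not part of the answer) for sums of i.i.d. Exp(1) variables, so $\mu = \sigma = 1$; the thresholds $a$, $a\,n^{1/4}$, and $a\sqrt n$ stand in for the classical, moderate, and large regimes:

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
n, trials = 100, 200_000
mu = sigma = 1.0  # Exp(1) has mean 1 and variance 1

# Standardized sums T_n = (S_n - n*mu) / (sigma * sqrt(n))
S = rng.exponential(scale=1.0, size=(trials, n)).sum(axis=1)
T = (S - n * mu) / (sigma * sqrt(n))

a = 1.0
thresholds = {
    "classical": a,               # constant threshold: CLT regime
    "moderate": a * n ** 0.25,    # grows to infinity, but is o(sqrt(n))
    "large": a * sqrt(n),         # order sqrt(n)
}
probs = {name: (T > t).mean() for name, t in thresholds.items()}
for name, p in probs.items():
    print(f"P[T_n > {name} threshold] ~ {p:.2e}")

# In the classical regime the probability tends to the Gaussian tail 1 - Phi(a):
print("Gaussian tail 1 - Phi(1) =", 0.5 * erfc(a / sqrt(2)))
```

The classical probability stays near the Gaussian tail $1-\Phi(1)\approx 0.159$, the moderate one is small but still observable at this sample size, and the large-deviation probability is exponentially small in $n$ (here effectively zero for a plain Monte Carlo estimate).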

There is a result called Cramér's theorem, which gives an asymptotic formula covering all these cases in terms of an intricate auxiliary function, the so-called $\lambda$ function (the cumulant generating function $\lambda(t) = \log E[e^{tX}]$). In the classical theory all the detail in the $\lambda$ function disappears in a Taylor expansion. In the moderate deviations case some terms of the Taylor expansion of $\lambda$ survive, depending on the growth rate of $a_n$. And in the large deviations case no terms are ignorable, and one loses some control over the quality of the asymptotics. The complexity of the $\lambda$ function is the source of the division into moderate and large.
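As a concrete illustration (my own sketch, not from the answer): in Cramér's theorem the exponential decay rate is the Legendre transform $I(x) = \sup_t\,[tx - \lambda(t)]$ of the cumulant generating function. For $X \sim \mathrm{Exp}(1)$, where $\lambda(t) = -\log(1-t)$ for $t < 1$, this transform has the closed form $I(x) = x - 1 - \log x$, which a grid maximization reproduces:

```python
import numpy as np

def lam(t):
    """Cumulant generating function of Exp(1): log E[e^{tX}] = -log(1 - t), t < 1."""
    return -np.log(1.0 - t)

def rate_numeric(x, ts=np.linspace(-5.0, 0.999, 200_001)):
    """Legendre transform I(x) = sup_t [t*x - lambda(t)], maximized on a grid."""
    return np.max(ts * x - lam(ts))

def rate_exact(x):
    """Closed form for Exp(1): I(x) = x - 1 - log x, valid for x > 0."""
    return x - 1.0 - np.log(x)

for x in (0.5, 2.0, 3.0):
    print(f"x = {x}: numeric I(x) = {rate_numeric(x):.6f}, exact = {rate_exact(x):.6f}")
```

Cramér's theorem then says $P[S_n > nx] \approx e^{-n I(x)}$ for $x > \mu$, which is why large-deviation probabilities fall off exponentially in $n$ rather than like a Gaussian tail.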