Sum of Independent, Mean Zero, Finite Variance Random Variables


Let $X_1, X_2, \dots, X_n$ be a sequence of random variables:

  • Each has mean zero: $\mathbb{E}[X_i] = 0$ for all $1 \leq i \leq n$.
  • All are independent.
  • All variances are uniformly bounded: $\text{Var}(X_i) \leq \rho_{\max}$ for all $i$.
  • They are all discrete.
  • The support of each is a subset of $[-\rho_{\max}, +\rho_{\max}]$.

The last two points might be irrelevant. In any case, I want to analyze \begin{align} \mathbb{P}\left(\sum_{i=1}^n X_i \in [-\epsilon, 0]\right) \tag{1} \end{align} where $0 < \epsilon < \rho_{\max}$.
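For concreteness, here is a quick Monte Carlo sketch of $(1)$ for one admissible family: i.i.d. Rademacher variables with $\rho_{\max} = 1$ and $\epsilon = 0.5$, so $\mathbb{E}[X_i] = 0$, $\text{Var}(X_i) = 1$, and support $\{-1, +1\} \subset [-1, +1]$. The distribution choice and the helper name `estimate_prob` are illustrative, not part of the question:

```python
import random

def estimate_prob(n, eps=0.5, trials=200_000, seed=0):
    """Monte Carlo estimate of P(X_1 + ... + X_n in [-eps, 0]) for
    i.i.d. Rademacher variables: P(X_i = +1) = P(X_i = -1) = 1/2,
    so E[X_i] = 0, Var(X_i) = 1, support {-1, +1}."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # n fair coin flips: popcount of n random bits = number of +1's.
        plus = bin(rng.getrandbits(n)).count("1")
        s = 2 * plus - n  # sum of the n Rademacher variables
        # The sum is an integer, so for 0 < eps < 1 the event is {s == 0}.
        if -eps <= s <= 0:
            hits += 1
    return hits / trials
```

For $n = 100$ the estimate lands near the exact value $\binom{100}{50} 2^{-100} \approx 0.0796$, which is already suggestive of $\frac{1}{\sqrt{n}}$ behavior for this particular family.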

Here's some intuition about what the answer should be:

The larger $n$ is, the more spread out the sum becomes: by independence, $\text{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \text{Var}(X_i) \leq n \rho_{\max}$. Therefore, $(1)$ should decay with $n$. Specifically, I am looking for $\frac{1}{\sqrt{n}}$ decay.

Also, I want the answer to hold uniformly over all sequences of random variables fitting the description. In other words, if the answer is $\frac{c}{\sqrt{n}}$, I should be able to say that for sufficiently large $n$,

\begin{align} \mathbb{P}\left(\sum_{i=1}^n X_i \in [-\epsilon, 0]\right) \leq \frac{c}{\sqrt{n}} \end{align}

holds for every sequence $X_1, X_2, \dots, X_n$ of independent, zero-mean random variables with uniformly bounded variance.
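Under the conjectured decay, $\sqrt{n}$ times the probability should stabilize as $n$ grows. A small exact check of that scaling, again for the illustrative Rademacher case with $\rho_{\max} = 1$, $\epsilon = 0.5$ (the function name `prob_in_interval` is mine, and the limiting constant $\sqrt{2/\pi}$ is specific to this distribution, not the general setting):

```python
from math import comb, sqrt, pi

def prob_in_interval(n, eps=0.5):
    """Exact P(X_1 + ... + X_n in [-eps, 0]) for i.i.d. Rademacher X_i.
    The sum is an integer with the same parity as n, so for even n and
    0 < eps < 1 the event reduces to {sum == 0}, which has probability
    C(n, n/2) / 2^n (choose which n/2 of the variables are +1)."""
    assert n % 2 == 0 and 0 < eps < 1
    return comb(n, n // 2) / 2 ** n

# sqrt(n) * probability approaches sqrt(2/pi) ~ 0.798 (Stirling's formula,
# or the local central limit theorem), so the probability itself decays
# like 1/sqrt(n) for this particular family.
for n in (100, 400, 1600):
    print(n, round(sqrt(n) * prob_in_interval(n), 4))
```

This only confirms the conjecture for one family; the uniform statement over all admissible sequences is exactly what the question is asking about.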

Are there any existing theorems to analyze such a sum?

Edit: the random variables are not identically distributed.