The central limit theorem is, as far as I know, a statement about the probability distribution of the average ${\sum (X_i)_n\ /\ n}$ of $n$ iid random variables $(X_i)_n$, where $(X_i)_n:=(X_1, \ldots, X_n)$ and $\sum (X_i)_n:=X_1+\cdots+X_n$.
Both the average ${\sum (X_i)_n\ /\ n}$ and the sum ${\sum (X_i)_n}$ of $n$ iid random variables are approximately normally distributed for large $n$; more precisely, after suitable centering and scaling, each converges in distribution to a normal distribution $\mathcal{N}$.
More precisely, let $(X_i)_n$ be a sequence of $n$ independent random variables (not necessarily identically distributed, since I allow the means and variances to vary), let $D[\mu,\nu]$ be a probability distribution with mean $\mu$ and variance $\nu$, let $\sum (X_i)_n$ be the sum of all $X_i$, and let $\textbf{A}(X_i)_n$ be the average of all $X_i$ (where the average of an $n$-tuple is the sum divided by $n$). The following hold, at least approximately for large $n$ (I think):
- If $X_i \sim D[\mu_i,\nu_i]$, then $\sum(X_i)_n \sim \mathcal{N}[\sum(\mu_i)_n, \sum(\nu_i)_n]$
- If $X_i \sim D[\mu_i,\nu_i]$, then $\textbf{A}(X_i)_n \sim \mathcal{N}[\textbf{A}(\mu_i)_n, \textbf{A}(\nu_i)_n]$
where $\mathcal{N}[\mu,\nu]$ stands for a normal distribution with mean $\mu$ and variance $\nu$.
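To sanity-check the mean/variance bookkeeping in these two claims (not the normality itself), here is a minimal simulation; the exponential distribution, $n = 30$, and the trial count are arbitrary choices of mine:

```python
# Sanity check of the moments in the two claims above, using
# Exp(1) summands, so each mu_i = nu_i = 1. The distribution, n,
# and trial count are illustrative choices, not from the claims.
import random
import statistics

random.seed(0)
n = 30            # summands per trial
trials = 20_000   # Monte Carlo replications

sums = []
for _ in range(trials):
    xs = [random.expovariate(1.0) for _ in range(n)]
    sums.append(sum(xs))
avgs = [s / n for s in sums]

mean_sum, var_sum = statistics.fmean(sums), statistics.pvariance(sums)
mean_avg, var_avg = statistics.fmean(avgs), statistics.pvariance(avgs)

print(mean_sum, var_sum)  # should be near sum(mu_i) = 30 and sum(nu_i) = 30
print(mean_avg, var_avg)  # should be near A(mu_i) = 1 and A(nu_i)/n = 1/30
```

With $\mathrm{Exp}(1)$ summands the sum's empirical mean and variance land near $\sum(\mu_i)_n = \sum(\nu_i)_n = 30$, and the average's near $1$ and $1/30$ (note the $1/n$ factor on the variance of the average, which the second claim as written glosses over).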
Assuming these results are true, both have a "central limit flavor" to them. So why not refer to either of them as "the" central limit theorem?
In any case, the central limit theorem seems to be a result about linear operators ($\sum$ and $\textbf{A}$, in this case) acting on vectors of random variables.
- Why is the central limit theorem not usually seen as a result about sums? Sums and averages seem almost the same to me, but sums are simpler.
- More importantly, is there a general theory that studies (linear, or otherwise) operators acting on vectors of random variables?
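As a further numerical check of the averaging claim, one can standardize $\textbf{A}(X_i)_n$ and compare a tail probability against the standard normal CDF; again the exponential distribution, $n = 50$, and the trial count are arbitrary choices of mine:

```python
# Monte Carlo check that the standardized average of n iid variables
# is approximately standard normal: estimate Pr(Z_n <= 1) and compare
# with Phi(1). Uses Exp(1) summands (mu = sigma = 1); n and the trial
# count are illustrative choices.
import math
import random

random.seed(1)
n, trials = 50, 20_000
mu = sigma = 1.0  # mean and standard deviation of Exp(1)

hits = 0
for _ in range(trials):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    z = (xbar - mu) / (sigma / math.sqrt(n))  # standardized average
    if z <= 1.0:
        hits += 1

p_hat = hits / trials
phi_1 = 0.5 * (1.0 + math.erf(1.0 / math.sqrt(2.0)))  # Phi(1) ~ 0.8413
print(p_hat, phi_1)
```

The estimated probability should land close to $\Phi(1)\approx 0.8413$, consistent with the standardized average being approximately standard normal.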
Suppose $X_1,X_2,X_3,\ldots$ is an i.i.d. sequence with expected value $\mu$ and variance $\sigma^2<\infty,$ and let $\overline{X}_n = (X_1+\cdots+X_n)/n.$ Then, for every interval $A$, $$ \lim_{n\to\infty} \Pr\left( \frac{\overline X_n - \mu} {\sigma/\sqrt n} \in A \right) = \int_A \varphi(x)\,dx $$ where $\varphi$ is the standard normal density function. That is the central limit theorem. Note that