I have seen the central limit theorem explained in two ways:
- If $X_{1}, \dots , X_{n}$ are i.i.d. random variables, then their sum $Y$, when standardized, will converge in distribution to the standard normal distribution as $n$ goes to infinity.
- If $X_{1}, \dots , X_{n}$ are i.i.d. random variables, then their arithmetic mean $\overline{Y}$, when standardized, will converge in distribution to the standard normal distribution as $n$ goes to infinity.
Which is the correct interpretation: the sum of i.i.d. random variables or their arithmetic mean? Or are they both variants of some overarching CLT?
Both are true.
The key word here is 'standardized'. $Y$ and $\bar{Y}$ are standardized using different means and variances, so that the limiting distribution is identical.
I.e., suppose the $X_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$.
Then it follows that $E[Y] = E[\sum X_i] = \sum E[X_i] = n\mu$
Likewise, $E[\bar{Y}] = \frac{1}{n}E[Y] = \mu$
Doing the same with the variance, using $\mathrm{Var}(cX) = c^2\,\mathrm{Var}(X)$ and the fact that variances of independent variables add, it is straightforward to see that:
$Var(Y) = n\sigma^2$ and $Var(\bar{Y}) = \frac{\sigma^2}{n}$
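These moment formulas are easy to verify by simulation. A minimal sketch in Python, where the distribution and parameters (Exponential with scale 2, so $\mu = 2$ and $\sigma^2 = 4$, with $n = 50$) are my own illustrative choices, not anything from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 200_000
mu, sigma2 = 2.0, 4.0  # Exponential(scale=2): mean 2, variance 4

# reps independent samples, each of size n
X = rng.exponential(scale=2.0, size=(reps, n))
Y = X.sum(axis=1)      # the sums
Ybar = X.mean(axis=1)  # the arithmetic means

print(Y.mean(), n * mu)         # empirical E[Y]    vs. n*mu = 100
print(Y.var(), n * sigma2)      # empirical Var(Y)  vs. n*sigma^2 = 200
print(Ybar.var(), sigma2 / n)   # empirical Var(Ybar) vs. sigma^2/n = 0.08
```

Each printed pair should agree to within simulation noise.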
Hence when we standardize we get:
$$Y:\quad Z = \frac{Y - n\mu}{\sqrt{n}\,\sigma} = \frac{Y/\sqrt{n} - \sqrt{n}\,\mu}{\sigma}$$
$$\bar{Y}:\quad Z = \frac{\bar{Y} - \mu}{\sigma/\sqrt{n}} = \frac{Y/n - \mu}{\sigma/\sqrt{n}} = \frac{Y/\sqrt{n} - \sqrt{n}\,\mu}{\sigma}$$
So standardizing either one gives the same quantity, $Z$, and the two statements are the same theorem.
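You can check this algebraic identity, and the approximate normality of $Z$, numerically. A sketch under the same illustrative assumptions as before (Exponential with scale 2, so $\mu = \sigma = 2$, $n = 50$, all my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 100_000
mu, sigma = 2.0, 2.0  # Exponential(scale=2): mean 2, standard deviation 2

X = rng.exponential(scale=2.0, size=(reps, n))
Y = X.sum(axis=1)
Ybar = Y / n

# Standardize the sum and the mean, exactly as in the two formulas above
Z_sum = (Y - n * mu) / (np.sqrt(n) * sigma)
Z_mean = (Ybar - mu) / (sigma / np.sqrt(n))

print(np.allclose(Z_sum, Z_mean))  # identical up to floating-point rounding
print(Z_sum.mean(), Z_sum.std())   # close to 0 and 1, as for a standard normal
```

The two $Z$ arrays coincide element-by-element, and their empirical mean and standard deviation sit near 0 and 1, consistent with convergence to $N(0,1)$.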