Motivation
Let $\{X_n\}_{n \in \mathbb{N}}$ be a sequence of i.i.d. random variables with finite first moment, and let $S_n = \sum_{i=1}^n X_i$. The Law of Large Numbers gives $$ n^{-1}S_n \to \mathbb{E}[X_1] \quad \text{a.s.} $$ We can view $n^{-1}S_n$ as converging (in some sense) to $\mathbb{E}[X_1]$, a (degenerate) random variable with the same first moment as $X_1$.
Meanwhile, if we also assume a finite second moment, the Central Limit Theorem gives $$ n^{-\frac{1}{2}}\left(S_n - n\,\mathbb{E}[X_1]\right) \overset{d}{\longrightarrow} N(0,\text{Var}[X_1]), $$ so in the centered case $\mathbb{E}[X_1] = 0$ we can view $n^{-\frac{1}{2}}S_n$ as converging to a random variable with the same first and second moments as $X_1$.
The takeaway is that in both cases we can find: 1) a notion of probabilistic convergence; 2) a limiting random variable $Y$ that matches the corresponding moments of $X_1$; and 3) a proper exponent $\alpha$ for the normalization $n^{-\alpha}$ -- such that $$ n^{-\alpha}S_n \to Y $$ holds.
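As an aside (my own sketch, not part of the original post), the two scalings can be checked numerically. Here $X_i \sim \text{Uniform}(-1,1)$, so $\mathbb{E}[X_1] = 0$ and $\text{Var}[X_1] = 1/3$; the sample sizes are arbitrary choices for illustration.

```python
import math
import random

# Numerical sanity check of the two scalings, with X_i ~ Uniform(-1, 1),
# so that E[X_1] = 0 and Var[X_1] = 1/3.

random.seed(0)
n = 2_000      # length of each partial sum S_n
reps = 500     # number of independent copies of S_n

lln_avg = 0.0            # average of n^{-1} S_n over repetitions
clt_second_moment = 0.0  # average of (n^{-1/2} S_n)^2 over repetitions

for _ in range(reps):
    s = sum(random.uniform(-1.0, 1.0) for _ in range(n))
    lln_avg += s / n
    clt_second_moment += (s / math.sqrt(n)) ** 2

lln_avg /= reps
clt_second_moment /= reps

# n^{-1} S_n concentrates at E[X_1] = 0, while n^{-1/2} S_n keeps a
# nondegenerate spread with second moment near Var[X_1] = 1/3.
print(lln_avg)
print(clt_second_moment)
```

With these parameters, `lln_avg` comes out very close to $0$ while `clt_second_moment` stays near $1/3$, matching the two limit statements above.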
Question
For the same i.i.d. sequence, if we further assume that its third moment is finite, can we get an analogous result? The result would likely take the form $$ n^{-\alpha}S_n \to Y, $$ where $\alpha$ is some positive number, $Y$ is a random variable with the same first three moments as $X_1$, and the mode of convergence is some well-defined notion.
Moreover, for even higher moments, is there a general result of this kind?
I will assume that $\mathbb{E}X_i = 0$ and $\mathbb{E}X_i^2 = 1$. A finite third (or higher) moment implies a finite second moment, so $n^{-1/2}S_n$ converges in distribution to $N(0,1)$. This already pins down the exponent: $n^{-\alpha}S_n$ does not converge in distribution for any $\alpha < 1/2$, and converges in distribution to the degenerate random variable $0$ for any $\alpha > 1/2$.
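To spell out the last step (my wording of the standard argument): write $n^{-\alpha}S_n = n^{1/2-\alpha}\cdot n^{-1/2}S_n$. Since $n^{-1/2}S_n \overset{d}{\to} N(0,1)$ while $n^{1/2-\alpha} \to \infty$ when $\alpha < 1/2$, the sequence is not tight, so it cannot converge in distribution. For the other regime, a one-line variance computation under the normalization $\mathbb{E}X_i = 0$, $\mathbb{E}X_i^2 = 1$ gives
$$ \text{Var}\left(n^{-\alpha}S_n\right) = n^{-2\alpha}\sum_{i=1}^{n}\text{Var}(X_i) = n^{1-2\alpha} \longrightarrow \begin{cases} \infty, & \alpha < \tfrac{1}{2},\\ 1, & \alpha = \tfrac{1}{2},\\ 0, & \alpha > \tfrac{1}{2}, \end{cases} $$
so for $\alpha > 1/2$ Chebyshev's inequality yields $\mathbb{P}\left(|n^{-\alpha}S_n| > \varepsilon\right) \le n^{1-2\alpha}/\varepsilon^2 \to 0$ for every $\varepsilon > 0$, i.e. convergence in probability (hence in distribution) to $0$.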