Let $X_1,X_2,...$ be iid random variables with $E(X_1)=0$ and $E(X_1^2)=1$. I'm looking at the sequence $Y_n=n^{1-\alpha}\overline{X}_n$. I proved that for $\alpha=1/2$, $Y_n\overset{d}{\longrightarrow}N(0,1)$, and for $\alpha>1/2$, $Y_n\overset{p}{\longrightarrow}0$.
But what happens for $\alpha<1/2$? My guess is that $Y_n\longrightarrow\infty$, but I don't know in which sense the divergence holds.
Using the Central Limit Theorem and Slutsky's theorem I could maybe prove it in the sense of convergence in distribution, but I don't know what convergence in distribution to infinity means, or whether Slutsky's theorem can be applied with an infinite limit.
Write $S_n=X_1+\cdots+X_n$, so that $Y_n=S_n/n^\alpha$, and let $N$ denote a standard normal random variable. Fix a positive number $x$ and $\delta\gt 0$. For all $n$ such that $n^{\alpha-1/2}x\leqslant \delta$, the following inequality holds: $$ \Pr\left(\frac{1}{n^\alpha}S_n\gt x \right)\geqslant \Pr\left(\frac{1}{n^{1/2}}S_n\gt \delta \right) $$ hence, doing the same reasoning with $X_i$ replaced by $-X_i$, adding the two bounds, and taking $\liminf_{n\to +\infty}$ (by the central limit theorem), we get $$ \liminf_{n\to +\infty}\Pr\left(\frac{1}{n^\alpha}\left|S_n\right|\gt x \right)\geqslant \Pr\left(N\gt \delta \right)+\Pr\left(N\lt -\delta \right) $$ and since $\delta$ is arbitrary, we get that for all $x$, $$ \lim_{n\to +\infty}\Pr\left(\frac{1}{n^\alpha}\left|S_n\right|\gt x \right)=1. $$ This proves that $n^\alpha /S_n$ goes to zero in distribution, hence in probability.
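The divergence in probability can also be checked numerically. Here is a quick Monte Carlo sketch (not a proof), with the illustrative choices of standard normal $X_i$, $\alpha=0.25$ and $x=1$; any $X_i$ with mean $0$ and variance $1$ would do. The estimated probabilities should climb toward $1$ as $n$ grows.

```python
import numpy as np

# Monte Carlo estimate of P(|S_n| / n^alpha > x) for several n.
# Illustrative assumptions: standard normal X_i, alpha = 0.25, x = 1.0.
rng = np.random.default_rng(0)
alpha, x, reps = 0.25, 1.0, 1000

def prob_exceeds(n):
    """Fraction of `reps` simulated samples with |S_n| / n^alpha > x."""
    s = rng.normal(size=(reps, n)).sum(axis=1)  # reps independent copies of S_n
    return float(np.mean(np.abs(s) / n**alpha > x))

results = {n: prob_exceeds(n) for n in (10, 100, 1000, 10000)}
for n, p in results.items():
    print(f"n = {n:>5}:  estimated P(|S_n|/n^alpha > 1) = {p:.3f}")
```

For standard normal $X_i$ one can even compute the limit of each estimate exactly, since $S_n/\sqrt n\sim N(0,1)$: the probability equals $\Pr(|N|\gt n^{\alpha-1/2}x)$, which tends to $1$.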
Actually, using the law of the iterated logarithm, we get that almost surely $\limsup_{n\to +\infty}\frac{1}{n^\alpha}S_n=+\infty$ and $\liminf_{n\to +\infty}\frac{1}{n^\alpha}S_n=-\infty$.
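To spell out that last step: the law of the iterated logarithm gives $\limsup_{n\to+\infty}S_n/\sqrt{2n\log\log n}=1$ almost surely (and $-1$ for the $\liminf$), and one can factor $$ \frac{S_n}{n^\alpha}=\frac{S_n}{\sqrt{2n\log\log n}}\cdot\frac{\sqrt{2n\log\log n}}{n^\alpha},\qquad \frac{\sqrt{2n\log\log n}}{n^\alpha}=n^{1/2-\alpha}\sqrt{2\log\log n}\xrightarrow[n\to+\infty]{}+\infty $$ for $\alpha<1/2$. Along a subsequence where the first factor exceeds $1/2$, the product tends to $+\infty$, which gives the $\limsup$ claim; the symmetric argument gives the $\liminf$.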