For $i=1,2,\dots$ let $X_i$ be a random variable, independent of all $X_j$ with $i\neq j$, taking the values $\pm 1$ with probability $\frac{1}{2i^2}$ each and the value $0$ with probability $\frac{i^2 -1}{i^2}$. Then the variance of $X_i$ is $\sigma_i^2 = \frac{1}{i^2}$. Let $s_n^2 = \sum_{i=1}^n \sigma_i^2$.
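To make the setup concrete, here is a small simulation checking that the empirical variance of $X_i$ matches $\sigma_i^2 = 1/i^2$ (a sketch; `sample_X` is my own helper name):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_X(i, size):
    # X_i = +1 or -1, each with probability 1/(2 i^2), and 0 otherwise
    u = rng.random(size)
    p = 1.0 / (2 * i**2)
    return np.where(u < p, 1.0, np.where(u < 2 * p, -1.0, 0.0))

x = sample_X(3, 1_000_000)
print(x.var())  # close to sigma_3^2 = 1/9
```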
I am trying to understand the convergence of the sequence $$ Y_n = \frac{1}{s_n} \sum_{i=1}^n X_i $$
Question: Does it converge in distribution and what is the limit?
The first thing that comes to mind is the central limit theorem. We can try the Lyapunov condition, which asks whether for some $\delta>0$ $$ \lim_{n\to\infty} \frac{1}{s_n^{2+\delta}} \sum_{i=1}^n \mathbb{E} (|X_i|^{2+\delta}) = 0. $$ Since $|X_i|$ takes only the values $0$ and $1$, we have $|X_i|^{2+\delta} = X_i^2$, so the sum equals $s_n^2$ and the whole expression reduces to $s_n^{-\delta}$. Since $s_n^2 = \sum_{i=1}^n \frac{1}{i^2}$ converges to $\pi^2/6$ as $n\to \infty$, the limit is $(\pi^2/6)^{-\delta/2} \neq 0$, so the Lyapunov condition fails for every $\delta$ and does not yield the central limit theorem.
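The collapse of the Lyapunov ratio to $s_n^{-\delta}$ is easy to check numerically (a sketch; since $\mathbb{E}|X_i|^{2+\delta} = 1/i^2$, the sum in the numerator is just $s_n^2$):

```python
import math

# E|X_i|^{2+delta} = 1/i^2, so the Lyapunov sum equals s_n^2
# and the whole ratio collapses to s_n^{-delta}
delta = 1.0
for n in (10, 100, 10_000):
    s2 = sum(1.0 / i**2 for i in range(1, n + 1))
    ratio = s2 / s2 ** (1 + delta / 2)  # = s_n^{-delta}
    print(n, ratio)

# the ratio approaches (pi^2/6)^(-delta/2), not 0
```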
Since $s_n^2$ converges to $\sum_{i=1}^{+\infty}i^{-2}=\pi^2/6$, it suffices to determine the limit of $S_n:=\sum_{i=1}^nX_i$ (then $Y_n=S_n/s_n$ converges to $\sqrt 6\,S/\pi$). Using pairwise independence and the fact that each $X_i$ is centered, we get that for all $m\gt n$, $$\mathbb E\left\lvert S_m-S_n\right\rvert^2=\mathbb E\left\lvert\sum_{i=n+1}^mX_i\right\rvert^2=\sum_{i=n+1}^m\mathbb E\left\lvert X_i\right\rvert^2=\sum_{i=n+1}^m\frac 1{i^2}\leqslant \frac 1n,$$ hence $\left(S_n\right)_{n\geqslant 1}$ is Cauchy in $\mathbb L^2$ and converges in $\mathbb L^2$ to some $S$.
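A Monte Carlo check of the identity $\mathbb E|S_m-S_n|^2=\sum_{i=n+1}^m i^{-2}\leqslant 1/n$ (a sketch; `sample_block` is my own helper and the simulation assumes full independence of the $X_i$):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_block(n, m, reps):
    # `reps` independent samples of S_m - S_n = sum_{i=n+1}^m X_i
    total = np.zeros(reps)
    for i in range(n + 1, m + 1):
        u = rng.random(reps)
        p = 1.0 / (2 * i**2)
        total += np.where(u < p, 1.0, np.where(u < 2 * p, -1.0, 0.0))
    return total

n, m, reps = 10, 200, 200_000
emp = float((sample_block(n, m, reps) ** 2).mean())
exact = sum(1.0 / i**2 for i in range(n + 1, m + 1))
print(emp, exact, 1.0 / n)  # empirical second moment ~ exact sum <= 1/n
```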
If we assume that $X_n$ is independent of $(X_1,\dots,X_{n-1})$ for all $n$ (full independence, not merely pairwise), then, since $\varphi_{X_j}(t)=1-\frac 1{j^2}(1-\cos t)=1-\frac 2{j^2}\sin^2\left(\frac t2\right)$, we can compute the characteristic function of $S_n$, hence that of $S$, namely, $$ \varphi_S(t)=\prod_{j=1}^{+\infty}\left(1-\frac 2{j^2}\sin^2\left(\frac t2\right)\right),\quad t\in\mathbb R. $$ I do not know whether it can be simplified.
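For what it's worth, Euler's sine product $\frac{\sin\pi z}{\pi z}=\prod_{j\geqslant 1}\left(1-\frac{z^2}{j^2}\right)$, applied with $z=\sqrt 2\sin(t/2)$, would give the candidate closed form $\varphi_S(t)=\frac{\sin\left(\pi\sqrt 2\sin(t/2)\right)}{\pi\sqrt 2\sin(t/2)}$. Here is a quick numerical comparison of that candidate against the truncated product (a sketch; function names are mine):

```python
import math

def phi_partial(t, N=200_000):
    # truncated product: prod_{j=1}^N (1 - 2 sin^2(t/2) / j^2)
    s2 = 2 * math.sin(t / 2) ** 2
    p = 1.0
    for j in range(1, N + 1):
        p *= 1.0 - s2 / j**2
    return p

def phi_closed(t):
    # candidate closed form via Euler's sine product, z = sqrt(2) sin(t/2)
    z = math.sqrt(2) * math.sin(t / 2)
    return 1.0 if z == 0 else math.sin(math.pi * z) / (math.pi * z)

for t in (0.3, 1.0, 2.5):
    print(t, phi_partial(t), phi_closed(t))
```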