Let $(a_i)_{i\ge1}$ be a bounded positive sequence and $X_i$ be iid random variables with mean $0$ and finite variance. Let $s_n=\frac{\sum_{i=1}^n a_i X_i}{\sqrt{\sum_{i=1}^n a_i}}$.
If $a_i=1$ for all $i$, the classical central limit theorem gives the limit. Is anything known for the general case? Do we need to restrict the $a_i$'s further to obtain a similar limit?
PS. I don't use the word converge because I don't want to guess what kind of convergence this may be.
I think the convergence depends on how we choose $a_i$.
Say $a_i=A_i$ is a sequence of independent, identically distributed (i.i.d.) random variables which are also independent of the $X_i$. Then the random variables $Y_i=A_i X_i$ are also an i.i.d. sequence (not independent of $A_i$ or $X_i$, but independent among themselves). Because of the independence between $A_i$ and $X_i$ we can factor the moments: $$ E(Y^k)=E(A^k)E(X^k), $$ so we compute $$ E(Y)=E(A)E(X)=0, $$ as long as the mean of $A$ is finite, and $$ V(Y)=E(Y^2)-E(Y)^2=E(A^2)E(X^2) =(V(A)+E(A)^2)\,V(X), $$ using $E(X^2)=V(X)$ since $E(X)=0$; this is finite if we assume $V(A)$ is finite as well. Now the CLT applies: $$ \sqrt{n}\,S_Y \xrightarrow[]{d} N(0,V(Y)), $$ since $E(Y)=0$, where $S_Y$ is the sample mean of the $Y_i$: $$ S_Y=\frac{1}{n}\sum_{i=1}^n Y_i =\frac{1}{n}\sum_{i=1}^n A_iX_i. $$
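A quick simulation illustrates this case. Here I pick, purely as an illustrative assumption, $A_i\sim\text{Uniform}(0.5,1.5)$ (bounded and positive) and $X_i\sim N(0,1)$, so the limit variance should be $E(A^2)E(X^2)=(1/12+1)\cdot 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 2000, 5000

# Illustrative assumptions: A_i ~ Uniform(0.5, 1.5), X_i ~ N(0, 1),
# drawn independently of each other.
A = rng.uniform(0.5, 1.5, size=(trials, n))
X = rng.standard_normal(size=(trials, n))

# One realization of sqrt(n) * S_Y per trial, S_Y the sample mean of Y_i = A_i X_i.
S = np.sqrt(n) * (A * X).mean(axis=1)

# Theoretical limit variance: V(Y) = E(A^2) E(X^2) = (1/12 + 1^2) * 1
v_theory = 1.0 / 12.0 + 1.0
print(S.mean(), S.var(), v_theory)
```

The empirical variance of the scaled sample means should be close to $V(Y)\approx 1.083$, and their mean close to $0$.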
Say now the $a_i$ are instead given by a measurable function of $X$, $a_i=a(X_i)$. Define $Z_i=a(X_i)X_i$, which are also i.i.d., and repeat the analysis. We can no longer factor the moments, because $a(X_i)$ and $X_i$ are not independent, but if $E(Z)$ and $V(Z)$ are finite then we get the CLT limit again, this time $$ \sqrt{n}\,(S_Z-E(Z)) \xrightarrow[]{d} N(0,V(Z)). $$
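The same check works here. As a hypothetical choice of measurable weight, take $a(x)=|x|$ with $X_i\sim N(0,1)$: then $Z_i=|X_i|X_i$ has $E(Z)=0$ by symmetry and $V(Z)=E(X^4)=3$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 2000, 5000

X = rng.standard_normal(size=(trials, n))
Z = np.abs(X) * X  # a(x) = |x|, a hypothetical measurable (and positive) weight

# E(Z) = 0 by symmetry; V(Z) = E(X^4) = 3 for standard normal X
S = np.sqrt(n) * Z.mean(axis=1)
print(S.mean(), S.var())
```

The empirical variance of $\sqrt{n}\,S_Z$ should be close to $3$, matching $V(Z)$ even though the moments no longer factor.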
What about the case of $a_i$ given by a known deterministic sequence? Then $W_i=a_iX_i$ are a sequence of independent random variables, but their distributions are no longer identical. The Lyapunov CLT, a version for independent but not identically distributed variables, gives $$ \frac{1}{\gamma_n} \sum_{i=1}^n (W_i-E(W_i)) \xrightarrow[]{d} N(0,1), $$ where $$ \gamma_n^2=\sum_{i=1}^n V(W_i), $$ provided the Lyapunov condition holds: $$ \frac{1}{\gamma_n^{2+\delta}}\sum_{i=1}^n E\big[|W_i-E(W_i)|^{2+\delta}\big] $$ goes to zero as $n\to\infty$ for some positive $\delta$.
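A minimal sketch of this last case, under the illustrative assumptions $a_i=2+\sin(i)$ (a bounded positive deterministic sequence) and $X_i\sim N(0,1)$, so that $\gamma_n^2=\sum_i a_i^2$ and the normalized sum should look standard normal:

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 2000, 5000

i = np.arange(1, n + 1)
a = 2.0 + np.sin(i)  # hypothetical bounded positive deterministic weights
X = rng.standard_normal(size=(trials, n))

# gamma_n^2 = sum_i V(W_i) = sum_i a_i^2 V(X), with V(X) = 1 here
gamma = np.sqrt(np.sum(a**2))

# Normalized sum; E(W_i) = a_i E(X_i) = 0, so no centering is needed
T = (a * X).sum(axis=1) / gamma
print(T.mean(), T.var())
```

The empirical mean and variance of $T$ should be close to $0$ and $1$, consistent with the $N(0,1)$ limit.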
I suspect the last case is the one you had in mind; I hope this helps.