Suppose $\{X_n,n\ge1\}$ are independent random variables with $EX_n=0$ and $DX_n=\sigma_n^2<\infty$. Set $S_n=\sum_{i=1}^nX_i$ and $s_n^2=\sum_{i=1}^n\sigma_i^2$, and assume $$\frac{S_n}{s_n}\overset{d}\to N(0,1)~~~ \text{and}~~~ \frac{\sigma_n}{s_n}\to \rho.$$ Show that $$\frac{X_n}{s_n}\overset{d}\to N(0,\rho^2).$$ (Hint: one can use the fact that if $X$ and $Y$ are independent and $X+Y$ is normally distributed, then each of $X$ and $Y$ is normally distributed.)
Since the hypothesis only gives asymptotic normality, how can we use the hint properly? Can we say that, since $(S_{n-1}+X_n)/s_n\overset{d}\to N(0,1)$ and $S_{n-1}$ is independent of $X_n$, $\frac{X_n}{s_n}$ converges in distribution to a normal law?
I have a proof, but it does not use your hint.
Write $$\frac{S_n}{s_n}=\frac{X_n}{s_n}+\frac{S_{n-1}}{s_n}.$$ We compute the limits in law of $S_n/s_n$ and $S_{n-1}/s_n$. By hypothesis, $$\frac{S_n}{s_n}\stackrel{L}{\longrightarrow} N(0,1),$$ and the same hypothesis applied along $n-1$ gives $\frac{S_{n-1}}{s_{n-1}}\stackrel{L}{\longrightarrow} N(0,1)$. Write $$\frac{S_{n-1}}{s_n}=\frac{S_{n-1}}{s_{n-1}}\cdot\frac{s_{n-1}}{s_n}.$$ Notice that $$\frac{s_n^2}{s_{n-1}^2}=\frac{s_{n-1}^2+\sigma_n^2}{s_{n-1}^2}=1+\frac{s_n^2}{s_{n-1}^2}\frac{\sigma_n^2}{s_n^2}\Rightarrow \frac{s_n^2}{s_{n-1}^2}=\frac{1}{1-\frac{\sigma_n^2}{s_n^2}}\longrightarrow \frac{1}{1-\rho^2},$$ assuming $\rho<1$ (if $\rho=1$, then $s_{n-1}/s_n\to 0$, the limit below degenerates to the point mass at $0$, and the rest of the argument goes through unchanged). Hence $$\frac{S_{n-1}}{s_n}\stackrel{L}{\longrightarrow} \sqrt{1-\rho^2}\;N(0,1)=N(0,1-\rho^2)$$ by Slutsky's Theorem.
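As a numerical sanity check on the variance algebra above, here is a small sketch. It assumes a hypothetical geometric variance profile $\sigma_i^2=c^i$ with $c=1/(1-\rho^2)$, which is one way to force $\sigma_n/s_n\to\rho$; it then verifies that $s_{n-1}/s_n\to\sqrt{1-\rho^2}$:

```python
import math

# Hypothetical variance sequence: sigma_i^2 = c**i with c = 1/(1 - rho^2).
# This choice makes sigma_n^2 / s_n^2 -> 1 - 1/c = rho^2.
rho = 0.6
c = 1.0 / (1.0 - rho**2)

s2 = 0.0          # running value of s_n^2
s2_prev = 0.0     # s_{n-1}^2
for n in range(1, 200):
    sigma2 = c**n
    s2_prev = s2
    s2 += sigma2

# The two limits used in the proof:
print(math.sqrt(sigma2 / s2))   # ~ rho          = 0.6
print(math.sqrt(s2_prev / s2))  # ~ sqrt(1-rho^2) = 0.8
```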
By independence, $$\varphi_{\frac{S_n}{s_n}}(t)=\varphi_{\frac{X_n}{s_n}}(t)\,\varphi_{\frac{S_{n-1}}{s_n}}(t),$$ where $\varphi$ denotes the characteristic function. By Lévy's continuity theorem, $\varphi_{\frac{S_{n-1}}{s_n}}(t)\to \exp(-(1-\rho^2)t^2/2)\neq 0$, so the quotient below is well defined for all large $n$ and $$\varphi_{\frac{X_n}{s_n}}(t)=\frac{\varphi_{\frac{S_n}{s_n}}(t)}{\varphi_{\frac{S_{n-1}}{s_n}}(t)}\stackrel{n\rightarrow\infty}{\longrightarrow}\frac{\exp(-t^2/2)}{\exp(-(1-\rho^2)t^2/2)}=e^{-\frac{\rho^2t^2}{2}}.$$ Thus, by the continuity theorem again, $$\frac{X_n}{s_n}\stackrel{L}{\longrightarrow}N(0,\rho^2).$$
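The conclusion can also be checked by simulation. The sketch below again assumes the hypothetical geometric variance profile $\sigma_i^2=c^i$ with $c=1/(1-\rho^2)$, and takes the $X_i$ to be Gaussian, so the hypothesis $S_n/s_n\sim N(0,1)$ holds exactly; the Monte Carlo sample variance of $X_n/s_n$ should then be close to $\rho^2$:

```python
import math
import random

random.seed(42)
rho = 0.6
c = 1.0 / (1.0 - rho**2)  # hypothetical choice forcing sigma_n/s_n -> rho
N = 40                    # a fixed large index n

s2_N = sum(c**i for i in range(1, N + 1))  # s_N^2
sigma_N = math.sqrt(c**N)
s_N = math.sqrt(s2_N)

# With Gaussian X_i, X_N/s_N is exactly N(0, sigma_N^2/s_N^2),
# whose variance should be near rho^2 = 0.36 for large N.
samples = [random.gauss(0.0, sigma_N) / s_N for _ in range(200_000)]
var = sum(x * x for x in samples) / len(samples)
print(var)  # ~ rho^2 = 0.36
```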