Let $\{X_n, n\geq1\}$ be a sequence of i.i.d. random variables, where $X_1$ follows a uniform distribution on $(-1, 1)$.
Let $$ Y_n = \frac{\sum_{i=1}^{n} X_i}{\sum_{i=1}^{n}X_i^2 + \sum_{i=1}^{n}X_i^3}.$$
Show that $\sqrt{n}Y_n$ converges in distribution to a random variable $Y$, and find the distribution of $Y$.
I think this problem has to do with the central limit theorem somehow, but I can't put my finger on it specifically. I tried to break $Y_n$ into density functions, but that didn't turn out to be useful. Any hint would be appreciated.
Yes, the CLT along with Slutsky's theorem. Write
$$\sqrt{n}Y_n=A_nB_n,\qquad A_n=\frac{n}{\sum_{i=1}^{n}X_i^2+\sum_{i=1}^{n}X_i^3},\qquad B_n=\frac{\sum_{i=1}^{n}X_i}{\sqrt{n}}.$$
Then
$$A_n\xrightarrow{\mathcal{P}}3$$
and, by the CLT (since $E(X_1)=0$ and $\operatorname{Var}(X_1)=E(X_1^2)=\frac{1}{3}$),
$$B_n\xrightarrow{\mathcal{L}}B\sim N\left(0;\frac{1}{3} \right)$$
thus, by Slutsky's theorem,
$$\sqrt{n}Y_n\xrightarrow{\mathcal{L}}3B\sim N\left(0;\,9\cdot\tfrac{1}{3}\right)=N(0;3).$$
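The limit $N(0;3)$ is easy to check numerically. Here is a small Monte Carlo sketch (not part of the original argument; the sample size `n` and number of replications `reps` are arbitrary choices):

```python
import numpy as np

# Monte Carlo check that sqrt(n) * Y_n is approximately N(0, 3) for large n.
rng = np.random.default_rng(0)
n, reps = 10_000, 5_000

X = rng.uniform(-1.0, 1.0, size=(reps, n))     # reps independent samples of size n
num = X.sum(axis=1)                            # sum_i X_i
den = (X**2).sum(axis=1) + (X**3).sum(axis=1)  # sum_i X_i^2 + sum_i X_i^3
Z = np.sqrt(n) * num / den                     # sqrt(n) * Y_n

print(Z.mean())  # should be close to 0
print(Z.var())   # should be close to 3
```

The sample mean and variance of `Z` should land near $0$ and $3$, matching the limiting $N(0;3)$ distribution.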
For the denominator, apply the SLLN:
$$\frac{1}{n}\sum_{i=1}^{n}X_i^2\xrightarrow{a.s.}E(X_1^2)=\int_{-1}^{1}\frac{x^2}{2}\,dx=\frac{1}{3},$$
and since it converges almost surely, it also converges in probability.
As far as the second term in the denominator is concerned, the same reasoning gives $\frac{1}{n}\sum_{i=1}^{n}X_i^3\xrightarrow{a.s.}E(X_1^3)=0$. Thus, by the continuous mapping theorem, $\frac{1}{n}\left(\sum_{i=1}^{n}X_i^2+\sum_{i=1}^{n}X_i^3\right)$ converges almost surely to $1/3$, which gives $A_n\xrightarrow{\mathcal{P}}3$.
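The SLLN step can also be verified by simulation. A minimal sketch (the sample size is an arbitrary choice):

```python
import numpy as np

# Sanity check of the SLLN step: (1/n) sum X_i^2 -> 1/3 and (1/n) sum X_i^3 -> 0,
# so the normalized denominator tends to 1/3 (and hence A_n -> 3).
rng = np.random.default_rng(1)
n = 1_000_000
X = rng.uniform(-1.0, 1.0, size=n)

m2 = (X**2).mean()  # near E(X^2) = 1/3
m3 = (X**3).mean()  # near E(X^3) = 0
A_n = n / ((X**2).sum() + (X**3).sum())

print(m2, m3, A_n)  # A_n should be near 3
```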