I'm struggling a lot with the problem below:
Let's take a sequence of independent two-dimensional vectors of random variables $(A_n, B_n)_{n=1}^{\infty}$, where all vectors are uniformly distributed on the square $[-2,2] \times [-2,2]$. Let $V_n=(S_n, T_n) = (\sum_{i=1}^n A_i, \sum_{i=1}^n B_i)$ and $|V_n| = \sqrt{(S_n)^2+(T_n)^2}$. Determine the constant $c$ so that $\lim_{n \to \infty} P(|V_n|<c\sqrt{n})=0.95$.
Any help is much appreciated.
Using the fact that $S_n$ and $T_n$ are independent and approximately normal, you can write their joint density approximately as a product of two normal densities. Here $\mathrm{Var}(A_1)=\frac{(2-(-2))^2}{12}=\frac{4}{3}$, so by the CLT each of $S_n/\sqrt{n}$ and $T_n/\sqrt{n}$ is approximately $N(0, \frac{4}{3})$. Convert to polar coordinates; after integrating out the angle, the resulting function of $r$ is the approximate density of $|V_n|/\sqrt{n}$, which is Rayleigh with $\sigma^2=\frac{4}{3}$. Hence $\lim_{n\to\infty} P(|V_n|<c\sqrt{n}) = 1 - e^{-3c^2/8}$, and setting this equal to $0.95$ gives $c = \sqrt{\frac{8}{3}\ln 20} \approx 2.826$.
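As a sanity check, here is a quick Monte Carlo sketch (assuming NumPy; the sample size and trial count are arbitrary choices) that estimates $P(|V_n| < c\sqrt{n})$ for $c = \sqrt{\tfrac{8}{3}\ln 20}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 500, 20_000  # illustrative sizes, not part of the problem

# Each trial: sum n iid Uniform(-2, 2) steps in each coordinate.
A = rng.uniform(-2, 2, size=(trials, n))
B = rng.uniform(-2, 2, size=(trials, n))
S, T = A.sum(axis=1), B.sum(axis=1)

# Candidate constant from solving 1 - exp(-3c^2/8) = 0.95.
c = np.sqrt(8 * np.log(20) / 3)

# Fraction of trials with |V_n| < c * sqrt(n).
p = np.mean(np.sqrt(S**2 + T**2) < c * np.sqrt(n))
print(f"c = {c:.4f}, empirical probability = {p:.4f}")
```

With these settings the empirical probability comes out very close to 0.95, consistent with the Rayleigh limit.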