Checking convergence in distribution


Random variables $X_1, X_2, \ldots$ are independent and for every $n$ the random variable $X_n$ is uniformly distributed on the interval $[-\sqrt{n}, \sqrt{n}]$. Denote $\sigma_n^2 = \sum_{k=1}^{n} \textbf{Var}\, X_k$. Identify the weak limit of the sequence $$ Y_n = \frac{X_1 + X_2 + \ldots + X_n}{\sigma_n}. $$ My attempt was to use Lindeberg's central limit theorem. Clearly $\textbf{Var}\, Y_n \equiv 1$ and $\mathbb{E}\, Y_n \equiv 0$, so we would expect $Y_n \rightarrow \mathcal{N}(0, 1)$ (the standard normal distribution), but I have trouble verifying the Lindeberg condition, which takes a rather ugly form here.
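A quick Monte Carlo sketch of the claim (the function name and sample sizes are mine, chosen for illustration): sampling $Y_n$ for a moderate $n$ should give empirical mean near $0$ and variance near $1$, consistent with a standard normal limit.

```python
import math
import random

def simulate_Y(n, trials=20000, seed=0):
    """Sample Y_n = (X_1 + ... + X_n) / sigma_n, where the X_k are
    independent and X_k ~ Uniform[-sqrt(k), sqrt(k)].
    Var X_k = (2*sqrt(k))**2 / 12 = k/3, so sigma_n^2 = n(n+1)/6."""
    rng = random.Random(seed)
    sigma_n = math.sqrt(n * (n + 1) / 6)
    return [
        sum(rng.uniform(-math.sqrt(k), math.sqrt(k)) for k in range(1, n + 1))
        / sigma_n
        for _ in range(trials)
    ]

ys = simulate_Y(50)
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
print(mean, var)  # should be close to 0 and 1
```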


Best answer:

Notice that $X_k$ has the same distribution as $\sqrt{k}\,U$, where $U$ is uniform on $[-1,1]$, so $\mathbb E[X_k^2]=k\,\mathbb E[U^2]=k/3$. Therefore $\sigma_n^2=\sum_{k=1}^n k/3=\frac{n(n+1)}{6}$, which is of order $Cn^2$ with $C=1/6$.
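The closed form for $\sigma_n^2$ can be checked exactly with rationals (a sketch; the helper name is mine):

```python
from fractions import Fraction

def sigma_sq(n):
    # sigma_n^2 = sum_{k=1}^n Var X_k, with Var X_k = (2*sqrt(k))**2 / 12 = k/3
    return sum(Fraction(k, 3) for k in range(1, n + 1))

# Exact agreement with the closed form n(n+1)/6:
for n in (1, 10, 100):
    assert sigma_sq(n) == Fraction(n * (n + 1), 6)
print(sigma_sq(100))  # 5050/3
```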

In order to check Lindeberg's condition, we have to handle (up to constants, since $\sigma_n$ is of order $n$) quantities like $$x_{n,k}:=\mathbb E\left[kU^2\,\chi\{|U|>\varepsilon nk^{-1/2}\}\right]$$ for $1\leqslant k\leqslant n$. Since $|U|\leqslant 1$ almost surely, if $\varepsilon nk^{-1/2}\gt 1$ then $x_{n,k}=0$. But for $k\leqslant n$ we have $\varepsilon nk^{-1/2}\geqslant\varepsilon\sqrt{n}$, so as soon as $n>\varepsilon^{-2}$ every term vanishes. Hence the Lindeberg sum is eventually zero for each fixed $\varepsilon>0$, the condition holds, and $Y_n\to\mathcal N(0,1)$ weakly.
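The vanishing of every Lindeberg term once $n>\varepsilon^{-2}$ can be sanity-checked directly (a sketch; the helper name is mine):

```python
import math

def x_nk_is_zero(n, k, eps):
    # x_{n,k} = E[k U^2 1{|U| > eps*n/sqrt(k)}] vanishes when the
    # threshold eps*n/sqrt(k) exceeds 1, because |U| <= 1 a.s.
    return eps * n / math.sqrt(k) > 1

eps = 0.1
n = math.ceil(eps ** -2) + 1  # any n > 1/eps^2
# Every term k = 1, ..., n vanishes, so the Lindeberg sum is zero:
assert all(x_nk_is_zero(n, k, eps) for k in range(1, n + 1))
```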