Converse of a version of the central limit theorem


Let $(X_n)_n$ be a sequence of independent and identically distributed random variables. Let $\gamma>0,$ $$ Y_n=\frac{1}{n^{\gamma+\frac{1}{2}}}\sum_{k=1}^nk^{\gamma}X_k.$$

  1. Suppose that there exists a sequence $(x_n)_n$ of real numbers such that $(Y_n-x_n)_n$ converges in distribution.

    Prove that $X_1 \in L^2.$ In this case, find the limit distribution, and show that $$\limsup_n(Y_n-x_n)=- \liminf_n(Y_n-x_n)=+\infty \quad \text{a.s.}$$

  2. Prove that the following statements are equivalent:

    a. $X_1$ is constant a.s.

    b. there exists a sequence $(x_n)_n$ such that $(Y_n-x_n)_n$ converges in probability.

    c. there exists a sequence $(x_n)_n$ such that $(Y_n-x_n)_n$ converges a.s.

For 1), I will deal with the first part (the fact that $X_1 \in L^2$) later. Concerning the limit distribution: since $X_1 \in L^2,$ a computation with characteristic functions shows that the limit is $N\left(0,\frac{\sigma^2}{2\gamma+1}\right),$ where $\sigma^2=\operatorname{Var}(X_1).$ Moreover, $\{\limsup_n(Y_n-x_n)=+\infty\}$ is a tail event (changing finitely many of the $X_k$ perturbs $Y_n$ by a term that tends to $0$), so by Kolmogorov's 0-1 law it has probability $0$ or $1$; since the limit distribution is a nondegenerate normal, we conclude that $\limsup_n(Y_n-x_n)=+\infty$ a.s.
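For reference, here is a sketch of the variance computation behind the claimed limit, writing $\sigma^2=\operatorname{Var}(X_1)$ and recognizing a Riemann sum:

```latex
\operatorname{Var}(Y_n)
  = \frac{1}{n^{2\gamma+1}} \sum_{k=1}^{n} k^{2\gamma}\,\sigma^2
  = \sigma^2 \cdot \frac{1}{n}\sum_{k=1}^{n}\left(\frac{k}{n}\right)^{2\gamma}
  \xrightarrow[n\to\infty]{} \sigma^2 \int_0^1 t^{2\gamma}\,dt
  = \frac{\sigma^2}{2\gamma+1}.
```

Together with a Lindeberg-type condition for the triangular array $\left(n^{-\gamma-\frac12}k^{\gamma}X_k\right)_{k\le n}$, this should give the normal limit.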

For 2), the implications $a. \implies c. \implies b.$ are straightforward; it remains to prove $b. \implies a.,$ which can be deduced from 1).

For the first part of 1), the case $\gamma=0$ can be handled as in this question: Central limit theorem and integrability.

Do you have suggestions for the case $\gamma>0?$
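(Not an answer to the $\gamma>0$ case, but as a numerical sanity check of the claimed limit: the short Monte Carlo below, assuming for illustration $X_k \sim N(0,1)$, so $\sigma^2=1$ and one can take $x_n=0$, reproduces the variance $\sigma^2/(2\gamma+1)$.)

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 1.0   # illustrative choice of gamma > 0
n = 2000      # index at which Y_n is sampled
trials = 5000 # number of independent realisations of Y_n

# Weights of Y_n = n^{-(gamma + 1/2)} * sum_{k=1}^n k^gamma X_k.
k = np.arange(1, n + 1)
weights = k**gamma / n**(gamma + 0.5)

# Each row of X is one realisation of (X_1, ..., X_n) with X_k ~ N(0, 1).
X = rng.standard_normal((trials, n))
Y = X @ weights

# Empirical variance of Y_n, to compare with sigma^2/(2*gamma+1) = 1/3.
print(Y.var())
```

With `gamma = 1.0` the empirical variance comes out close to $1/3$, matching $\frac{\sigma^2}{2\gamma+1}$.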