How can I prove these two questions without using the following theorem?


Question 1: Let $X_1, X_2, \cdots$ be independent random variables such that $$P(X_n=-n^{\theta})=P(X_n=n^{\theta})=\frac{1}{2}.$$ If $\theta > -\frac{1}{2}$, prove that the Lyapunov condition holds and the sequence satisfies the central limit theorem.

Question 2: Let $X_1, X_2, \cdots$ be independent random variables such that $$P(X_n=-n^{\theta})=P(X_n=n^{\theta})=\frac{1}{6n^{2(\theta -1)}}\quad \text{and} \quad P(X_n=0)=1-\frac{1}{3n^{2(\theta -1)}}.$$ If $1< \theta < \frac{3}{2}$, prove that the Lindeberg condition holds and the sequence satisfies the central limit theorem.

THEOREM: Let $\lambda >0$. Then $$\frac{1}{n^{\lambda +1}}\displaystyle\sum_{k=1}^{n} k^{\lambda}\underset{n\to +\infty}{\longrightarrow} \frac{1}{\lambda+1},$$ so that $\displaystyle\sum_{k=1}^{n} k^{\lambda}$ is of order $\mathcal{O}(n^{\lambda + 1})$.
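The limit in the theorem is easy to check numerically; a minimal sketch in Python (the exponent values and the cutoff $n$ are arbitrary choices):

```python
# Check that (1/n^(lam+1)) * sum_{k=1}^n k^lam -> 1/(lam+1) as n grows.
def scaled_power_sum(lam, n):
    return sum(k**lam for k in range(1, n + 1)) / n ** (lam + 1)

for lam in [0.5, 1.0, 2.0]:
    approx = scaled_power_sum(lam, 100_000)
    exact = 1 / (lam + 1)
    print(f"lambda={lam}: {approx:.6f} vs {exact:.6f}")  # should be close
```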

Solution of Question 1: $EX_n=0$ for all $n\in \mathbb{N}$, $Var(X_n)=EX_n^2=n^{2\theta}$, and $\displaystyle\sum_{k=1}^{n} Var(X_k) = \displaystyle\sum_{k=1}^{n}k^{2\theta}$ is (by the theorem above) $\mathcal{O}(n^{2\theta +1})$. Hence $s_n=\left(\sum_{k=1}^{n}Var(X_k)\right)^{\frac{1}{2}}$ is (again by the theorem above) $\mathcal{O}\left(n^{(2\theta+1)/2}\right)$. We verify the Lyapunov condition: for a suitable $\delta >0$, \begin{align*} \lim_{n\to +\infty} \frac{1}{s_n^{2+\delta}} \displaystyle\sum_{k=1}^{n} E|X_k|^{2+\delta} &=\lim_{n\to +\infty} \frac{1}{s_n^{2+\delta}}\displaystyle\sum_{k=1}^{n} k^{(2+\delta)\theta}\\ &=\lim_{n\to +\infty} \frac{\mathcal{O}\left(n^{(2+\delta)\theta+1}\right)}{\mathcal{O}\left(n^{(2+\delta)(2\theta +1)/2}\right)}\\ &=\lim_{n\to +\infty} \frac{\mathcal{O}(n)}{\mathcal{O}(n^{(2+\delta)/2})}\\ &=0. \end{align*} Here the theorem applies to $\sum_{k=1}^{n} k^{(2+\delta)\theta}$ provided $(2+\delta)\theta > -1$, i.e. $\theta > -\frac{1}{2+\delta}$; since $\theta > -\frac{1}{2}$, such a $\delta>0$ exists. Thus the Lyapunov condition is satisfied for every $\theta > -\frac{1}{2}$.

Therefore, $$\frac{\displaystyle\sum_{k=1}^{n} X_k - E\sum_{k=1}^{n} X_k}{\sqrt{\displaystyle\sum_{k=1}^{n}Var(X_k)}}\overset{D}{\longrightarrow} \mathcal{N}(0,1),$$ i.e., the normalized sum converges in distribution to the standard normal distribution $\mathcal{N}(0,1)$.
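This conclusion can also be checked empirically; a minimal Monte Carlo sketch (the values $\theta=0.5$, $n=200$, and the number of replications are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.5, 200, 10_000

# X_k = ±k^theta with probability 1/2 each; s_n^2 = sum_{k<=n} k^(2*theta).
scales = np.arange(1, n + 1) ** theta
signs = rng.choice([-1.0, 1.0], size=(reps, n))
s_n = np.sqrt(np.sum(scales**2))
z = (signs * scales).sum(axis=1) / s_n  # normalized sums

# If the CLT holds, z should look approximately N(0, 1).
print(f"mean ~ {z.mean():.3f}, std ~ {z.std():.3f}")
print(f"P(|Z| < 1.96) ~ {(np.abs(z) < 1.96).mean():.3f}")  # should be near 0.95
```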

REMARK: Notice that Question 1 is already answered above; however, I'm trying to prove it again without using the theorem above. Can you help me with this?


BEST ANSWER

This answer is just for Question 1. Lyapunov's condition is that $$\lim_{n\rightarrow\infty}\frac{\sum_{i=1}^n\mathbb E\left(|X_i-\mu_i|^{2+\delta}\right)}{\left(\sum_{i=1}^n\sigma_i^2\right)^{(2+\delta)/2}}=0$$

for some $\delta>0$. For this problem we have

$$\begin{split}\frac{\sum_{i=1}^n\mathbb E\left(|X_i-\mu_i|^{2+\delta}\right)}{\left(\sum_{i=1}^n\sigma_i^2\right)^{(2+\delta)/2}}&= \frac{\sum_{i=1}^ni^{\theta(2+\delta)}}{\left(\sum_{i=1}^n i^{2\theta}\right)^{(2+\delta)/2}}\\ &=\frac{n^{\theta(2+\delta)+1}\left[\sum_{i=1}^n\left(\frac i n\right)^{\theta(2+\delta)}\frac 1n\right]}{n^{(2\theta+1)(2+\delta)/2}\left[\sum_{i=1}^n\left(\frac i n\right)^{2\theta}\frac 1 n\right]^{(2+\delta)/2}}\\ &=\frac{\left[\sum_{i=1}^n\left(\frac i n\right)^{\theta(2+\delta)}\frac 1n\right]}{\left[\sum_{i=1}^n\left(\frac i n\right)^{2\theta}\frac 1 n\right]^{(2+\delta)/2}}\cdot\frac 1 {n^{(2+\delta)/2-1}}\end{split}$$

The bracketed Riemann sums converge to $\int_0^1x^{\theta(2+\delta)}dx=\frac{1}{\theta(2+\delta)+1}$ and $\int_0^1x^{2\theta}dx=\frac{1}{2\theta+1}$ respectively, which are finite provided $\theta>-\frac{1}{2+\delta}$. Since $\delta>0$, the remaining factor $\frac1 {n^{(2+\delta)/2-1}}$ converges to $0$, so the whole expression is $0$ in the limit. Since $\delta>0$ can be chosen arbitrarily small, the Lyapunov condition for the CLT holds whenever $\theta>-\frac 12$.
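The decay of the Lyapunov ratio can also be observed numerically; a minimal sketch (the values $\delta=1$ and $\theta=-0.25$ are arbitrary choices satisfying $\theta>-\frac{1}{2+\delta}$):

```python
# Lyapunov ratio for X_k = ±k^theta:
#   sum_{k<=n} E|X_k|^(2+delta) / (sum_{k<=n} Var X_k)^((2+delta)/2)
def lyapunov_ratio(theta, delta, n):
    num = sum(k ** (theta * (2 + delta)) for k in range(1, n + 1))
    den = sum(k ** (2 * theta) for k in range(1, n + 1)) ** ((2 + delta) / 2)
    return num / den

ratios = [lyapunov_ratio(-0.25, 1.0, n) for n in (10, 100, 1000, 10_000)]
print(ratios)  # should decrease toward 0
```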


This is for Question 2 (only a proof sketch).

Lindeberg's condition is that $$\lim_{n\rightarrow\infty}\frac{\sum_{k=1}^n\mathbb E\left[(X_k-\mu_k)^2\mathbb 1_{\{|X_k-\mu_k|>\epsilon s_n\}}\right]}{s_n^2}=0$$

for all $\epsilon>0$, where $s_n^2=\sum_{k=1}^n \sigma_k^2$. For this problem, $s_n^2=\sum_{k=1}^n\frac{k^2}{3}$ and looking at the indicator part we have

$$\begin{split}|X_k-\mu_k|&>\epsilon s_n\\ |X_k|&>\epsilon\sqrt{\sum_{j=1}^n\frac{j^2}3}\end{split}$$

$|X_k|$ can take on the value $0$, in which case the indicator is $0$, or the value $k^{\theta}$, which is largest when $k=n$. In that worst case the indicator can only be nonzero if

$$\begin{split}\frac{n^{\theta}}{\sqrt{\sum_{j=1}^n\frac{j^2}3}}&>\epsilon\\ \frac{\sqrt 3n^\theta}{\sqrt{\sum_{j=1}^n\left(\frac j n\right)^2\frac 1 n}\cdot n \cdot \sqrt n}&>\epsilon\\ \frac{\sqrt 3 n^{\theta-\frac 32}}{\left(\sum_{j=1}^n\left(\frac j n\right)^2\frac 1 n\right)^{\frac 1 2}}&>\epsilon\end{split}$$

In the limit the denominator becomes

$$\sum_{j=1}^n\left(\frac j n\right)^2\frac 1 n\longrightarrow \int_0^1 x^2 dx=\frac 1 3$$

while the numerator $\sqrt 3\, n^{\theta-\frac 32}$ tends to $0$ if $\theta<\frac 32$. Hence for any fixed $\epsilon>0$ the inequality fails for all large $n$, so every indicator is eventually $0$ and the expectation in the numerator of Lindeberg's condition vanishes. Therefore the expression as a whole is $0$ in the limit and Lindeberg's condition is satisfied. (Note: $\theta>1$ ensures the probabilities in the problem statement lie in $[0,1]$; for $\theta<1$ the value $\frac{1}{3n^{2(\theta -1)}}$ eventually exceeds $1$.)
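The key step above, that $\max_{k\le n} k^\theta / s_n = n^\theta/s_n \to 0$ when $\theta < \frac32$, can be checked numerically; a minimal sketch (the value $\theta=1.2$ is an arbitrary choice in $(1, \frac32)$):

```python
import math

# For Question 2, Var(X_k) = k^2/3, so s_n = sqrt(sum_{k<=n} k^2 / 3).
# The Lindeberg indicator can only fire if some k^theta > eps * s_n,
# and the largest possible value is n^theta, so we track n^theta / s_n.
def max_over_sn(theta, n):
    s_n = math.sqrt(sum(k**2 for k in range(1, n + 1)) / 3)
    return n**theta / s_n

vals = [max_over_sn(1.2, n) for n in (10, 100, 1000, 10_000)]
print(vals)  # should decrease toward 0, like 3 * n^(theta - 3/2)
```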