Lindeberg condition implies L1 convergence of cond. variances


The following is taken from Dvoretzky (1972), "Asymptotic Normality for Sums of Dependent Random Variables", Equation 4.6.

$$\{X_{n,k}\}_{n=0,1,\dots;\,k=0,1,\dots,k_n}$$ is a (triangular) array of random variables, adapted to $\sigma$-fields $\mathcal{F}_{n,k}$, with

$$E[X_{n,k}|\mathcal{F}_{n,k-1}]=0,$$ $$\sum_{k=1}^{k_n}E[X_{n,k}^2|\mathcal{F}_{n,k-1}]=1,$$ $$\lim_n\sum_{k=1}^{k_n}E[X_{n,k}^2\,1\{|X_{n,k}|>\epsilon\}]=0 \quad \forall \epsilon>0,$$ where the last condition (the Lindeberg condition) implies, via Chebyshev's inequality applied to each term, that $\lim_n\sum_{k=1}^{k_n}P\{|X_{n,k}|>\epsilon\}=0$ for all $\epsilon>0$.
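As a quick numerical sanity check of these conditions (my own toy example, not from the paper): take the independent array $X_{n,k}=\xi_k/\sqrt{n}$, $k=1,\dots,n$, with $\xi_k$ i.i.d. standard normal. Then the conditional mean is $0$, the conditional variances sum to $1$, and the Lindeberg sum equals $E[\xi^2 1\{|\xi|>\epsilon\sqrt{n}\}]\to 0$:

```python
import numpy as np

# Toy example (assumption, not from the paper): X_{n,k} = xi_k / sqrt(n),
# k = 1..n, with xi_k i.i.d. N(0,1). Monte Carlo estimates of
#   sum_k E[X_{n,k}^2]                      (should be ~1) and
#   sum_k E[X_{n,k}^2 1{|X_{n,k}| > eps}]   (Lindeberg sum, should -> 0).
rng = np.random.default_rng(0)
eps = 0.3
sums, var_sums = [], []
for n in [10, 100, 1000]:
    xi = rng.standard_normal((20_000, n))     # Monte Carlo rows
    X = xi / np.sqrt(n)
    var_sums.append((X**2).mean(axis=0).sum())
    sums.append((X**2 * (np.abs(X) > eps)).mean(axis=0).sum())

print(var_sums)  # each close to 1
print(sums)      # decreasing toward 0
```

The decay is gradual here because the truncation threshold $\epsilon\sqrt{n}$ only slowly outruns the Gaussian tail.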

My goal is to show that then $$E[\max_{1\leq k\leq k_n}E[X_{n,k}^2|\mathcal{F}_{n,k-1}]] \rightarrow0 \textrm{ as } n\rightarrow \infty,$$ which is equivalent to showing that $$\sum_{k=1}^{k_n} P(E[X_{n,k}^2|\mathcal{F}_{n,k-1}]> \epsilon) \rightarrow 0$$ (right?). The unconditional Lindeberg condition doesn't imply this (or does it?).

Let me briefly state my first attempt and where my problem is. I bounded $$E[\max_{1\leq k\leq k_n}E[X_{n,k}^2|\mathcal{F}_{n,k-1}]] \\ \leq \epsilon + \sum_{k=1}^{k_n} E\big[E[X_{n,k}^2|\mathcal{F}_{n,k-1}]\,1\{|X_{n,k}|>\epsilon\}\big].$$

My problem here is: how can I use the Lindeberg condition, since the indicator set is not measurable w.r.t. $\mathcal{F}_{n,k-1}$? I tried playing around with the tower property a bit but can't make it work. Maybe I need to use some lemma? Another way to show what I'm aiming for would be to prove convergence in probability together with uniform integrability of $\max_{1\leq k\leq k_n}E[X_{n,k}^2|\mathcal{F}_{n,k-1}]$, $n \in \mathbb{N}$, but that would be a lot harder (if it is even implied), and I want to be sure I'm not missing something. I'm just confused because it sounds so trivial in the paper... Thanks in advance!


Best Answer

I do not see how the original idea can lead to the answer, but a small modification helps. Write $$ \mathbb E\left[X_{n,k}^2\mid \mathcal F_{n,k-1}\right] =\mathbb E\left[X_{n,k}^2\mathbf 1\left\{ \left\lvert X_{n,k}\right\rvert\leqslant \varepsilon\right\} \mid \mathcal F_{n,k-1}\right]+\mathbb E\left[X_{n,k}^2\mathbf 1\left\{ \left\lvert X_{n,k}\right\rvert\gt \varepsilon\right\} \mid \mathcal F_{n,k-1}\right]. $$

The first term on the right hand side is smaller than $\varepsilon^2$, hence $$ \mathbb E\left[X_{n,k}^2\mid \mathcal F_{n,k-1}\right] \leqslant \varepsilon^2+\mathbb E\left[X_{n,k}^2\mathbf 1\left\{ \left\lvert X_{n,k}\right\rvert\gt \varepsilon\right\} \mid \mathcal F_{n,k-1}\right]. $$

Taking the maximum over $k$, we get $$ \max_{1\leqslant k\leqslant k_n}\mathbb E\left[X_{n,k}^2\mid \mathcal F_{n,k-1}\right] \leqslant \varepsilon^2+\max_{1\leqslant k\leqslant k_n}\mathbb E\left[X_{n,k}^2\mathbf 1\left\{ \left\lvert X_{n,k}\right\rvert\gt \varepsilon\right\} \mid \mathcal F_{n,k-1}\right]. $$

Bounding the second term by $\sum_{k=1}^{k_n}\mathbb E\left[X_{n,k}^2\mathbf 1\left\{ \left\lvert X_{n,k}\right\rvert\gt \varepsilon\right\} \mid \mathcal F_{n,k-1}\right]$ and taking the expectation yields $$ \mathbb E\left[\max_{1\leqslant k\leqslant k_n}\mathbb E\left[X_{n,k}^2\mid \mathcal F_{n,k-1}\right]\right]\leqslant \varepsilon^2+\sum_{k=1}^{k_n}\mathbb E\left[X_{n,k}^2\mathbf 1\left\{ \left\lvert X_{n,k}\right\rvert\gt \varepsilon\right\} \right]. $$

By the Lindeberg condition, the sum on the right tends to $0$, so $\limsup_n \mathbb E\left[\max_{1\leqslant k\leqslant k_n}\mathbb E\left[X_{n,k}^2\mid \mathcal F_{n,k-1}\right]\right]\leqslant \varepsilon^2$; since $\varepsilon>0$ was arbitrary, the limit is $0$.
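The final inequality can be checked numerically on a toy array (my own example, not from the paper): take $X_{n,k}=a_k\xi_k$ with $\xi_k$ i.i.d. standard normal and deterministic weights $a_k^2 = 2k/(n(n+1))$, so that $\sum_k E[X_{n,k}^2\mid\mathcal F_{n,k-1}]=\sum_k a_k^2=1$ and, by independence, $E[\max_k E[X_{n,k}^2\mid\mathcal F_{n,k-1}]]=\max_k a_k^2 = 2/(n+1)\to 0$:

```python
import numpy as np

# Toy check (assumption, not from the paper) of the bound
#   E[max_k E[X_{n,k}^2 | F_{n,k-1}]]
#     <= eps^2 + sum_k E[X_{n,k}^2 1{|X_{n,k}| > eps}],
# for X_{n,k} = a_k * xi_k, xi_k i.i.d. N(0,1), a_k^2 = 2k/(n(n+1)).
rng = np.random.default_rng(1)
eps = 0.2
results = []
for n in [10, 100, 1000]:
    k = np.arange(1, n + 1)
    a = np.sqrt(2 * k / (n * (n + 1)))       # weights, sum(a**2) == 1
    xi = rng.standard_normal((20_000, n))
    X = a * xi                               # broadcast over Monte Carlo rows
    lindeberg = (X**2 * (np.abs(X) > eps)).mean(axis=0).sum()
    max_cond_var = 2 / (n + 1)               # exact, since E[X^2|F] = a_k^2
    results.append((max_cond_var, eps**2 + lindeberg))

for lhs, rhs in results:
    print(f"{lhs:.4f} <= {rhs:.4f}")
```

The left-hand side tends to $0$ as $n$ grows, and for each $n$ it stays below $\varepsilon^2$ plus the (estimated) Lindeberg sum, as the answer's bound predicts.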