Donsker-Theorem-like result for a sum of partial sums


Recently I came up with the following question. Let $X_i$ be i.i.d. random variables with $\mathbb{E}[X_1] = 0$ and $\mathbb{V}\text{ar}(X_1) = 1$. We know by Donsker's theorem that: $$\frac{S_n}{\sqrt{n}} \to B(1) \sim N(0,1)$$ This is Donsker's theorem at $t=1$, using the special fact that $\mathbb{E}[X_1] = 0$ and $\mathbb{V}\text{ar}(X_1) = 1$. Now my question is how to find the following limit: $$\lim_{n \to \infty} \frac{1}{n^{3/2}}\sum_{k=1}^n S_k$$ My first idea was to pull the factor $n^{1/2}$ inside and write: $$\lim_{n \to \infty} \frac{1}{n}\sum_{k=1}^n \frac{S_k}{n^{1/2}} = \lim_{n \to \infty} \frac{1}{n}\sum_{k=1}^n \frac{S_k}{k^{1/2}}\cdot \frac{k^{1/2}}{n^{1/2}}$$ But from here I have no idea how to proceed. I may be wrong, but it looks like I get many normal-like distributions, each scaled by a certain factor, and then I take the mean of all of them. I am lost as to how to find a closed-form value for the limit. Any ideas?



BEST ANSWER

Unless I’m quite mistaken, the convergence of $n^{-1/2} S_n$ to $B(1)$ is only in distribution. So if we’re interested in the convergence of $V_n=n^{-3/2}\sum_{k=1}^n{S_k}$, it should probably work only in distribution.

Let $f(t)$ be the expected value of $e^{itX_1}$. Then the expected value $e_n(t)$ of $e^{itV_n}$ is $\prod_{k=1}^n{f\left(\frac{kt}{n^{3/2}}\right)}$.

So if $t$ is fixed and $n$ is large enough, $\ln{e_n(t)}$ is well defined and is the sum of the (well-defined) $\ln{f(kt/n^{3/2})}$ for $1 \leq k \leq n$. But as $X_1$ is $L^2$ with zero mean and unit variance, $\ln{f(kt/n^{3/2})}=-k^2t^2/(2n^3) + o(k^2/n^3)$. Summing over $k$ and using $\sum_{k=1}^n k^2 = \frac{n(n+1)(2n+1)}{6} \sim \frac{n^3}{3}$, it follows that for large enough $n$, $\ln{e_n(t)}=-t^2/6+o(1)$, i.e. $e_n(t) \rightarrow e^{-t^2/6}$.

Thus $V_n$ converges in distribution to a centered Gaussian; since $e^{-t^2/6}$ is the characteristic function of $\mathcal{N}(0,\tfrac{1}{3})$, the limit is $\mathcal{N}(0,\tfrac{1}{3})$.
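Since the claim is a distributional limit, a quick Monte Carlo sanity check may be reassuring. The following sketch (the choices of $n$, the sample count, and standard normal increments are arbitrary assumptions for illustration) estimates the mean and variance of $V_n = n^{-3/2}\sum_{k=1}^n S_k$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 400, 10000                      # n = number of increments, m = Monte Carlo samples

X = rng.standard_normal((m, n))        # i.i.d. mean-0, variance-1 increments
S = np.cumsum(X, axis=1)               # partial sums S_1, ..., S_n for each sample
V = S.sum(axis=1) / n**1.5             # V_n = n^{-3/2} * (S_1 + ... + S_n)

print(np.mean(V), np.var(V))           # mean near 0, variance near 1/3
```

The empirical variance should land close to $1/3$, consistent with $e_n(t) \to e^{-t^2/6}$.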


As pointed out by @Mindlack, the convergence holds only in distribution. Here are some alternative ways of computing the limiting distribution:

1st Solution. For each $n \geq 1$ and $1 \leq k \leq n$, write

$$ X_{n,k} = \frac{n+1-k}{n^{3/2}} X_k \qquad \text{and} \qquad T_n = \frac{1}{n^{3/2}} \sum_{k=1}^{n} S_k = \sum_{k=1}^{n} X_{n,k} . $$

Then $\mathbf{E}[T_n] = 0$ and $\mathbf{Var}(T_n) = \frac{n(n+1)(2n+1)}{6n^3} \to \frac{1}{3} $ as $n\to\infty$. Moreover, for each $\varepsilon > 0$,

\begin{align*} \sum_{k=1}^{n} \mathbf{E}\bigl[ X_{n,k}^2 \mathbf{1}_{\{|X_{n,k}| > \varepsilon\}} \bigr] &= \sum_{k=1}^{n} \mathbf{E}\biggl[ \frac{k^2 X_1^2}{n^3} \mathbf{1}_{\{k|X_1| > \varepsilon n^{3/2}\}} \biggr] \\ &\leq \sum_{k=1}^{n} \mathbf{E}\biggl[ \frac{n^2 X_1^2}{n^3} \mathbf{1}_{\{n|X_1| > \varepsilon n^{3/2}\}} \biggr] \\ &= \mathbf{E} \bigl[ X_1^2 \mathbf{1}_{\{|X_1| > \varepsilon n^{1/2}\}} \bigr] \\ &\to 0 \quad \text{as} \quad n \to \infty. \end{align*}

So by the Lindeberg CLT, $T_n$ converges in distribution to a normal distribution with mean $\lim_n \mathbf{E}[T_n] = 0$ and variance $\lim_n \mathbf{Var}(T_n) = \frac{1}{3}$.
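The closed form for $\mathbf{Var}(T_n)$ used above can be checked numerically. This short sketch compares the direct sum $\sum_{k=1}^{n} \bigl(\tfrac{n+1-k}{n^{3/2}}\bigr)^2$ with $\frac{n(n+1)(2n+1)}{6n^3}$ and its limit $1/3$:

```python
# Var(T_n) = sum_k ((n+1-k)/n^{3/2})^2, since the X_k are independent with variance 1.
def var_Tn(n: int) -> float:
    return sum(((n + 1 - k) / n**1.5) ** 2 for k in range(1, n + 1))

for n in (10, 100, 10000):
    closed_form = n * (n + 1) * (2 * n + 1) / (6 * n**3)
    print(n, var_Tn(n), closed_form)   # the two agree, and both approach 1/3
```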

2nd Solution. Writing

$$ T_n = \sum_{k=1}^{n} \frac{S_k}{\sqrt{n}} \cdot \frac{1}{n}, $$

an application of Donsker's invariance principle shows that this converges in distribution to

$$ \int_{0}^{1} W_t \, \mathrm{d}t, $$

where $(W_t)_{t\geq 0}$ is a standard Brownian motion in 1D. Being a "linear combination" of jointly normal variables, this integral is again a normal variable. Then

$$ \mathbf{E}\biggl[\int_{0}^{1} W_t \, \mathrm{d}t\biggr] = \int_{0}^{1} \mathbf{E}[W_t] \, \mathrm{d}t = 0 $$

and

$$ \mathbf{Var}\biggl(\int_{0}^{1} W_t \, \mathrm{d}t\biggr) = \int_{0}^{1} \int_{0}^{1} \mathbf{Cov}(W_s, W_t) \, \mathrm{d}s \mathrm{d}t = \int_{0}^{1} \int_{0}^{1} (s \wedge t) \, \mathrm{d}s \mathrm{d}t = \frac{1}{3}, $$

hence we have $ \int_{0}^{1} W_t \, \mathrm{d}t \sim \mathcal{N}(0, \tfrac{1}{3}) $.
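As a numerical illustration of this second solution (the step count and number of paths are arbitrary choices), one can discretize a Brownian path on $[0,1]$ and check that the Riemann-sum approximation of $\int_0^1 W_t \, \mathrm{d}t$ has variance close to $1/3$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 1000, 10000                              # time steps per path, number of paths

dW = rng.standard_normal((m, n)) / np.sqrt(n)   # Brownian increments with variance 1/n
W = np.cumsum(dW, axis=1)                       # W_{1/n}, W_{2/n}, ..., W_1 per path
I = W.mean(axis=1)                              # Riemann sum for the integral of W_t on [0,1]

print(np.var(I))                                # close to 1/3
```

This matches the covariance computation $\int_0^1 \int_0^1 (s \wedge t)\,\mathrm{d}s\,\mathrm{d}t = \frac{1}{3}$.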