Seeking help understanding steps in a proof in "Convergence of Probability Measures" by P. Billingsley.


To not waste anyone's time: this question is directed at people who either have the mentioned book in their possession or have at some point read it.

I am reading through Patrick Billingsley's book "Convergence of Probability Measures" (1999) and have encountered steps in the proof of Theorem 19.1 which I can't seem to follow.

The proof can be read here on Google Books (click the second box from the top): http://books.google.dk/books?id=6ItqtwaWZZQC&pg=RA1-PR48&hl=da&source=gbs_toc_r&cad=7#v=onepage&q=Theorem%2019.1&f=false

1)

In the final steps of the proof he writes: By Theorem 16.7 it is enough to show that $$ ||Y^n-X^n||_m=\frac{1}{\sigma\sqrt{n}}\max_{k\leq mn}|\theta_k -\theta_0| \stackrel{wk}{\to}_n 0, $$ which I think is a typo. Shouldn't he make a reference to Theorem 16.1 (which is about when we have convergence in $D_\infty$) and thereafter say that it follows from Theorem 3.1?

2)

Anyway, to prove the above convergence he writes that since $$ E\theta_0^2 < \infty \implies \sum_n P(\theta_n^2/n \geq \varepsilon)=\sum_n P(\theta_0^2 \geq n\varepsilon)<\infty \implies \theta_n^2 /n \to_n 0 \text{ P-a.s.,} $$ where the last implication follows from Borel-Cantelli. But $$ \frac{1}{n}\max_{k\leq mn} \theta_k^2 \leq \left(\max_{k\leq n_0}\theta_k^2/n \right) \lor \left(\sup_{k>n_0 }\theta_k^2/k \right), $$ which implies the weak convergence in 1).

There are several things in the above which I can't seem to reproduce. Firstly, I can't see how the sum converges when assuming that $E\theta_0^2 < \infty$. Secondly, I don't know how to prove the inequality, or how to use the former almost sure convergence in order to conclude that the RHS of the inequality converges a.s. to 0.

Any help would be appreciated.


I was able to show the weak convergence to 0 in 1).

Note that \begin{align*} \sup_{t\leq m} |X^n_t-Y^n_t| &= \sup_{t\leq m} \left|\frac{1}{\sigma\sqrt{n}}\sum_{k=1}^{\lfloor nt \rfloor} \xi_k- \frac{1}{\sigma\sqrt{n}}\sum_{k=1}^{\lfloor nt \rfloor} \eta_k\right| \\ &=\frac{1}{\sigma\sqrt{n}} \sup_{t\leq m} \left|\sum_{k=1}^{\lfloor nt \rfloor} \xi_k- \sum_{k=1}^{\lfloor nt \rfloor} \left(\xi_k + \delta_k - \delta_{k-1}\right)\right| \\ &=\frac{1}{\sigma\sqrt{n}} \sup_{t\leq m} \left|\delta_{\lfloor nt \rfloor }-\delta_0\right| \\ &=\frac{1}{\sigma\sqrt{n}} \max_{k \leq mn} \left|\delta_{k}-\delta_0\right| \\ &\leq \frac{1}{\sigma\sqrt{n}} \max_{k \leq mn} |\delta_{k}|+ \underbrace{\frac{\left|\delta_0\right|}{\sigma\sqrt{n}}}_{\stackrel{a.s.}{\rightarrow}_n 0}, \end{align*} so we are done if we can show that $\frac{1}{n} \max_{k \leq mn} \delta_{k}^2\to_n 0$ almost surely for all $m\in\mathbb{N}$. Note that as $n\to\infty$ also $mn\to\infty$, and $$ \frac{1}{n} \max_{k \leq mn} \delta_k^2 = m\, \frac{1}{mn}\max_{k \leq mn} \delta_k^2, $$ so it suffices to show that $$ \frac{1}{n} \max_{k \leq n} \delta_{k}^2 \stackrel{a.s.}{\rightarrow}_n 0. $$ We see that for any $\epsilon>0$: \begin{align*} \sum_{n=1}^\infty P( \delta_n^2 /n \geq \epsilon ) &= \sum_{n=1}^\infty P( \delta_n^2 \geq n\epsilon ) \\ &= \sum_{n=1}^\infty P( \delta_0^2 \geq n\epsilon ) && \text{(stationarity)} \\ &\leq \int_0^\infty P( \delta_0^2 \geq t\epsilon )\, dt && \text{(sum-integral comparison)} \\ &= \int_0^\infty P( \delta_0^2/\epsilon \geq t )\, dt \\ &= E(\delta_0^2/\epsilon)<\infty. \end{align*} Thus by Borel-Cantelli we get that $$\frac{\delta_n^2}{n}\stackrel{a.s.}{\rightarrow}_n 0.$$ Fix $\omega$ in the almost sure set where the above converges.
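The comparison of the sum with the integral can be spelled out as follows: since $t \mapsto P(\delta_0^2 \geq t\epsilon)$ is nonincreasing, each term of the sum is dominated by the integral over the unit interval to its left,

$$ \sum_{n=1}^\infty P(\delta_0^2 \geq n\epsilon) \leq \sum_{n=1}^\infty \int_{n-1}^{n} P(\delta_0^2 \geq t\epsilon)\, dt = \int_0^\infty P(\delta_0^2 \geq t\epsilon)\, dt, $$

and the last integral equals $E(\delta_0^2/\epsilon)$ by the layer-cake formula $EX = \int_0^\infty P(X \geq t)\, dt$ for $X \geq 0$.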
Fix $m\in\mathbb{N}$, then fix $\epsilon>0$ and choose $N_1 \in\mathbb{N}$ such that $$ \frac{\delta_k^2}{k} <\epsilon \quad \quad \text{for } k > N_1, $$ then choose $N_2\geq N_1$ such that $$ \frac{\max_{k\leq N_1}\delta_k^2}{n} < \epsilon\quad \quad \text{for } n \geq N_2. $$ Then for $n \geq N_2$ we get that $$ \frac{\max_{k\leq n}\delta_k^2}{n} \leq \frac{\max_{k\leq N_1}\delta_k^2}{n}+ \underbrace{\max_{N_1 < k\leq n}\frac{\delta_k^2}{n}}_{\leq\, \max_{N_1<k\leq n} \delta_k^2/k} < \epsilon + \epsilon = 2\epsilon, $$ which shows that $\sup_{t\leq m} |X^n_t-Y^n_t|\to_n 0$ almost surely for all $m\in\mathbb{N}$.
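As a quick numerical illustration (not part of the proof) that $\max_{k\leq n}\delta_k^2/n \to 0$, one can simulate the quantity for a concrete choice of $\delta_k$. Here I assume i.i.d. standard normals, which is just one example of a stationary sequence with $E\delta_0^2<\infty$; the argument above needs only stationarity and the finite second moment.

```python
import numpy as np

# Numerical illustration (not a proof): max_{k<=n} delta_k^2 / n should
# shrink as n grows when E[delta_0^2] < infinity.
# Assumption: delta_k i.i.d. standard normal -- one stationary sequence
# with a finite second moment; the book does not require independence.
rng = np.random.default_rng(0)

ratios = []
for n in [10**3, 10**4, 10**5, 10**6]:
    delta = rng.standard_normal(n)
    ratios.append(np.max(delta**2) / n)
    print(n, ratios[-1])
```

For normals $\max_{k\leq n}\delta_k^2$ grows only like $2\log n$, so the printed ratios decay roughly like $\log n / n$.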

I was not able to show that this implies that (I can't seem to use the theorem he referred to) $$ X^n \stackrel{wk}{\rightarrow}_n W $$ in $D([0,\infty))$.

If anyone is able to show this I would be very grateful.
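For what it's worth, here is a sketch of one possible route for the last step. It assumes the numbering of the 1999 edition (Theorem 3.1 being the "convergence together" theorem) and that the metric $d_\infty^\circ$ on $D([0,\infty))$ is dominated by uniform distances over compact intervals, roughly of the form

$$ d_\infty^\circ(X^n,Y^n) \leq \sum_{m=1}^\infty 2^{-m}\left(1 \wedge \sup_{t\leq m+1}|X^n_t-Y^n_t|\right); $$

check the exact definition in Section 16 before relying on this. Each summand tends to $0$ almost surely by the computation above and is bounded by $2^{-m}$, so dominated convergence gives $d_\infty^\circ(X^n,Y^n)\to_n 0$ almost surely, hence in probability. Since $Y^n \stackrel{wk}{\rightarrow}_n W$ by Donsker's theorem, Theorem 3.1 would then yield $X^n \stackrel{wk}{\rightarrow}_n W$ in $D([0,\infty))$.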