Convergence of a series of standard normal random variables


I need to prove that for independent standard normally distributed $X_{n}$ and $t\in [0,1]$ the series $$ tX_{0} + \sum_{n=1}^{\infty}\sqrt{2}\frac{\sin{(n\pi t)}}{n\pi}X_{n}$$ converges and that its limit $B_{t}$ satisfies $EB_{t} = 0$ and $E(B_{t}B_{s}) = \min(t,s)$.

I really don't even know how to do the first part, even though it apparently shouldn't be that hard...

Thanks for your help


There are 2 best solutions below


This is the Fourier construction of Brownian motion. Define $$ B_t^n:=tX_0+\sum_{i=1}^n \frac{\sqrt{2}\sin(i\pi t)}{i\pi}X_i. $$ The sequence $\{B_t^n\}$ converges in $L^2$ because $$ \operatorname{Var}(B_t^{n}-B_t^{m})=\sum_{i=m\wedge n+1}^{m\vee n}\frac{2\sin^2(i\pi t)}{i^2\pi^2}\to 0 \quad\text{as }n,m\to\infty $$ (since $\sin^2(i\pi t)\le 1$, this is dominated by a tail of the convergent series $\sum_i \frac{2}{i^2\pi^2}$), i.e., $\{B_t^n\}$ is $L^2$-Cauchy and thus has an $L^2$-limit $B_t$.
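A quick numeric sanity check (a Python sketch of my own, not part of the argument) that the tail variances in the $L^2$-Cauchy estimate really do vanish:

```python
import math

# Var(B_t^n - B_t^m) for m < n: the tail sum from the L^2-Cauchy argument.
def tail_var(m, n, t):
    return sum(2 * math.sin(i * math.pi * t) ** 2 / (i * math.pi) ** 2
               for i in range(m + 1, n + 1))

# Since sin^2 <= 1, tail_var(m, n, t) <= sum_{i>m} 2/(i*pi)^2 <= 2/(pi^2 * m),
# so tails starting later in the series are uniformly smaller.
t = 0.3
v_early = tail_var(10, 10_000, t)
v_late = tail_var(1_000, 10_000, t)
```

The dominating bound $2/(\pi^2 m)$ is uniform in $t$, which is what makes the construction work simultaneously for all $t\in[0,1]$.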

For $s\ne t$, $(B_s^{n},B_t^{n})\xrightarrow{p}(B_s,B_t)$, and a limit in probability of jointly Gaussian vectors is again Gaussian, so the limit is Gaussian with mean $0$; moreover $\operatorname{Cov}(B_s,B_t)=s\wedge t$ because $$ \operatorname{Cov}(B_s^{n},B_t^{n})=st+\sum_{i=1}^n \frac{2\sin(i\pi s)\sin(i\pi t)}{i^2\pi^2}\to s\wedge t $$ as $n\to\infty$ (this series is the Fourier sine expansion of $s\wedge t-st$ on $[0,1]$).
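The limiting identity can be checked numerically; here is a short Python sketch (my addition, truncating the covariance series at a large $n$):

```python
import math

# Partial sums of Cov(B_s^n, B_t^n) = s*t + sum_i 2 sin(i pi s) sin(i pi t)/(i pi)^2,
# which should approach min(s, t) for s, t in [0, 1].
def cov_partial(s, t, n):
    return s * t + sum(
        2 * math.sin(i * math.pi * s) * math.sin(i * math.pi * t) / (i * math.pi) ** 2
        for i in range(1, n + 1)
    )

# Truncation error is at most 2/(pi^2 * n), so n = 100_000 gives ~2e-6 accuracy.
n = 100_000
max_err = max(abs(cov_partial(s, t, n) - min(s, t))
              for s in (0.2, 0.5, 0.9) for t in (0.1, 0.5, 0.7))
```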


Let's start by showing that the sum converges, using the mgf. Denote the sum by $B_t$. We know each $X_n$ has expected value $\mu = 0$ and variance $\sigma^2 = 1$, so the moment generating function is $M_{X_n}(z) = \exp(\mu z + \frac{1}{2}\sigma^2z^2) = \exp(z^2/2).$ Then \begin{align*}M_{B_t}(z) = E[e^{zB_t}] &= E[e^{ztX_0 + z\sum_{n=1}^\infty \sqrt{2} \frac{\sin(n\pi t)}{n\pi}X_n}]\\&= E[e^{ztX_0} \cdot \prod_{n=1}^\infty e^{z\sqrt{2} \frac{\sin(n\pi t)}{n\pi}X_n}]\\&= E[e^{ztX_0}]\cdot \prod_{n=1}^\infty E[e^{z\sqrt{2} \frac{\sin(n\pi t)}{n\pi}X_n}], \quad\text{see note 1}\\ &= M_{X_0}(zt) \cdot \prod_{n=1}^\infty M_{X_n}(z\sqrt{2} \frac{\sin(n\pi t)}{n\pi})\\ &= \exp(z^2t^2/2) \cdot \prod_{n=1}^\infty \exp(z^2 \frac{\sin^2(n\pi t)}{n^2\pi^2})\\ &= \exp\left(z^2t^2/2 + z^2 \sum_{n=1}^\infty \frac{\sin^2(n\pi t)}{n^2 \pi^2}\right).\end{align*}

Now notice that $$\sum_{n=1}^\infty \frac{\sin^2(n\pi t)}{n^2 \pi^2} \leq \sum_{n=1}^\infty \frac{1}{n^2 \pi^2} = \frac{1}{\pi^2}\cdot\frac{\pi^2}{6} = \frac{1}{6},$$ so the sum converges and we may choose $c \geq 0$ such that $$\sum_{n=1}^\infty \frac{\sin^2(n\pi t)}{n^2\pi^2} = \frac{2ct + c^2}{2};$$ such a $c$ exists by the quadratic formula, and note that $c$ does depend on $t$. Then \begin{align*}\exp\left(z^2t^2/2 + z^2 \sum_{n=1}^\infty \frac{\sin^2(n\pi t)}{n^2 \pi^2}\right) = \exp(z^2 t^2/2 + z^2(2ct + c^2)/2) = \exp(z^2(t+c)^2/2),\end{align*} which means $B_t$ is a normal random variable with mean $0$ and variance $(t+c)^2$.
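One can in fact pin down $c$ explicitly. Here is a numeric check (my addition; it uses the target covariance $E(B_tB_s)=\min(t,s)$ from the problem statement): numerically $t^2 + 2\sum_n \sin^2(n\pi t)/(n^2\pi^2) = t$, i.e. $\operatorname{Var}(B_t)=\min(t,t)=t$ and hence $c=\sqrt{t}-t$.

```python
import math

# Sigma(t) = sum_{k>=1} sin^2(k pi t) / (k pi)^2, truncated at a large k.
def sigma(t, n=100_000):
    return sum(math.sin(k * math.pi * t) ** 2 / (k * math.pi) ** 2
               for k in range(1, n + 1))

# The variance claimed above is t^2 + 2*Sigma(t); solving 2*c*t + c^2 = 2*Sigma(t)
# by the quadratic formula (taking the nonnegative root) gives
# c = -t + sqrt(t^2 + 2*Sigma(t)).
t = 0.37
var_t = t ** 2 + 2 * sigma(t)   # numerically equals t
c = -t + math.sqrt(var_t)       # numerically equals sqrt(t) - t
```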

Thus $E[B_t] = 0$. We could also verify this after knowing the sum converges by applying linearity of expectation as follows: $$E[B_t]= tE[X_0] + \sum_{n=1}^\infty \sqrt{2}\frac{\sin(n \pi t)}{n\pi}E[X_n] = 0.$$

Lastly, \begin{align*} B_tB_s = stX_0^2 + \sum_{k=1}^\infty s\sqrt{2} \frac{\sin(k\pi t)}{k\pi}X_0X_k + \sum_{k=1}^\infty t\sqrt{2} \frac{\sin(k\pi s)}{k\pi}X_0X_k + \sum_{n,m \geq 1} 2\frac{\sin(n\pi t)\sin(m\pi s)}{nm\pi^2}X_nX_m\end{align*} and $E[X_i X_j] = E[X_i]E[X_j] = 0$ for $i \neq j$, so $$E[B_tB_s] = st E[X_0^2] + \sum_{n=1}^\infty 2 \frac{\sin(n\pi t)\sin(n\pi s)}{n^2\pi^2}E[X_n^2].$$ Since $1 = \mathrm{Var}[X_i] = E[X_i^2] - E[X_i]^2 = E[X_i^2]$ we can simplify, using the product-to-sum formula $2\sin A\sin B = \cos(A-B)-\cos(A+B)$, to \begin{align*}E[B_sB_t] &= st + \sum_{n=1}^\infty 2 \frac{\sin(n\pi t)\sin(n\pi s)}{n^2\pi^2}\\ &= st + \sum_{n=1}^\infty\frac{\cos(\pi n (s-t))-\cos(\pi n (s+t))}{\pi^2 n^2},\quad \text{WLOG assume }s\geq t\\ &= st + \frac{\mathrm{Li}_2(e^{i\pi(s-t)})+ \mathrm{Li}_2(e^{-i\pi(s-t)}) - \mathrm{Li}_2(e^{i\pi(s+t)})- \mathrm{Li}_2(e^{-i\pi(s+t)})}{2\pi^2},\quad \mathrm{Li}_2 \text{ is the dilogarithm}\\ &= st + \frac{\frac{-(2\pi i)^2}{2!}B_2\left(\frac{s-t}{2}\right)-\frac{-(2\pi i)^2}{2!}B_2\left(\frac{s+t}{2}\right)}{2\pi^2},\quad B_2(x)=x^2-x+\tfrac{1}{6} \text{ is the Bernoulli polynomial}\\ &= st +B_2\left(\frac{s-t}{2}\right) - B_2\left(\frac{s+t}{2}\right)\\ &= st - st + t\\ &= \min(s,t).\end{align*}
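The dilogarithm/Bernoulli step rests on the classical Fourier series $\sum_{n\ge1}\cos(2\pi n x)/n^2=\pi^2B_2(x)$ for $0\le x\le1$; here is a small Python check (my addition) of that identity and of the final value:

```python
import math

def B2(x):
    # Bernoulli polynomial B_2(x) = x^2 - x + 1/6
    return x * x - x + 1.0 / 6.0

def cos_series(x, n=100_000):
    # Truncation of sum_{k>=1} cos(2 pi k x)/k^2, equal to pi^2 * B_2(x) on [0, 1]
    return sum(math.cos(2 * math.pi * k * x) / k ** 2 for k in range(1, n + 1))

# Substituting x = (s-t)/2 and x = (s+t)/2 gives, for 0 <= t <= s <= 1,
# E[B_s B_t] = s*t + B_2((s-t)/2) - B_2((s+t)/2) = min(s, t).
s, t = 0.8, 0.3
cov = s * t + B2((s - t) / 2) - B2((s + t) / 2)
```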

  1. Here we use the property that if $U,V$ are independent RVs then $f(U)$ and $g(V)$ are also independent, and that if $X,Y$ are independent then $E[XY] = E[X]E[Y]$.