Proof of Skorohod embedding


This statement and its proof are taken from a book by Ludger Rüschendorf.

Skorohod embedding: Let $B$ be a standard Brownian motion and $X_1,X_2,\ldots$ iid random variables with $E[X_1]=0$ and $E[X_1^2]<\infty$. Then there exist iid random variables $T_1, T_2, \ldots$ with $E[T_1]=E[X_1^2]$ such that $S_n:=\sum_{i=1}^n T_i$ defines an increasing sequence $S_1, S_2,\ldots$ of stopping times with respect to the filtration $( \mathcal{F}_t)_{t\in[0,\infty)}$, where $ \mathcal{F}_t:=\sigma(B_s:0\le s\le t)$.

Additionally it holds

  1. $P^{B_{S_{n}}-B_{S_{n-1}}}=P^ {X_1}$
  2. $(B_{S_{n}}-B_{S_{n-1}})_{n\ge 1}\overset{law}{=}(X_n)_{n\ge 1}$
  3. $(B_{S_n})_{n\ge 1}\overset{law}{=}(\sum_{i=1}^n X_i)_{n\ge 1}$

Proof:

Using the Skorohod construction we find a stopping time $T_1$ with $$E[T_1]=E[X_1^2]\quad \text{and}\quad B_{T_1}\overset{law}{=}X_1$$
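The construction behind this step can be illustrated numerically. The following Monte Carlo sketch is my own illustration, not from the book: in the simplest case $X=\pm 1$ with probability $1/2$ (so $E[X]=0$, $E[X^2]=1$), the classical choice of stopping time is the first exit time of $(-1,1)$, which gives $B_T\overset{law}{=}X$ and $E[T]=1=E[X^2]$. The step size and sample count are arbitrary simulation choices.

```python
import random
import statistics

# Monte Carlo sketch (illustration only, not from the book) of the Skorohod
# construction in the simplest case: X = +-1 with probability 1/2, so
# E[X] = 0 and E[X^2] = 1. The classical stopping time is the first exit
# time of (-1, 1); then B_T has the law of X and E[T] = 1 = E[X^2].
# Step size dt and the sample count are arbitrary simulation choices.

def exit_time(a=-1.0, b=1.0, dt=1e-3, rng=random):
    """Simulate B until it leaves (a, b); return (T, B_T), B_T snapped to the boundary."""
    t, x = 0.0, 0.0
    while a < x < b:
        x += rng.gauss(0.0, dt ** 0.5)  # Brownian increment over a step of length dt
        t += dt
    return t, (a if x <= a else b)

random.seed(0)
samples = [exit_time() for _ in range(2000)]
mean_T = statistics.fmean(t for t, _ in samples)
mean_BT = statistics.fmean(x for _, x in samples)
print(round(mean_T, 2), round(mean_BT, 2))  # should be near 1.0 and 0.0
```

Up to discretization error, the empirical mean of $T$ is close to $E[X^2]=1$ and the empirical mean of $B_T$ is close to $E[X]=0$.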

$(B_{t+T_1}-B_{T_1})_{t\ge 0}$ is again a Brownian motion by the strong Markov property of Brownian motion. Again we can find a stopping time $T_2$ with

$$E[T_2]=E[X_2^2]\quad\text{and}\quad B_{T_1+T_2}-B_{T_1}\overset{law}{=}X_2$$

Since the increments of a Brownian motion are independent, $\color{blue}{\text{it follows that } T_2\text{ is independent of } \mathcal{F}_{T_1}\text{ and therefore }T_2\text{ is independent of }T_1.}$ ($\mathcal{F}_{T_1}$ denotes the $\sigma$-algebra of the $T_1$-past.) By induction we find a sequence of stopping times $S_1,S_2,\ldots$ with

$$S_n=S_{n-1}+T_n,\quad B_{S_n}-B_{S_{n-1}}\overset{law}{=}X_n\quad\text{and}\quad E[T_n]=E[X_n^2], $$ $\color{blue}{\text{such that } T_n \text{ is independent of }T_1,\ldots, T_{n-1}\text{ and } B_{S_n}-B_{S_{n-1}}\text{ is independent of }\mathcal{F}_{T_{n-1}}.}$ From that we find $$(B_{S_{n}}-B_{S_{n-1}})_{n\ge 1}\overset{law}{=}(X_n)_{n\ge 1}$$ $\color{blue}{\text{Therefore}}$

$$\color{blue}{(B_{S_n})_{n\ge 1}\overset{law}{=}(\sum_{i=1}^n X_i)_{n\ge 1}}\quad\text{and}\quad E[S_n]=\sum_{i=1}^nE[X_i^2]$$

First blue sentence: How exactly can I show that $T_2$ and $T_1$ are independent? I know that from the strong Markov property it also follows that $(B_{t+T_1}-B_{T_1})_{t\ge 0}$ is independent of $\mathcal{F}_{T_1}$. Then I was thinking that $B_{T_2}\overset{law}{=}B_{T_2+T_1}-B_{T_1}$ must be independent of $\mathcal{F}_{T_1}$, too. But now I understand neither how to use this to prove that $T_2$ is independent of $\mathcal{F}_{T_1}$ nor why $T_2$ is independent of $T_1$.

Second blue sentence: I see that again the strong Markov property of Brownian motion is used, so that

$$B_{T_n+(T_{n-1}+\ldots+ T_1)}-B_{T_{n-1}+\ldots+ T_1}$$ is independent of $\mathcal{F}_{T_{n-1}+\ldots+ T_1}$. Again, as in the first blue sentence, $T_n$ should be independent of $T_{n-1}+\ldots+ T_1$, but does this already imply that $T_n$ is independent of each of $T_{n-1},\ldots, T_1$?

Third blue sentence: My idea would be using $$f:\mathbb{R}^\mathbb{N}\rightarrow \mathbb{R}^\mathbb{N},\quad f(x_1,x_2,x_3,\ldots)=(x_1,x_2+x_1,x_3+x_2+x_1,\ldots),$$ such that

$$ (B_{S_n})_{n\ge 1}=f\big( (B_{S_{n}}-B_{S_{n-1}})_{n\ge 1}\big)\overset{law}{=}f\big((X_n)_{n\ge 1}\big) =(\sum_{i=1}^n X_i)_{n\ge 1}$$

But is this really possible here? Is $f$ a well-defined function here?
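For intuition, $f$ is simply the partial-sum map: its $n$-th output coordinate depends only on $x_1,\ldots,x_n$ and is continuous as a function $\mathbb{R}^n\rightarrow\mathbb{R}$, which is why $f$ is well defined and measurable with respect to the product $\sigma$-algebra on $\mathbb{R}^\mathbb{N}$. A finite-prefix sketch in Python (purely illustrative):

```python
from itertools import accumulate, islice

# Finite-prefix sketch of the map f above: f is the partial-sum map, and its
# n-th output coordinate depends only on x_1, ..., x_n (and is continuous as
# a function R^n -> R), which is why f is well defined and measurable with
# respect to the product sigma-algebra on R^N.

def f(xs):
    """Partial-sum map; works on any (possibly infinite) iterable of reals."""
    return accumulate(xs)

prefix = list(islice(f(iter([1, -2, 3, -4])), 4))
print(prefix)  # [1, -1, 2, -2]
```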

Answer:

By the Skorohod construction, there is a stopping time $T_1$ with respect to $\big(\sigma(B_s:0\le s\le t)\big)_{t\ge 0}$ with $$E[T_1]=E[X_1^2]\quad \text{and}\quad B_{T_1}\overset{law}{=}X_1$$

By the strong Markov property, $(B_{t+T_1}-B_{T_1})_{t\ge 0}$ is again a Brownian motion and $$\sigma (B_{s+T_1}-B_{T_1}:s\ge 0)\text{ is independent of }\mathcal{F}_{T_1}$$

By the Skorohod construction, there is a stopping time $T_2$ with respect to $\big(\sigma(B_{s+T_1}-B_{T_1}:0\le s\le t)\big)_{t\ge 0}$ with

$$E[T_2]=E[X_2^2]\quad\text{and}\quad B_{T_1+T_2}-B_{T_1}\overset{law}{=}X_2$$

By definition, $T_2$ is $\sigma (B_{s+T_1}-B_{T_1}:s\ge 0)$-measurable and therefore independent of $\mathcal{F}_{T_1}$. Since $T_1$ is $\mathcal{F}_{T_1}$-measurable, it follows from the independence above that $T_2$ is independent of $T_1$.

By induction, there are stopping times $T_n$ independent of $\mathcal{F}_{S_{n-1}}$, where $S_{n-1}:=\sum_{i=1}^{n-1}T_i$. Since $T_i\ge 0$ for all $i\ge 1$, $S_{n-1}$ is in fact a stopping time. Using $\mathcal{F}_{T_1},\ldots, \mathcal{F}_{T_{n-1}}\subset\mathcal{F}_{S_{n-1}}$, we see that $T_n$ is independent of each $T_i$ with $1\le i\le n-1$.

By that, we have found a sequence of stopping times $S_1,S_2,\ldots$ with

$$S_n=S_{n-1}+T_n,\quad B_{S_n}-B_{S_{n-1}}\overset{law}{=}X_n\quad\text{and}\quad E[T_n]=E[X_n^2], $$ such that $T_n$ is independent of $T_1,\ldots, T_{n-1}$ and $B_{S_n}-B_{S_{n-1}}$ is independent of $\mathcal{F}_{S_{n-1}}$. Note that every $S_n$ is a stopping time with respect to $\big(\sigma(B_s:0\le s\le t)\big)_{t\ge 0}$: $T_1$ is such a stopping time, and since all $T_i\ge 0$, one checks inductively that $\{S_n\le t\}\in \mathcal{F}_t$.

Therefore $$(B_{S_{n}}-B_{S_{n-1}})_{n\ge 1}\overset{law}{=}(X_n)_{n\ge 1}$$ and

$$ (B_{S_n})_{n\ge 1}=f\big( (B_{S_{n}}-B_{S_{n-1}})_{n\ge 1}\big)\overset{law}{=}f\big((X_n)_{n\ge 1}\big) =(\sum_{i=1}^n X_i)_{n\ge 1}$$ for $f:\mathbb{R}^\mathbb{N}\rightarrow \mathbb{R}^\mathbb{N}$, $f(x_1,x_2,x_3,\ldots)=(x_1,x_2+x_1,x_3+x_2+x_1,\ldots)$.

Finally, $E[S_n]=\sum_{i=1}^nE[X_i^2]$ holds by linearity of expectation.
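As a numerical sanity check on statements 2 and 3 (my own sketch, not part of the answer), one can iterate the two-point exit-time construction from the first code block: after each stop, the construction restarts from the current value of $B$, which is exactly what the strong Markov property justifies. With $X_i=\pm 1$, $B_{S_2}$ should then have the law of $X_1+X_2$, i.e. equal $0$ with probability $1/2$.

```python
import random

# Sanity check for statements 2 and 3 in the simplest case X_i = +-1 with
# probability 1/2 each: iterate the two-point exit-time construction,
# restarting from the current value of B after each stop (this is what the
# strong Markov property justifies). Then B_{S_2} has the law of X_1 + X_2,
# so it should equal 0 with probability 1/2. Parameters are illustrative.

def embed_steps(n, dt=1e-3, rng=random):
    """Return the simulated values B_{S_1}, ..., B_{S_n}."""
    level, out = 0.0, []
    for _ in range(n):
        x = level
        while abs(x - level) < 1.0:          # exit time of (level-1, level+1)
            x += rng.gauss(0.0, dt ** 0.5)   # Brownian increment over dt
        level += 1.0 if x > level else -1.0  # snap B_{S_k} to the boundary
        out.append(level)
    return out

random.seed(1)
runs = [embed_steps(2) for _ in range(1000)]
frac_zero = sum(1 for r in runs if r[1] == 0.0) / len(runs)
print(round(frac_zero, 2))  # should be near 0.5
```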