Revuz and Yor's Book “Continuous Martingales and Brownian Motion” - Chapter 1 - Exercise 1.19


Context: This post is the second in a series of posts originating from the exercises in Revuz and Yor's book "Continuous Martingales and Brownian Motion". The reason for doing so is that the exercises of this book are hard, sometimes very hard, but still very interesting, and there is no definitive or authoritative source for their solutions. I am not alone on this project, but I am starting with the first exercises of the book (which are easy and for which I am posting an answer). Lastly, I would like this post (and the next ones) to get "community wiki" status, but I do not have clearance for doing so at the question level, so I will only do this for the answer. (first contribution here)

Exercise 1.19

  1. If $X$ is a $BM^d$, prove that for every $x\in \mathbb R^d$ with $\|x\|=1$, the process $\langle x,X_t\rangle$ is a linear $BM$.

  2. Prove that the converse is false. One may use the following example: if $B=(B^1,B^2)$ is a $BM^2$, set the coordinates of $X$ to be $$X_t^1=B_{2t/3}^1-B_{t/3}^2$$ and $$X_t^2=B_{t/3}^1+B_{2t/3}^2.$$


Please feel free to amend this solution to improve it or correct mistakes, or even to propose a new one of your own.

Solution to Exercise 1.19:

1. Almost sure continuity results from composing the continuous mapping $\langle x,\cdot\rangle$ with the a.s. continuous trajectories of $X$. So we only have to prove that the increments over any interval $[s,t]$ of the process $Y_t^x=\langle x,X_t\rangle$ are normally distributed with mean $0$ and variance $t-s$, and that they are independent. First, as $X$ is a multi-dimensional Gaussian process whose increments are Gaussian vectors at all times, any linear combination of its coordinates is a one-dimensional Gaussian random variable, so we are left to compute the first two moments.

As $\|x\|^2=\sum_{i=1}^d x_i^2 = 1$, we have:

$Y_t^x-Y_s^x=\langle x,X_t\rangle-\langle x,X_s\rangle =\sum_{i=1}^d x_i(X_t^i-X_s^i)$

So $$\mathbb E[Y_t^x-Y_s^x]=\mathbb E\Big[\sum_{i=1}^d x_i(X_t^i-X_s^i)\Big]=\sum_{i=1}^d x_i\,\mathbb E[X_t^i-X_s^i] =\sum_{i=1}^d x_i\cdot 0 = 0.$$ Indeed, as $X$ is a $BM^d$, each of its coordinates is a $BM^1$, whose increments have mean $0$.

By the same principle, and as the increments have null means, we also have: $$\operatorname{Var}[Y_t^x-Y_s^x]=\mathbb E[(Y_t^x-Y_s^x)^2]-(\mathbb E[Y_t^x-Y_s^x])^2 = \mathbb E\Big[\Big(\sum_{i=1}^d x_i(X_t^i-X_s^i)\Big)^2\Big]-0 =\sum_{i=1}^d x_i^2\,\mathbb E[(X_t^i-X_s^i)^2] + 2\sum_{1\le i<j\le d} x_ix_j\,\mathbb E[(X_t^i-X_s^i)(X_t^j-X_s^j)]$$

But $\mathbb E[(X_t^i-X_s^i)(X_t^j-X_s^j)]=0$ for $i\ne j$ and $\mathbb E[(X_t^i-X_s^i)^2] = t - s$, because $X$ is a $BM^d$ (independent coordinates, each a $BM^1$), so we have: $$\operatorname{Var}[Y_t^x-Y_s^x]=\sum_{i=1}^d x_i^2(t-s)=(t-s)\sum_{i=1}^d x_i^2 = (t-s)\,\|x\|^2= (t-s)\cdot 1= t - s$$
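These two moment computations reduce to the covariance identity $\mathbb E[B_aB_b]=a\wedge b$ applied coordinate by coordinate, so they can be checked mechanically. Here is a minimal numerical sketch (the function name is ours, not from the book):

```python
# For X a BM^d with independent coordinates and s < t,
# Var[<x, X_t> - <x, X_s>] = sum_i x_i^2 * (t - 2*(s ∧ t) + s) = ||x||^2 (t - s);
# the cross terms i != j vanish by independence of the coordinates.
def var_projected_increment(x, s, t):
    return sum(xi ** 2 * (min(t, t) - 2 * min(s, t) + min(s, s)) for xi in x)

x = [1 / 3, 2 / 3, 2 / 3]                     # a unit vector of R^3
print(var_projected_increment(x, 0.5, 2.0))   # ≈ 1.5 = t - s
```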

The last point left to prove is the independence of the increments. As the increments of $Y^x$ have been shown to be Gaussian, it is sufficient to check that the covariance of the increments over two non-overlapping intervals is $0$. So let us do that for $(s,t,u,v)\in(\mathbb R^+)^4$ with $s<t<u<v$:

$$\operatorname{Cov}[Y_t^x-Y_s^x,\,Y_v^x-Y_u^x] = \mathbb E[(Y_t^x-Y_s^x)(Y_v^x-Y_u^x)]-\mathbb E[Y_t^x-Y_s^x]\,\mathbb E[Y_v^x-Y_u^x] = \mathbb E\Big[\Big(\sum_{i=1}^d x_i(X_t^i-X_s^i)\Big)\Big(\sum_{j=1}^d x_j(X_v^j-X_u^j)\Big)\Big]- 0,$$ since it has been shown above that the expectations of the increments are null. Now we expand the product: $$\operatorname{Cov}[Y_t^x-Y_s^x,\,Y_v^x-Y_u^x]=\sum_{i=1}^d x_i^2\,\mathbb E[(X_t^i-X_s^i)(X_v^i-X_u^i)] + \sum_{i\ne j} x_ix_j\,\mathbb E[(X_t^i-X_s^i)(X_v^j-X_u^j)]=0+0.$$ The first sum is null because for all $i=1,\dots,d$, $\mathbb E[(X_t^i-X_s^i)(X_v^i-X_u^i)]=0$: each $X^i$ is a $BM^1$, whose increments over non-overlapping intervals are independent by hypothesis and hence have null cross expectations. The second sum is also null because for all $i,j=1,\dots,d$ with $i\ne j$, $\mathbb E[(X_t^i-X_s^i)(X_v^j-X_u^j)]=0$; this comes from the coordinate-wise independence of the $BM^d$ rather than from the non-overlapping of the intervals.
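This vanishing covariance also follows mechanically from $\mathbb E[B_aB_b]=a\wedge b$: on disjoint intervals each surviving diagonal term telescopes to zero. A minimal numerical sketch (the helper name is ours, not from the book):

```python
# Cov[Y_t - Y_s, Y_v - Y_u] for Y = <x, X>, X a BM^d, with s < t <= u < v.
# By independence of the coordinates only the i == j terms survive, and each
# reduces to t∧v - t∧u - s∧v + s∧u = t - t - s + s = 0 on disjoint intervals.
def cov_disjoint_increments(x, s, t, u, v):
    return sum(xi ** 2 * (min(t, v) - min(t, u) - min(s, v) + min(s, u))
               for xi in x)

print(cov_disjoint_increments([0.6, 0.8], 1.0, 2.0, 3.0, 5.0))   # 0.0
```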

QED

2. This is the interesting part of the problem. We start with $B$ a $BM^2$ and define $X$ as in the hint by: $$X_t^1=B_{2t/3}^1-B_{t/3}^2$$ and $$X_t^2=B_{t/3}^1+B_{2t/3}^2.$$ First, the calculations carried out below will show that the increments $Y_t^x-Y_s^x$ are independent and normally distributed with $\mathbb E[Y_t^x-Y_s^x]=0$ and $\operatorname{Var}[Y_t^x-Y_s^x]=t-s$. Moreover, the almost sure continuity of the trajectories of $Y_\cdot^x$ follows from the same arguments as in the solution to question 1 above, so that $Y^x$ is indeed a $BM^1$.

First, for all $t>0$, $Y_t^x=\sum_{i=1}^2 x_iX_t^i = x_1(B_{2t/3}^1-B_{t/3}^2)+ x_2(B_{t/3}^1+B_{2t/3}^2)$ is a linear combination of the coordinates of the $4$-dimensional Gaussian vector $(B_{2t/3}^1,B_{t/3}^2,B_{t/3}^1,B_{2t/3}^2)$, so $Y^x$ is itself a Gaussian process, and so are its increments $Y_t^x-Y_s^x$ (apply the same idea to the vector $(B_{2t/3}^1-B_{2s/3}^1,B_{t/3}^2-B_{s/3}^2,B_{t/3}^1-B_{s/3}^1,B_{2t/3}^2-B_{2s/3}^2)$).

Now here are the calculations showing that the first two moments of the increments of $Y_t^x$ have the claimed values: $$\mathbb E[Y_t^x-Y_s^x]=\mathbb E\Big[\sum_{i=1}^2 x_i(X_t^i-X_s^i)\Big]=x_1\,\mathbb E[X_t^1-X_s^1]+x_2\,\mathbb E[X_t^2-X_s^2]$$ $$=x_1\,\mathbb E[B_{2t/3}^1-B_{t/3}^2-B_{2s/3}^1+B_{s/3}^2]+x_2\,\mathbb E[B_{t/3}^1+B_{2t/3}^2-B_{s/3}^1-B_{2s/3}^2]$$ $$=x_1\,\mathbb E[B_{2t/3}^1-B_{2s/3}^1]-x_1\,\mathbb E[B_{t/3}^2-B_{s/3}^2]+x_2\,\mathbb E[B_{t/3}^1-B_{s/3}^1]+x_2\,\mathbb E[B_{2t/3}^2-B_{2s/3}^2]$$ $$=x_1\cdot 0-x_1\cdot 0+x_2\cdot 0+x_2\cdot 0=0.$$

This holds because each $B^i$ ($i=1,2$) is a $BM^1$, whose increments have null expectation.

Second, for the variance we have: $$\operatorname{Var}[Y_t^x-Y_s^x]=\mathbb E[(Y_t^x-Y_s^x)^2]= \mathbb E\Big[\Big(\sum_{i=1}^2 x_i(X_t^i-X_s^i)\Big)^2\Big]$$ $$=\sum_{i=1}^2 x_i^2\,\mathbb E[(X_t^i-X_s^i)^2]+2\sum_{1\le i<j\le 2} x_ix_j\,\mathbb E[(X_t^i-X_s^i)(X_t^j-X_s^j)]$$

Now, as $d=2$, this reads: $$\operatorname{Var}[Y_t^x-Y_s^x]= x_1^2\,\mathbb E[(X_t^1-X_s^1)^2]+x_2^2\,\mathbb E[(X_t^2-X_s^2)^2]+2x_1x_2\,\mathbb E[(X_t^1-X_s^1)(X_t^2-X_s^2)]$$

Decomposing with $X_t^1=B_{2t/3}^1-B_{t/3}^2$ and $X_t^2=B_{t/3}^1+B_{2t/3}^2$, the first expectation (the factor of $x_1^2$) becomes:

$$\mathbb E[(X_t^1-X_s^1)^2]=\mathbb E[((B_{2t/3}^1-B_{t/3}^2)-(B_{2s/3}^1-B_{s/3}^2))^2]$$ $$=\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)^2]+\mathbb E[(B_{t/3}^2-B_{s/3}^2)^2]-2\,\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{t/3}^2-B_{s/3}^2)]$$ $$=\frac{2(t-s)}{3}+\frac{t-s}{3}-2\cdot 0=t-s.$$

Here the first term is the variance of $B_{2t/3}^1-B_{2s/3}^1$, the second is the variance of $B_{t/3}^2-B_{s/3}^2$, and the cross term vanishes by independence of $B^1$ and $B^2$.

The second expectation (the factor of $x_2^2$) becomes: $$\mathbb E[(X_t^2-X_s^2)^2]=\mathbb E[((B_{t/3}^1+B_{2t/3}^2)-(B_{s/3}^1+B_{2s/3}^2))^2]$$ $$=\mathbb E[(B_{t/3}^1-B_{s/3}^1)^2]+\mathbb E[(B_{2t/3}^2-B_{2s/3}^2)^2]+2\,\mathbb E[(B_{t/3}^1-B_{s/3}^1)(B_{2t/3}^2-B_{2s/3}^2)]$$ $$=\frac{t-s}{3}+\frac{2(t-s)}{3}+2\cdot 0=t-s.$$

By the same arguments as for the first term.

The cross term (the factor of $2x_1x_2$) becomes: $$\mathbb E[(X_t^1-X_s^1)(X_t^2-X_s^2)]= \mathbb E[((B_{2t/3}^1-B_{2s/3}^1)-(B_{t/3}^2-B_{s/3}^2))((B_{t/3}^1-B_{s/3}^1)+(B_{2t/3}^2-B_{2s/3}^2))]$$ $$= \underbrace{\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{2t/3}^2-B_{2s/3}^2)]}_{=0~by~indep.~of~B^1~and~B^2}+\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{t/3}^1-B_{s/3}^1)]$$ $$-\mathbb E[(B_{t/3}^2-B_{s/3}^2)(B_{2t/3}^2-B_{2s/3}^2)]-\underbrace{\mathbb E[(B_{t/3}^2-B_{s/3}^2)(B_{t/3}^1-B_{s/3}^1)]}_{=0~by~indep.~of~B^1~and~B^2}$$ $$=\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{t/3}^1-B_{s/3}^1)]-\mathbb E[(B_{2t/3}^2-B_{2s/3}^2)(B_{t/3}^2-B_{s/3}^2)]=0$$

since $B^1$ and $B^2$ have the same law, the two remaining expectations are equal and cancel.

So that finally: $\operatorname{Var}[Y_t^x-Y_s^x]=x_1^2(t-s)+x_2^2(t-s)+2x_1x_2\cdot 0=(t-s)\,\|x\|^2= (t-s)\cdot 1=t-s$

So the increments of $Y^x$ have the claimed mean and variance.

What is left to prove is the independence of the increments, for all $(s,t,u,v)\in(\mathbb R^+)^4$ with $s<t<u<v$: $$\operatorname{Cov}[Y_t^x-Y_s^x,\,Y_v^x-Y_u^x]=\mathbb E\Big[\Big(\sum_{i=1}^2 x_i(X_t^i-X_s^i)\Big)\Big(\sum_{j=1}^2 x_j(X_v^j-X_u^j)\Big)\Big]-\underbrace{\mathbb E\Big[\sum_{i=1}^2 x_i(X_t^i-X_s^i)\Big]\,\mathbb E\Big[\sum_{j=1}^2 x_j(X_v^j-X_u^j)\Big]}_{=0~by~the~calculations~above}$$ $$=x_1^2\,\underbrace{\mathbb E[(X_t^1-X_s^1)(X_v^1-X_u^1)]}_{=(1)}+x_1x_2\,\underbrace{\mathbb E[(X_t^1-X_s^1)(X_v^2-X_u^2)]}_{=(2)}+x_2x_1\,\underbrace{\mathbb E[(X_t^2-X_s^2)(X_v^1-X_u^1)]}_{=(3)}+x_2^2\,\underbrace{\mathbb E[(X_t^2-X_s^2)(X_v^2-X_u^2)]}_{=(4)}$$ Let us now expand the expressions (1) to (4) of $X$ in terms of $B$ at length:

$$(1)=\mathbb E[((B_{2t/3}^1-B_{t/3}^2)-(B_{2s/3}^1-B_{s/3}^2))((B_{2v/3}^1-B_{v/3}^2)-(B_{2u/3}^1-B_{u/3}^2))]$$ $$=\underbrace{\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{2v/3}^1-B_{2u/3}^1)]}_{=0~by~indep.~of~increments~of~B^1~over~non\text{-}overlapping~intervals}-\underbrace{\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{v/3}^2-B_{u/3}^2)]}_{=0~by~indep.~of~B^1~and~B^2}$$ $$+\underbrace{\mathbb E[(B_{t/3}^2-B_{s/3}^2)(B_{v/3}^2-B_{u/3}^2)]}_{=0~by~indep.~of~increments~of~B^2~over~non\text{-}overlapping~intervals}-\underbrace{\mathbb E[(B_{t/3}^2-B_{s/3}^2)(B_{2v/3}^1-B_{2u/3}^1)]}_{=0~by~indep.~of~B^1~and~B^2}$$ $$(1)=0$$

Now (2): $$(2)=\mathbb E[((B_{2t/3}^1-B_{t/3}^2)-(B_{2s/3}^1-B_{s/3}^2))((B_{v/3}^1+B_{2v/3}^2)-(B_{u/3}^1+B_{2u/3}^2))]$$ $$=\mathbb E[((B_{2t/3}^1-B_{2s/3}^1)-(B_{t/3}^2-B_{s/3}^2))((B_{2v/3}^2-B_{2u/3}^2)+(B_{v/3}^1-B_{u/3}^1))]$$ $$=\underbrace{\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{v/3}^1-B_{u/3}^1)]}_{=\frac{2t}{3}\wedge \frac{v}{3} - \frac{2t}{3}\wedge \frac{u}{3}- \frac{2s}{3}\wedge \frac{v}{3}+ \frac{2s}{3}\wedge \frac{u}{3}}$$ (here the relation $s<t<u<v$ does not always entail that $[2s/3,2t/3]$ and $[u/3,v/3]$ are non-overlapping intervals, so I use the fact that for a $BM^1$, $\mathbb E[B_aB_b]=a\wedge b$) $$+\underbrace{\mathbb E[(B_{2t/3}^1-B_{2s/3}^1)(B_{2v/3}^2-B_{2u/3}^2)]}_{=0~by~indep.~of~B^1~and~B^2}$$ $$-\underbrace{\mathbb E[(B_{t/3}^2-B_{s/3}^2)(B_{2v/3}^2-B_{2u/3}^2)]}_{=0~by~indep.~of~increments~of~B^2~over~non\text{-}overlapping~intervals}$$ $$-\underbrace{\mathbb E[(B_{t/3}^2-B_{s/3}^2)(B_{v/3}^1-B_{u/3}^1)]}_{=0~by~indep.~of~B^1~and~B^2}$$ So $(2)=\frac{2t}{3}\wedge \frac{v}{3} - \frac{2t}{3}\wedge \frac{u}{3}- \frac{2s}{3}\wedge \frac{v}{3}+ \frac{2s}{3}\wedge \frac{u}{3}$.

Then (3): $$(3)=\mathbb E[((B_{t/3}^1+B_{2t/3}^2)-(B_{s/3}^1+B_{2s/3}^2))((B_{2v/3}^1-B_{v/3}^2)-(B_{2u/3}^1-B_{u/3}^2))]$$ $$=\mathbb E[((B_{2t/3}^2-B_{2s/3}^2)+(B_{t/3}^1-B_{s/3}^1))((B_{2v/3}^1-B_{2u/3}^1)-(B_{v/3}^2-B_{u/3}^2))]$$ $$=\underbrace{\mathbb E[(B_{2t/3}^2-B_{2s/3}^2)(B_{2v/3}^1-B_{2u/3}^1)]}_{=0~by~indep.~of~B^1~and~B^2}$$ $$-\underbrace{\mathbb E[(B_{2t/3}^2-B_{2s/3}^2)(B_{v/3}^2-B_{u/3}^2)]}_{=\frac{2t}{3}\wedge \frac{v}{3}-\frac{2t}{3}\wedge \frac{u}{3}- \frac{2s}{3}\wedge \frac{v}{3}+ \frac{2s}{3}\wedge \frac{u}{3}}$$ (again the relation $s<t<u<v$ does not always entail that $[2s/3,2t/3]$ and $[u/3,v/3]$ are non-overlapping intervals, so I use the fact that for a $BM^1$, $\mathbb E[B_aB_b]=a\wedge b$) $$+\underbrace{\mathbb E[(B_{t/3}^1-B_{s/3}^1)(B_{2v/3}^1-B_{2u/3}^1)]}_{=0~by~indep.~of~increments~of~B^1~over~non\text{-}overlapping~intervals}$$ $$-\underbrace{\mathbb E[(B_{t/3}^1-B_{s/3}^1)(B_{v/3}^2-B_{u/3}^2)]}_{=0~by~indep.~of~B^1~and~B^2}$$ So $(3)=-\left(\frac{2t}{3}\wedge \frac{v}{3}-\frac{2t}{3}\wedge \frac{u}{3}- \frac{2s}{3}\wedge \frac{v}{3}+ \frac{2s}{3}\wedge \frac{u}{3}\right)=-(2)$.

And now (4): $$(4)=\mathbb E[((B_{t/3}^1+B_{2t/3}^2)-(B_{s/3}^1+B_{2s/3}^2))((B_{v/3}^1+B_{2v/3}^2)-(B_{u/3}^1+B_{2u/3}^2))]$$ $$=\mathbb E[((B_{2t/3}^2 - B_{2s/3}^2) + (B_{t/3}^1-B_{s/3}^1))((B_{2v/3}^2-B_{2u/3}^2) + (B_{v/3}^1-B_{u/3}^1))]$$ $$=\underbrace{\mathbb E[(B_{2t/3}^2 - B_{2s/3}^2)(B_{2v/3}^2-B_{2u/3}^2)]}_{=0~by~indep.~of~increments~of~B^2~over~non\text{-}overlapping~intervals}$$ $$+\underbrace{\mathbb E[(B_{2t/3}^2 - B_{2s/3}^2)(B_{v/3}^1-B_{u/3}^1)]}_{=0~by~indep.~of~B^1~and~B^2}$$ $$+\underbrace{\mathbb E[(B_{t/3}^1-B_{s/3}^1)(B_{2v/3}^2-B_{2u/3}^2)]}_{=0~by~indep.~of~B^1~and~B^2}$$ $$+\underbrace{\mathbb E[(B_{t/3}^1-B_{s/3}^1)(B_{v/3}^1-B_{u/3}^1)]}_{=0~by~indep.~of~increments~of~B^1~over~non\text{-}overlapping~intervals}$$ So $(4)=0$.

So $$x_1^2\,(1)+x_1x_2\,(2)+x_2x_1\,(3)+x_2^2\,(4)= x_1x_2\left(\frac{2t}{3}\wedge \frac{v}{3} - \frac{2t}{3}\wedge \frac{u}{3}- \frac{2s}{3}\wedge \frac{v}{3}+ \frac{2s}{3}\wedge \frac{u}{3}\right) - x_1x_2\left(\frac{2t}{3}\wedge \frac{v}{3}-\frac{2t}{3}\wedge \frac{u}{3}- \frac{2s}{3}\wedge \frac{v}{3}+ \frac{2s}{3}\wedge \frac{u}{3}\right)=0$$ so the two increments are uncorrelated, hence independent as they are jointly Gaussian.
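All of the bookkeeping in (1) to (4) can be cross-checked mechanically, since every expectation involved reduces by bilinearity to $\mathbb E[B_a^iB_b^j]=(a\wedge b)\,\mathbf 1_{i=j}$. Here is a self-contained sketch doing so (all helper names are ours, not from the book):

```python
# Exact second moments of the counterexample, using only
# E[B^i_a B^j_b] = min(a, b) if i == j, else 0 (independent coordinates).

def cov_B(ch1, a, ch2, b):
    # Covariance of the driving planar BM: E[B^ch1_a B^ch2_b]
    return min(a, b) if ch1 == ch2 else 0.0

def X_terms(i, t):
    # X_t^1 = B^1_{2t/3} - B^2_{t/3},  X_t^2 = B^1_{t/3} + B^2_{2t/3},
    # each encoded as a list of (coefficient, coordinate of B, time)
    if i == 1:
        return [(1.0, 1, 2 * t / 3), (-1.0, 2, t / 3)]
    return [(1.0, 1, t / 3), (1.0, 2, 2 * t / 3)]

def cov_X(i, t, j, s):
    # Cov[X_t^i, X_s^j] by bilinearity of the covariance
    return sum(c1 * c2 * cov_B(ch1, a, ch2, b)
               for c1, ch1, a in X_terms(i, t)
               for c2, ch2, b in X_terms(j, s))

def cov_Y_increments(x, s, t, u, v):
    # Cov[Y_t - Y_s, Y_v - Y_u] with Y_t = x1*X_t^1 + x2*X_t^2
    return sum(x[i - 1] * x[j - 1]
               * (cov_X(i, t, j, v) - cov_X(i, t, j, u)
                  - cov_X(i, s, j, v) + cov_X(i, s, j, u))
               for i in (1, 2) for j in (1, 2))

x = (0.6, 0.8)                                   # a unit vector of R^2
print(cov_Y_increments(x, 1.0, 2.0, 3.0, 4.0))   # ≈ 0 : disjoint increments
print(cov_Y_increments(x, 1.0, 2.0, 1.0, 2.0))   # ≈ 1.0 = t - s : the variance
```

The second call feeds the same interval twice, so it returns the variance of the increment, recovering $t-s$.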

And oh man, we are done with these tedious calculations!

What is left to show is that $X_t$ is not a $BM^2$.

To see this, let us calculate the covariance between the two coordinates of $X$ at two different times, to check whether there is a linear interaction between the coordinates. We have:

$$\operatorname{Cov}[X_t^1,X_s^2]=\mathbb E[X_t^1X_s^2]-\mathbb E[X_t^1]\,\mathbb E[X_s^2]=\mathbb E[X_t^1X_s^2]$$ (as $\mathbb E[X_t^i]=0$) $$=\mathbb E[(B_{2t/3}^1-B_{t/3}^2)(B_{s/3}^1+B_{2s/3}^2)]$$ (by definition of $X$) $$= \mathbb E[B_{2t/3}^1B_{s/3}^1]+\mathbb E[B_{2t/3}^1B_{2s/3}^2]-\mathbb E[B_{t/3}^2B_{s/3}^1]-\mathbb E[B_{t/3}^2B_{2s/3}^2]$$ (by simply expanding the product) $$= \mathbb E[B_{2t/3}^1B_{s/3}^1]-\mathbb E[B_{t/3}^2B_{2s/3}^2]=\frac{s}{3}-\frac{t}{3}\wedge\frac{2s}{3}$$ because $\mathbb E[B_u^iB_v^i]=u\wedge v$ for a one-dimensional $BM$, while the independence of $B^1$ and $B^2$ gives $\mathbb E[B_{2t/3}^1B_{2s/3}^2]=0$ and $\mathbb E[B_{t/3}^2B_{s/3}^1]=0$. For $0<s<t\le 2s$ this equals $-(t-s)/3$, and for $t>2s$ it equals $-s/3$.

So we see that $\operatorname{Cov}[X_t^1,X_s^2]$ is nonzero whenever $0<s<t$, so the coordinates of $X$ have zero chance to be independent (even though $\operatorname{Cov}[X_t^1,X_t^2]=0$!).
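This last covariance can be checked directly from $\mathbb E[B_aB_b]=a\wedge b$ and the independence of $B^1$ and $B^2$; a minimal sketch (the function name is ours):

```python
# Cov[X_t^1, X_s^2] = E[(B^1_{2t/3} - B^2_{t/3})(B^1_{s/3} + B^2_{2s/3})];
# by independence of B^1 and B^2 only the two same-coordinate products survive.
def cov_X1_X2(t, s):
    return min(2 * t / 3, s / 3) - min(t / 3, 2 * s / 3)

print(cov_X1_X2(2.0, 1.5))   # -(t - s)/3 ≈ -0.1667 since s < t <= 2s
print(cov_X1_X2(2.0, 2.0))   # 0.0 : same-time coordinates are uncorrelated
```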

So we have proven that even though every projection $\langle x,X\rangle$ of $X$ on a unit vector behaves like the corresponding projection of a genuine $BM^2$, the coordinates of $X$ are not independent, so $X$ is not a $BM^2$, and the property of question 1 is not enough to characterize a $BM^d$.