Martingale, decomposition, expected value


I am trying to understand a proof in my book but I can't figure out a few things. Definitions first: $(\Delta X)_n := X_n - X_{n-1}$, with the convention $X_{\alpha-1} := X_\alpha$ if $\alpha > -\infty$, so that $(\Delta X)_\alpha = 0$.

A process $H=(H_n)_{n\in T}$ is called $\mathbb{F}$-predictable if $H_n$ is $\mathcal{F}_{n-1}$-measurable for all $n\in T$. Let $\alpha>-\infty$, let $H$ be an $\mathbb{F}$-predictable real process and $X$ an $\mathbb{F}$-adapted real process; then $$(H\circ X)_n:= \sum^n_{j=\alpha+1}H_j\Delta X_j,\qquad n\in T,$$ is called the $H$-transform of $X$.

Let $\alpha>-\infty$. The process $$[X,Y]_n:=\sum^n_{i=\alpha +1}\Delta X_i\Delta Y_i$$ is called the covariation of the $\mathbb{F}$-adapted real processes $X$ and $Y$, and $[X]:=[X,X]$ is called the square variation of $X$.
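These definitions are easy to compute pathwise. A minimal sketch in Python (an illustration only, not from the book), taking $\alpha=0$, a simple $\pm 1$ random walk for $X=M$, and a hypothetical predictable choice $H_j=\operatorname{sign}(M_{j-1})$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pathwise illustration of the H-transform and the square variation,
# with alpha = 0 and the convention M_{-1} := M_0 (so ΔM_0 = 0).
# M is a simple ±1 random walk; this choice is for illustration only.
n = 10
steps = rng.choice([-1.0, 1.0], size=n)
M = np.concatenate(([0.0], np.cumsum(steps)))   # M_0, M_1, ..., M_n

dM = np.diff(M, prepend=M[0])                   # ΔM_0 = 0, ΔM_j = M_j - M_{j-1}

# A predictable H: H_j depends only on the path up to time j-1
# (hypothetical choice: H_j = sign(M_{j-1}), with sign(0) read as 1).
H = np.sign(np.concatenate(([0.0], M[:-1])))
H[H == 0] = 1.0

H_transform = np.cumsum(H * dM)   # (H o M)_n = sum_{j=1}^n H_j ΔM_j
sq_var = np.cumsum(dM ** 2)       # [M]_n   = sum_{j=1}^n (ΔM_j)^2

print(H_transform[-1], sq_var[-1])
```

For a $\pm 1$ walk every increment after time $0$ has $(\Delta M_j)^2=1$, so the square variation at time $n$ is simply $n$.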

And here are the equations I don't understand.

"We can conclude for the martingale $H_\alpha M_\alpha +H\circ M$ \begin{align*} E(H_\alpha M_\alpha +(H\circ M)_n)^2 &=E(H_\alpha M_\alpha)^2+E[H\circ M]_n\\ &=E(H_\alpha M_\alpha)^2+E(H^2\circ [M])_n\\ &\le EM^2_\alpha+E[M]_n =EM^2_n. \end{align*}" I don't understand the first equality: how does he get there from the left-hand side? And I am not quite sure about the last equality. Does it hold because $M$ is a martingale? Does that mean that $E[M]_n=0$? And is $EM^2_\alpha = EM^2_n$ because $M$ is a martingale, i.e. because the payoff I get from a fair game of chance stays the same on average?


The notation is, for my taste, unfriendly and against common sense. For instance, $E[H\circ M]_n$ is hard to digest: without reading all the conventions one would hardly guess that it is a "second order expression" in $H,M$, namely $$ E[H\circ M]_n = E\Big[\ [H\circ M,H\circ M]_n\ \Big]\ . $$ To have a clear situation, I will use $\alpha=0$. I suppose $M$ is a martingale. Then we have $$ \begin{aligned} \Delta(H\circ M)_n &= \sum_{n-1<j\le n}H_j\Delta M_j = H_n(M_n-M_{n-1})\ , \\ [H\circ M,H\circ M]_n &= \sum_{0<j\le n} \Delta(H\circ M)_j \cdot \Delta(H\circ M)_j = \sum_{0<j\le n} H_j^2(M_j-M_{j-1})^2 \ , \\ E[H\circ M]_n &= E\left[\ \sum_{0<j\le n} H_j^2(M_j-M_{j-1})^2 \ \right]\ . \end{aligned} $$ Now let us compute $$ \begin{aligned} E\left[\ \left(H_0M_0 + (H\circ M)_n\right)^2\ \right] &= E[\ (H_0M_0)^2\ ] +2\underbrace{E\left[\ H_0M_0 (H\circ M)_n\ \right]}_{=0} + E\left[\ \left( (H\circ M)_n\right)^2\ \right] \\ &= E[\ (H_0M_0)^2\ ] + E\left[\ \sum_{0<j\le n}H_j^2(M_j-M_{j-1})^2\ \right] + \underbrace{ 2E\left[\ \sum_{0<j<k\le n}H_j(M_j-M_{j-1})\cdot H_k(M_k-M_{k-1})\ \right] }_{=0} \ . \end{aligned} $$ The reason why these expectations vanish is as follows; one example should be enough to capture the idea. Assume $j<k$ and let us compute the expectation of $$ H_j(M_j-M_{j-1})\cdot H_k(M_k-M_{k-1})\ . $$ This factors through the computation of the conditional expectation $$ \begin{aligned} &E\Big[\ H_j(M_j-M_{j-1})\cdot H_k(M_k-M_{k-1})\ \Big| \ \mathcal{F}_{k-1}\ \Big] \\ = \ &H_j(M_j-M_{j-1})\cdot H_k\cdot \underbrace{ E\Big[\ M_k-M_{k-1}\ \Big| \ \mathcal{F}_{k-1}\ \Big] }_{=0} \\ = \ &0 \ . \end{aligned} $$ (We use the fact that $M$ is a martingale. The factor $H_k$ can be pulled out of the conditional expectation because of predictability, and $H_j(M_j-M_{j-1})$ because it is $\mathcal{F}_{k-1}$-measurable.)

This settles the first equality.
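The isometry $E\left[\left((H\circ M)_n\right)^2\right]=E(H^2\circ[M])_n$ established above can also be checked numerically. A minimal Monte Carlo sketch, assuming (purely for illustration) that $M$ is a simple symmetric random walk with $M_0=0$ and that $H_j=1/(1+|M_{j-1}|)$, a hypothetical bounded predictable choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo check of E[((H o M)_n)^2] = E[ sum_j H_j^2 (ΔM_j)^2 ],
# with alpha = 0 and M a simple symmetric random walk (M_0 = 0).
n_steps, n_paths = 20, 100_000
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
M = np.cumsum(steps, axis=1)                     # M_1, ..., M_n per path
dM = np.diff(np.concatenate([np.zeros((n_paths, 1)), M], axis=1), axis=1)

# Predictable H_j = 1/(1 + |M_{j-1}|): a function of the path up to
# time j-1 only, hence F_{j-1}-measurable (hypothetical choice).
M_prev = np.concatenate([np.zeros((n_paths, 1)), M[:, :-1]], axis=1)
H = 1.0 / (1.0 + np.abs(M_prev))

lhs = np.mean(np.sum(H * dM, axis=1) ** 2)       # E[((H o M)_n)^2]
rhs = np.mean(np.sum(H ** 2 * dM ** 2, axis=1))  # E[(H^2 o [M])_n]
print(lhs, rhs)
```

The two averages agree up to simulation error, since pathwise they differ only by the cross terms whose expectation vanishes as computed above.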


For the last step we of course need more information on $H$: the inequality fails if, for instance, $H\equiv 2018$ is a constant process. Presumably the book assumes $|H_j|\le 1$ for all $j$, which gives $H_j^2(\Delta M_j)^2\le(\Delta M_j)^2$ termwise.
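As for the equality $EM_\alpha^2+E[M]_n=EM_n^2$ asked about above: it follows (again with $\alpha=0$) from the same orthogonality of martingale increments, i.e. it corresponds to the special case $H\equiv 1$. A sketch: $$ \begin{aligned} EM_n^2 &= E\Big(M_0+\sum_{0<j\le n}\Delta M_j\Big)^2 \\ &= EM_0^2+\sum_{0<j\le n}E(\Delta M_j)^2 +2\sum_{0<k\le n}\underbrace{E\big[M_0\,\Delta M_k\big]}_{=0} +2\sum_{0<j<k\le n}\underbrace{E\big[\Delta M_j\,\Delta M_k\big]}_{=0} \\ &= EM_0^2+E[M]_n\ , \end{aligned} $$ where each underbraced expectation vanishes by conditioning on $\mathcal{F}_{k-1}$ exactly as before. In particular $E[M]_n$ is not $0$ in general, and $EM_\alpha^2=EM_n^2$ does not hold in general either; only the expectations $EM_\alpha=EM_n$ agree for a martingale.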