If $H$ is predictable, show $ M_t = \sum_{s=1}^tH_s(X_s-E(X_s)) $ is a martingale


$$ M_t = M_0 + \sum_{s=1}^tH_s(X_s-E(X_s)) $$ where $M_0=0$, $H$ is a square-integrable predictable process, and $(X_t)$ is a sequence of square-integrable iid random variables.

a) Show $M$ is a martingale.

b) Show $E(M_t^2) = \sum_{s=1}^\infty H_s^2(E(X_s^2)-E(X_s)^2) $.

Since $H$ and the $X_s$ are square integrable, we have

$$\sum_{s=1}^t E(H_s^2)(E(X_s^2)-E(X_s)^2) <\infty$$

The standard procedure for verifying a martingale is to show $E(M_t|\mathcal{F}_s)=M_s$, but here that seems a little difficult, and I think for square-integrable processes we may need to show something else. I found online that they have the following property:

$$ \mathbb{E}((X_u - X_t)X_s)=0 \quad \text{and} \quad \mathbb{E}((X_t-X_s)^2|\mathcal{F}_s) = \mathbb{E}(X_t^2|\mathcal{F}_s)-X_s^2$$

for $s\le t \le u$. Is this what I need to prove?

I suppose the predictability of $H$, i.e. that $H_t$ is $\mathcal{F}_{t-1}$-measurable, will help with this question. Also, as a side note, why is $M_0$ included in the definition if $M_0=0$?

b) I guess I need to use that $H$ is square integrable, so that $\sum_{s=1}^t E(H_s^2)(E(X_s^2)-E(X_s)^2)$ is finite, and then the only difference is that the problem suggests $E(H_s^2) = H_s^2$ because of predictability. Is this right?

On BEST ANSWER

For a), we only need the square integrability of $H$ and $X$ to make sure that $M$ is integrable: by Cauchy–Schwarz, $\mathbb{E}|H_s(X_s - \mathbb{E}[X_s])| \le \|H_s\|_2\,\|X_s - \mathbb{E}[X_s]\|_2 < \infty$. To check the martingale property for discrete-time processes, it's enough to show $\mathbb{E}[M_{t+1}|\mathcal{F}_{t}] = M_t$. We compute

\begin{align*} \mathbb{E}[M_{t+1}|\mathcal{F}_{t}] &= \mathbb{E}[M_{t} + H_{t+1}(X_{t+1} - \mathbb{E}[X_{t+1}])|\mathcal{F}_{t}] \\ &= M_t + \mathbb{E}[H_{t+1}(X_{t+1} - \mathbb{E}[X_{t+1}])|\mathcal{F}_{t}] \\ &= M_t + H_{t+1} \mathbb{E}[(X_{t+1} - \mathbb{E}[X_{t+1}])|\mathcal{F}_{t}] \\ &= M_t + H_{t+1}(\mathbb{E}[X_{t+1}|\mathcal{F}_t] - \mathbb{E}[X_{t+1}]) \\ &= M_t \end{align*}

where we used that $H_{t+1}$ is $\mathcal{F}_t$-measurable (predictability) and that $X_{t+1}$ is independent of $\mathcal{F}_t$, so $\mathbb{E}[X_{t+1}|\mathcal{F}_t] = \mathbb{E}[X_{t+1}]$. Hence $M$ is a martingale.
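As a numerical sanity check (not a proof), one can simulate such a process and verify that $\mathbb{E}[M_t] = \mathbb{E}[M_0] = 0$ for every $t$. The sketch below is a minimal illustration with an assumed setup: standard normal iid $X_s$ (so $\mathbb{E}[X_s]=0$) and the hypothetical predictable choice $H_1 = 1$, $H_s = X_{s-1}$ for $s \ge 2$, which is genuinely random yet $\mathcal{F}_{s-1}$-measurable.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, T = 200_000, 10

# iid square-integrable increments X_1, ..., X_T (assumed standard normal, so E[X_s] = 0)
X = rng.standard_normal((n_paths, T))

# Predictable integrand: H_1 = 1, H_s = X_{s-1} depends only on the past
H = np.ones((n_paths, T))
H[:, 1:] = X[:, :-1]

# M_t = sum_{s=1}^t H_s (X_s - E[X_s]); here E[X_s] = 0
M = np.cumsum(H * X, axis=1)

# Martingale => E[M_t] = 0 for every t; sample means should be near zero
print(np.abs(M.mean(axis=0)).max())
```

With $2 \times 10^5$ paths the sample means stay within Monte Carlo error of zero, consistent with the martingale property shown above.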

You're right that it doesn't make much sense to include $M_0$ in the definition only to set it to $0$ immediately afterwards; presumably it is just there to emphasize the initial value, and it makes no real difference either way.

For b), I'm fairly sure there are typos in the question. Even though $H$ is predictable, there is no reason that $\mathbb{E}[H_s^2] = H_s^2$: predictable processes can still be random, so this would only hold if $H$ were deterministic. Also, the sum should run to $t$, not $\infty$. You can show $\mathbb{E}[M_t^2] = \sum_{s=1}^t\mathbb{E}[H_s^2](\mathbb{E}[X_s^2]-\mathbb{E}[X_s]^2)$, but what's stated in the problem doesn't work, because $\mathbb{E}[M_t^2]$ is non-random while the stated right-hand side is random.

To show $\mathbb{E}[M_t^2] = \sum_{s=1}^t\mathbb{E}[H_s^2](\mathbb{E}[X_s^2]-\mathbb{E}[X_s]^2)$, we compute

\begin{align*} \mathbb{E}[M_t^2] &= \mathbb{E}\left[\left(\sum_{s=1}^t H_s(X_s-\mathbb{E}[X_s])\right)^2\right] \\ &= \mathbb{E}\left[\sum_{s=1}^t (H_s(X_s-\mathbb{E}[X_s]))^2 + 2\sum_{s=1}^t\sum_{j=1}^{s-1}H_s(X_s-\mathbb{E}[X_s])H_j(X_j-\mathbb{E}[X_j])\right] \\ &= \sum_{s=1}^t \mathbb{E}[(H_s(X_s-\mathbb{E}[X_s]))^2] + 2\sum_{s=1}^t\sum_{j=1}^{s-1}\mathbb{E}[H_s(X_s-\mathbb{E}[X_s])H_j(X_j-\mathbb{E}[X_j])]. \end{align*}

For the terms in the second sum, we can use that $H_j(X_j-\mathbb{E}[X_j])$ is $\mathcal{F}_{s-1}$-measurable (since $j \le s-1$), that $H_s$ is $\mathcal{F}_{s-1}$-measurable, and that $X_s$ is independent of $\mathcal{F}_{s-1}$, to compute \begin{align*}\mathbb{E}[H_s(X_s-\mathbb{E}[X_s])H_j(X_j-\mathbb{E}[X_j])] &= \mathbb{E}\bigg[\mathbb{E}[H_s(X_s-\mathbb{E}[X_s])H_j(X_j-\mathbb{E}[X_j])|\mathcal{F}_{s-1}]\bigg]\\ &= \mathbb{E}\bigg[H_sH_j(X_j-\mathbb{E}[X_j])\mathbb{E}[(X_s-\mathbb{E}[X_s])|\mathcal{F}_{s-1}]\bigg] \\ &= \mathbb{E}\bigg[H_sH_j(X_j-\mathbb{E}[X_j])(\mathbb{E}[X_s]-\mathbb{E}[X_s])\bigg] \\ &= 0. \end{align*}

For the first sum, we again use that $H_s$ is $\mathcal{F}_{s-1}$-measurable, together with the fact that $X_s$ is independent of $\mathcal{F}_{s-1}$ (the $X_s$ are iid), so its conditional variance given $\mathcal{F}_{s-1}$ is just its variance:

$$\mathbb{E}[H_s^2(X_s-\mathbb{E}[X_s])^2] = \mathbb{E}[\mathbb{E}[H_s^2(X_s-\mathbb{E}[X_s])^2|\mathcal{F}_{s-1}]] = \mathbb{E}[H_s^2\,\mathbb{E}[(X_s-\mathbb{E}[X_s])^2|\mathcal{F}_{s-1}]] = \mathbb{E}[H_s^2(\mathbb{E}[X_s^2]-\mathbb{E}[X_s]^2)] = \mathbb{E}[H_s^2](\mathbb{E}[X_s^2]-\mathbb{E}[X_s]^2).$$

Putting everything back together, we have

\begin{align*} \mathbb{E}[M_t^2] &= \sum_{s=1}^t \mathbb{E}[(H_s(X_s-\mathbb{E}[X_s]))^2] + 2\sum_{s=1}^t\sum_{j=1}^{s-1}\mathbb{E}[H_s(X_s-\mathbb{E}[X_s])H_j(X_j-\mathbb{E}[X_j])] \\ &=\sum_{s=1}^t \mathbb{E}[H_s^2](\mathbb{E}[X_s^2]-\mathbb{E}[X_s]^2). \end{align*}
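This identity can also be checked numerically. The sketch below uses an assumed setup, not anything from the problem itself: standard normal iid $X_s$ (so $\operatorname{Var}(X_s)=1$) and the hypothetical predictable choice $H_1 = 1$, $H_s = X_{s-1}$ for $s \ge 2$ (so $\mathbb{E}[H_s^2] = 1$ for all $s$). The formula then predicts $\mathbb{E}[M_t^2] = t$.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, T = 200_000, 10

# iid increments, E[X_s] = 0, Var(X_s) = 1 (assumed standard normal)
X = rng.standard_normal((n_paths, T))

# Predictable integrand: H_1 = 1, H_s = X_{s-1}, so E[H_s^2] = 1 for every s
H = np.ones((n_paths, T))
H[:, 1:] = X[:, :-1]

M = np.cumsum(H * X, axis=1)

# Theory: E[M_t^2] = sum_{s<=t} E[H_s^2] Var(X_s) = t in this setup
empirical = (M ** 2).mean(axis=0)
theoretical = np.arange(1, T + 1, dtype=float)
print(np.abs(empirical - theoretical).max())
```

The empirical second moments track $t$ to within Monte Carlo error, matching the derivation: the cross terms average out to zero, and each diagonal term contributes $\mathbb{E}[H_s^2]\operatorname{Var}(X_s)$.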