Continuous-time stochastic process: proving the martingale property.


Assume $\left\lbrace X(t) \right\rbrace _{t \geq 0}$ is a continuous-time martingale.

Under what additional constraints is $\left\lbrace X(t)^2 \right\rbrace _{t \geq 0}$ a martingale?

Attempt of solution:
$$ \begin{align}E[X(t)^2 | F_s] &= E[(X(t) −X(s))^2 | F_s] + 2 E[(X(t)−X(s)) X(s)| F_s] + E[X(s)^2|F_s] \\ &= E[(X(t)−X(s))^2|F_s] + 2 E[X(t)−X(s)|F_s] X(s) + X(s)^2 \end{align}$$

because $X(s)$ and $X(s)^2$ are $F_s$-measurable for $0 \leq s \leq t$. How do I proceed? I should probably use some version of the tower property on the $E[(X(t)-X(s))^2|F_s]$ term, but I can't figure out how it works. Could someone explain?

On BEST ANSWER

You know that the pair $(X,\mathcal{F})$ is a martingale, where $\mathcal{F}$ denotes a filtration. You want to investigate when the pair $(X^2,\mathcal{F})$ will be a martingale also.

  • In order for $(X^2,\mathcal{F})$ to be a martingale, the process $X$ must have constant variance: taking expectations in the martingale identity shows that $\mathbf{E}(X_t^2)$ must be constant in $t$. This does not follow from $X$ having constant mean.
  • Process $X^2$ is adapted to the filtration $\mathcal{F}\ ;$ this follows from process $X$ being adapted to $\mathcal{F}\ .$

Pick any two times $s$ and $t$ such that $0<s<t$ and consider the conditional expectation $\mathbf{E}(X^2_{t}\vert \mathcal{F}_{s})\ .$ This can be expressed as follows.

$$\mathbf{E}(\{(X_{t}-X_{s})+X_{s}\}^2\vert \mathcal{F}_{s}) = \mathbf{E}((X_{t}-X_{s})^2+2X_{s}(X_{t}-X_{s})+X_{s}^2\vert \mathcal{F}_{s})\ .$$

  • If increments $X_{t}-X_{s}$ are independent of the sigma-algebra $\mathcal{F}_{s}$ then the constancy of the mean of $X$ implies that $\mathbf{E}(X^2_{t}\vert \mathcal{F}_{s}) = \mathbf{E}((X_{t}-X_{s})^2) + X_{s}^2 \ .$
  • If increments are stationary (time invariant), and $X_{0} = 0\ ,$ then $\mathbf{E}((X_{t}-X_{s})^2) = \mathbf{E}(X_{t-s}^2)$.

In order for $X^2$ to be a martingale under these assumptions you must require that $\mathbf{E}(X_{t-s}^2) = 0$ for all $0<s<t\ ,$ implying that the process $X$ is constant (equal to $0$).
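To make the two bullet points concrete, consider standard Brownian motion (a standard example, not taken from the thread), which has independent, stationary increments, $B_0 = 0$, and constant mean:

```latex
% For standard Brownian motion B and 0 < s < t, the computation
% above gives
\mathbf{E}(B_t^2 \mid \mathcal{F}_s)
  = \mathbf{E}\big((B_t - B_s)^2\big) + B_s^2
  = (t - s) + B_s^2 \; > \; B_s^2,
% so B^2 is a strict submartingale, while the compensated process
% satisfies
\mathbf{E}(B_t^2 - t \mid \mathcal{F}_s) = B_s^2 - s,
% i.e. B_t^2 - t is a martingale.
```

This illustrates the conclusion above: with independent stationary increments, $X^2$ itself is a martingale only in the degenerate case $\mathbf{E}(X_{t-s}^2)=0$.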

If you are unwilling to assume time invariance, then you can still assert $\mathbf{E}((X_{t}-X_{s})^2) \geq 0$, implying that $$\mathbf{E}(X^2_{t}\vert \mathcal{F}_{s}) \geq X_{s}^2\ ,$$ i.e. the pair $(X^2,\mathcal{F})$ is a submartingale.
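The submartingale inequality can be checked by simulation. The sketch below (illustrative, not part of the original answer; it assumes standard Brownian motion as the martingale $X$) estimates $\mathbf{E}(B_t^2)$ across time and shows it increasing, while the compensated process $B_t^2 - t$ keeps constant mean $0$:

```python
# Monte Carlo sketch: for Brownian motion B, E[B_t^2] = t grows
# with t (so B^2 is a submartingale, not a martingale), while the
# compensated process B_t^2 - t has constant mean 0.
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 100_000, 100, 1.0
dt = T / n_steps

# Brownian increments and cumulative paths, shape (n_paths, n_steps)
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)
t = dt * np.arange(1, n_steps + 1)

mean_B2 = (B ** 2).mean(axis=0)        # estimate of E[B_t^2], roughly t
mean_comp = (B ** 2 - t).mean(axis=0)  # estimate of E[B_t^2 - t], roughly 0

print(mean_B2[0], mean_B2[-1], mean_comp[-1])
```

The increasing profile of `mean_B2` is exactly the submartingale behavior $\mathbf{E}(X^2_{t}\vert \mathcal{F}_{s}) \geq X_{s}^2$ seen in expectation.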

December 19, 2015. The following is a response to JKnecht.

Since conditional expectation is linear you have $$\mathbf{E}((X_{t}-X_{s})^2+X_{s}^2\vert\mathcal{F}_s) = \mathbf{E}((X_{t}-X_{s})^2\vert\mathcal{F}_s)+\mathbf{E}(X_{s}^2\vert\mathcal{F}_s)\ .$$ Since $X_{s}$ is measurable with respect to $\mathcal{F}_{s}$ you know that $$\mathbf{E}(X_{s}^2\vert\mathcal{F}_s) = X_{s}^2 \ .$$ If increments are independent then $(X_{t}-X_{s})^2$ is independent of $\mathcal{F}_s$ (since $s<t$), which implies $$\mathbf{E}((X_{t}-X_{s})^2\vert\mathcal{F}_s) = \mathbf{E}((X_{t}-X_{s})^2)\ .$$ If increments have a stationary distribution then $$\mathbf{E}((X_{t}-X_{s})^2) = \mathbf{E}((X_{t-s}-X_{0})^2)\ . $$
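The independence step above can also be checked numerically. The sketch below (an illustration using Brownian motion, which is assumed here and not taken from the thread) verifies that conditioning $(B_t - B_s)^2$ on an $\mathcal{F}_s$-measurable event such as $\{B_s > 0\}$ does not change its mean, and that the unconditional mean equals $t - s = \mathbf{E}(B_{t-s}^2)$:

```python
# Numeric check of the two facts used in the reply: for Brownian
# motion, the increment B_t - B_s is independent of F_s, so its
# squared mean is unchanged by conditioning on {B_s > 0}; by
# stationarity that mean is E[(B_{t-s} - B_0)^2] = t - s.
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
s, t = 0.4, 1.0

B_s = rng.normal(0.0, np.sqrt(s), size=n)       # B_s ~ N(0, s)
inc = rng.normal(0.0, np.sqrt(t - s), size=n)   # B_t - B_s, drawn independently of B_s

sq = inc ** 2
uncond = sq.mean()             # estimate of E[(B_t - B_s)^2], roughly t - s = 0.6
cond_pos = sq[B_s > 0].mean()  # conditioning on an F_s event leaves the mean unchanged

print(uncond, cond_pos)
```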