Why is $Y_t = {B_t}^2 - t$ a martingale?


Any help on the following will be much appreciated!

I am trying to prove that $Y_t = {B_t}^2 - t$ is a martingale, where $B_t$ is a Brownian motion with respect to the filtration $(\mathcal F_t)_{t \geq 0}$.

However, I do not understand why $$(B_t - B_s)^2, \,s < t$$ is independent of $\mathcal F_s$.

  • In fact, I do not fully appreciate what $B_t$ having independent increments means...

Nevertheless, I know that $Y_s = E[Y_t\mid\mathcal F_s]$ can be shown as follows:

Consider \begin{align} E[{B_t}^2|\mathcal F_s]&= E[(B_s + B_t - B_s)^2|\mathcal F_s] \\&= E[{B_s}^2 + 2B_s(B_t - B_s) + (B_t - B_s)^2|\mathcal F_s] \\& = E[{B_s}^2|\mathcal F_s] + 2B_sE[B_t - B_s|\mathcal F_s] +E[(B_t - B_s)^2|\mathcal F_s] \\&= {B_s}^2 + 2B_sE[B_t - B_s]+E[(B_t - B_s)^2] \\& = {B_s}^2 + t -s \end{align}

then $E[{B_t}^2 - t|\mathcal F_s] = {B_s}^2 - s$, i.e. $E[Y_t\mid\mathcal F_s] = Y_s$, and we're done.
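The identity above can be sanity-checked by Monte Carlo. This is a sketch (numpy assumed, with arbitrary choices $s = 1$, $t = 3$) that exploits the fact that $B_s$ and the increment $B_t - B_s$ can be simulated as independent normals:

```python
import numpy as np

rng = np.random.default_rng(0)
s, t, n = 1.0, 3.0, 200_000

# Independent increments let us simulate B_s and B_t - B_s separately:
B_s = rng.normal(0.0, np.sqrt(s), size=n)        # B_s ~ N(0, s)
inc = rng.normal(0.0, np.sqrt(t - s), size=n)    # B_t - B_s ~ N(0, t - s)
B_t = B_s + inc

# Taking expectations in E[B_t^2 | F_s] = B_s^2 + (t - s) gives
# E[B_t^2 - t] = E[B_s^2 - s]; both sample means should be near 0.
print(np.mean(B_t**2 - t))
print(np.mean(B_s**2 - s))
```

This only checks the identity in expectation, not the full conditional statement, but it is a quick way to catch sign errors in the derivation.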

Thanks!


BEST ANSWER

Independent increments means that for $t>s$, the random variable $B_t-B_s$ is independent of the $\sigma$-algebra at time $s$, i.e. independent of $\mathcal{F}_s=\sigma(B_v : v\leq s)$. In fact, for Brownian motion, $B_t-B_s\sim \mathcal{N}(0,t-s)$ and is independent of $\mathcal{F}_s$. Of course, this implies $(B_t-B_s)^2$ is also independent of $\mathcal{F}_s$. As for why this is the case, it follows from whatever construction of Brownian motion was used.
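Both facts (the $\mathcal N(0,t-s)$ law and the independence) can be illustrated numerically. A minimal sketch, assuming numpy and the arbitrary choices $s = 1$, $t = 3$, step size $0.01$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, dt, n_steps = 50_000, 0.01, 300

# Build Brownian paths from i.i.d. Gaussian steps; B[:, k] approximates B_{(k+1)dt}.
steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(steps, axis=1)

B_s = B[:, 99]                 # s = 1.0
inc = B[:, 299] - B[:, 99]     # B_t - B_s with t = 3.0

print(np.var(inc))                  # should be close to t - s = 2.0
print(np.corrcoef(B_s, inc)[0, 1])  # should be close to 0
```

Zero correlation is of course weaker than independence, but for jointly Gaussian variables (as here) the two coincide.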

Intuitively, this says that even knowing the entire history of the Brownian motion up until time $s$, the difference between where it is at a future time $t$ and where it is now is normally distributed, with variance proportional to the length $t-s$ of the increment.

ANSWER

Since you mention that you do not fully understand what "independent increments" means, I believe it is more appropriate to start with a simpler example that illustrates the idea.

Instead of a Brownian motion, consider its discrete analogue: the random walk. I will start from a sequence $X_1,X_2,\ldots$ consisting of independent random variables. This means that $$ \mathbb P(X_1\in A_1,X_2\in A_2)=\mathbb P(X_1\in A_1)\mathbb P(X_2\in A_2), $$ and also $$ \mathbb P(X_1\in A_1,X_2\in A_2,X_3\in A_3)=\mathbb P(X_1\in A_1,X_3\in A_3)\mathbb P(X_2\in A_2), $$ and so on for every possible way of grouping the variables together.
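The defining product rule can be checked empirically. A sketch assuming numpy, with fair $\pm 1$ coin flips as the $X_i$ and the arbitrary choice $A_1 = A_2 = \{1\}$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
X1 = rng.choice([-1, 1], size=n)   # independent draws
X2 = rng.choice([-1, 1], size=n)

# With A1 = A2 = {1}: P(X1 in A1, X2 in A2) vs P(X1 in A1) * P(X2 in A2)
joint = np.mean((X1 == 1) & (X2 == 1))
product = np.mean(X1 == 1) * np.mean(X2 == 1)
print(joint, product)   # both close to 0.25
```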

Probabilists call the sequence of partial sums $$S_n=\sum_{i=1}^n X_i$$ a random walk. Historically, Brownian motion arose as a scaling limit of certain random walks - think of how the real numbers are obtained from the integers by rescaling (to get rationals) and taking limits (to get real numbers).
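The rescaling can be sketched numerically (numpy assumed): shrinking space by $1/\sqrt n$ and time by $1/n$, the quantity $S_{\lfloor nt\rfloor}/\sqrt n$ has approximately the $\mathcal N(0,t)$ law of $B_t$, by the central limit theorem:

```python
import numpy as np

rng = np.random.default_rng(3)
n, t, n_walks = 1_000, 1.0, 20_000

# +-1 random walk run for floor(n * t) steps, then rescaled by sqrt(n).
X = rng.choice([-1.0, 1.0], size=(n_walks, int(n * t)))
W_t = X.sum(axis=1) / np.sqrt(n)   # ~ B_t in distribution, approximately

print(np.mean(W_t), np.var(W_t))   # close to 0 and to t = 1.0
```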

Independence of increments is a property that both the random walk and the Brownian motion possess. The increments of the random walk are simply the differences $S_n-S_m$ for $m<n$. By definition, we have that $$ S_n-S_m=\sum_{i=m+1}^n X_i. $$ Any two non-overlapping increments, say $$ S_n-S_m\text{ and }S_{n'}-S_{m'}\text{ with }m<n<m'<n', $$ are independent of each other. Why? Because they are sums over disjoint sets of random variables $X_i$, which we assumed were independent of one another. This is what is meant by independence of increments.
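A quick empirical check of this (numpy assumed; $\pm 1$ steps, with the increments $S_{20}-S_{10}$ and $S_{40}-S_{30}$ chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(4)
n_walks, n_steps = 100_000, 40
X = rng.choice([-1, 1], size=(n_walks, n_steps))   # i.i.d. steps
S = np.cumsum(X, axis=1)                           # S[:, k] holds S_{k+1}

inc1 = S[:, 19] - S[:, 9]    # S_20 - S_10
inc2 = S[:, 39] - S[:, 29]   # S_40 - S_30: a disjoint set of X_i

print(np.corrcoef(inc1, inc2)[0, 1])   # close to 0
```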

Increments of Brownian motion are a continuum analogue of increments of random walk, but suffice it to say that precisely the same phenomenon of independent increments applies to both cases. Although it may take more work to construct the Brownian motion, at the end of the day the intuitive picture from the case of random walk completely carries through.

Returning to your question of how independence of increments enters the proof that $B_t^2-t$ is a martingale: it is the step where you expand the binomial $(B_s+(B_t-B_s))^2$. The squared terms stay, but the cross term goes away due to independence of increments. More precisely, $\mathbb E(B_s(B_t-B_s)\mid \mathcal F_s)=B_s\,\mathbb E(B_t-B_s\mid \mathcal F_s)$, since $B_s$ is $\mathcal F_s$-measurable, while $B_t-B_s$ is independent of $\mathcal F_s$, as the increments over the (essentially) disjoint intervals $[0,s]$ and $[s,t]$ are independent. Independence means the conditional expectation is constant: $$\mathbb E(B_t-B_s\mid \mathcal F_s)=\mathbb E(B_t-B_s).$$ Finally, since $B_t-B_s$ has the distribution of a centered Gaussian random variable, its mean is zero, and thus $\mathbb E(B_s(B_t-B_s)\mid \mathcal F_s)=0$.
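The vanishing cross term can likewise be checked by simulation. A sketch assuming numpy, with arbitrary $s = 1$, $t = 2.5$:

```python
import numpy as np

rng = np.random.default_rng(5)
s, t, n = 1.0, 2.5, 200_000

B_s = rng.normal(0.0, np.sqrt(s), size=n)        # B_s ~ N(0, s)
inc = rng.normal(0.0, np.sqrt(t - s), size=n)    # B_t - B_s, independent of B_s

# Cross term from expanding (B_s + (B_t - B_s))^2:
print(np.mean(B_s * inc))   # close to 0 by independence and E[inc] = 0
print(np.mean(inc))         # close to 0: the increment is centered
```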