Coming from a physics background, I am quite new to martingales, and now am trying to grasp some concepts regarding $\mathcal{L}^2$-martingales. An exercise I was given in my lecture course is as follows:
Let $\{M_n\}_{n\geq0}$ be a martingale in $\mathcal{L}^2(\Omega,\mathcal{F},\mathbb{P})$, and $\{H_n\}_{n\geq0}$ be a sequence of $\mathcal{F}_n$-measurable bounded random variables, where $\mathcal{F}_n$ is the natural filtration of $M_n$. Define $I_n:=\sum_{k=1}^nH_{k-1}(M_k-M_{k-1})$.
a) Prove that $\mathbb{E}(M_n^2-M_{n-1}^2)=\mathbb{E}[(M_n-M_{n-1})^2]$.
b) Prove that $\{I_n\}_{n\geq1}$ is a martingale in $\mathcal{L}^2(\Omega,\mathcal{F},\mathbb{P})$.
c) Show that $\mathbb{E}(I_n^2)=\sum_{k=1}^n\mathbb{E}[H_{k-1}^2(M_k-M_{k-1})^2]$.
My understanding is that increments are orthogonal: if $m\leq n$ and $X$ is $\mathcal{F}_m$-measurable with $X\in\mathcal{L}^2$, then $\mathbb{E}[X(M_n-M_m)]=0$. Therefore for a), if I expand the RHS, $\mathbb{E}[(M_n-M_{n-1})^2]=\mathbb{E}(M_n^2-M_{n-1}^2)-2\mathbb{E}[M_{n-1}(M_n-M_{n-1})]=\mathbb{E}(M_n^2-M_{n-1}^2)$, taking $X=M_{n-1}$ in the orthogonality relation, but does this mean demanding that $M_0=0$ a.s.? (I have the same query when applying orthogonality for part c) as well.)
As for part b), I am not sure how to apply the information I have about $\{M_n\}$ and $\{H_n\}$ (square integrable, measurable, $H$ is bounded). I am not sure how to prove the boundedness in both $\mathcal{L}^1$ and $\mathcal{L}^2$; could someone point me in the right direction? I know I have to use the boundedness of $H$ but can't quite figure out where... For proving the martingale property, I used an iffy "taking out what is known" argument on $H_n$ as it is adapted to $\mathcal{F}_n$, but I think I might just be dumping in random things I learnt about martingales at this point... could someone also provide some inspiration for this too?
My last question is, what's so special about $\mathcal{L}^2$-martingales, or generally $\mathcal{L}^p$-martingales? (All I know is that an $\mathcal{L}^p$-bounded martingale, i.e. one with $\sup_n\mathbb{E}(|X_n|^p)<\infty$, satisfies $X_n\overset{\mathcal{L}^p}{\to}X$ for $p>1$ (and for $p=1$ we additionally require UI), but I have not seen much beyond this.) Does it have anything to do with quadratic variation in Brownian Motion?
Let $\{M_n\}$ be a square-integrable martingale with respect to a filtration $(\mathcal F_n)$, and let $(H_n)$ be a bounded $(\mathcal F_n)$-adapted process (that is, each $H_n$ is $\mathcal F_n$-measurable).
Let $I_n = \sum_{k=1}^{n} H_{k-1}(M_k - M_{k-1})$
We want to show $3$ things:
a) Expanding the square we get $(M_n - M_{n-1})^2 = M_n^2 + M_{n-1}^2- 2M_nM_{n-1}$, so that:
$$ \mathbb E[(M_n - M_{n-1})^2] = \mathbb E[M_n^2] + \mathbb E[M_{n-1}^2] - 2\mathbb E[M_n M_{n-1}]$$
But by the martingale property $\mathbb E[M_nM_{n-1}] = \mathbb E[\mathbb E[M_n M_{n-1} \mid \mathcal F_{n-1}]] = \mathbb E[M_{n-1}\mathbb E[M_n \mid \mathcal F_{n-1}]] = \mathbb E[M_{n-1}^2]$, hence $$ \mathbb E [(M_n-M_{n-1})^2] = \mathbb E[M_n^2] - \mathbb E[M_{n-1}^2]$$
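As a quick numerical sanity check (not part of the proof), here is a Monte Carlo sketch of the identity for a simple symmetric random walk, which is a martingale with $M_0 = 0$; the walk, the horizon $n=5$, and the sample size are arbitrary choices of this illustration.

```python
import random

random.seed(0)

def check_part_a(num_paths=100_000, n=5):
    """Estimate E[(M_n - M_{n-1})^2] and E[M_n^2 - M_{n-1}^2]
    for a simple symmetric random walk M_k (sum of +/-1 steps)."""
    lhs = rhs = 0.0
    for _ in range(num_paths):
        steps = [random.choice((-1, 1)) for _ in range(n)]
        m_prev = sum(steps[:-1])      # M_{n-1}
        m_n = m_prev + steps[-1]      # M_n
        lhs += (m_n - m_prev) ** 2
        rhs += m_n ** 2 - m_prev ** 2
    return lhs / num_paths, rhs / num_paths

lhs, rhs = check_part_a()
print(lhs, rhs)  # both estimates should be close to 1, since (ΔM)^2 = 1 here
```

Here $\mathbb E[M_n^2] = n$ for the walk, so both sides of the identity equal $1$, which the two estimates should reproduce up to Monte Carlo error.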
b) We need to show that $\{I_n\}$ is a martingale in $\mathcal{L}^2$, so we check three things.
$I_n$ is $\mathcal F_n$-measurable. Indeed, $H_{k-1}$ and $M_k$ for $k \in \{1,\dots,n\}$ are all $\mathcal F_n$-measurable, so $I_n$, being a sum of products of those, is $\mathcal F_n$-measurable.
$I_n$ is square-integrable. Note that since every $H_k$ is a bounded random variable, there exists $C$ such that $|H_0|,\dots,|H_{n-1}| \le C$ almost surely, hence $$ \mathbb E[I_n^2] \le \mathbb E\Big[ \Big(\sum_{k=1}^n |H_{k-1}||M_k - M_{k-1}|\Big)^2\Big] \le C^2\, \mathbb E\Big[\Big(\sum_{k=1}^n |M_k - M_{k-1}|\Big)^2\Big].$$ Expanding the square we get finitely many terms of the form $(M_k - M_{k-1})^2$ or $|M_k-M_{k-1}||M_j-M_{j-1}|$, so in fact it is enough to prove that $M_jM_k$ is integrable for any $j,k \in \{0,\dots,n\}$ (then $I_n^2$ is bounded by a finite sum of such terms, so it is integrable). But by Cauchy–Schwarz, $\mathbb E|M_jM_k| \le \sqrt{\mathbb E[M_j^2]\,\mathbb E[M_k^2]} < \infty$, since $(M_n)$ is square integrable.
Conditional property. Take any $n \in \mathbb N$. We have: $$ \mathbb E[I_n\mid\mathcal F_{n-1}] = \sum_{k=1}^n \mathbb E[H_{k-1}(M_k-M_{k-1})\mid\mathcal F_{n-1}] = \sum_{k=1}^{n-1} H_{k-1}(M_k - M_{k-1}) + \mathbb E[H_{n-1}(M_n - M_{n-1}) \mid \mathcal F_{n-1}] $$
The first sum is just $I_{n-1}$ and the last term is $0$: by measurability we can take $H_{n-1}$ out of the conditional expectation, and by the martingale property $\mathbb E[M_n - M_{n-1} \mid \mathcal F_{n-1}] = 0$.
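To see the constant-expectation consequence of this in action, here is a small simulation sketch: since $I_0 = 0$ and $\{I_n\}$ is a martingale, $\mathbb E[I_n] = 0$ for all $n$. The random walk, the bounded predictable choice $H_k = \mathbb{1}\{M_k > 0\}$, and the sample sizes are arbitrary assumptions of this illustration.

```python
import random

random.seed(1)

def mean_I(num_paths=100_000, n=10):
    """Estimate E[I_n] for I_n = sum_k H_{k-1}(M_k - M_{k-1}),
    where M is a simple symmetric random walk and H_k = 1{M_k > 0}
    is a bounded, adapted (hence predictable-in-time) integrand."""
    total = 0.0
    for _ in range(num_paths):
        m = 0
        i_n = 0.0
        for _ in range(n):
            h = 1.0 if m > 0 else 0.0   # H_{k-1}, decided before the step
            step = random.choice((-1, 1))
            m += step
            i_n += h * step
        total += i_n
    return total / num_paths

est = mean_I()
print(est)  # should be close to 0 = E[I_0]
```

Note that $H_{k-1}$ is computed *before* the step $M_k - M_{k-1}$ is drawn; using $H_k$ instead (peeking at the step) would destroy the martingale property.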
c) Writing out $I_n^2$ similarly as in b), we see that it is enough to prove that for $j < k$ the cross terms vanish, i.e. $\mathbb E[H_{j-1}H_{k-1}(M_j - M_{j-1})(M_k - M_{k-1})] = 0$ (because then only the "diagonal" terms survive). To see it, condition on $\mathcal F_{k-1}$: the factor $H_{j-1}H_{k-1}(M_j-M_{j-1})$ is $\mathcal F_{k-1}$-measurable, so it comes out of the inner expectation, and the martingale property gives $\mathbb E[M_k-M_{k-1}\mid\mathcal F_{k-1}]=0$:
$$ \mathbb E[H_{j-1}H_{k-1}(M_j-M_{j-1})(M_k-M_{k-1})] = \mathbb E\big[H_{j-1}H_{k-1}(M_j-M_{j-1})\,\mathbb E[M_k-M_{k-1}\mid\mathcal F_{k-1}]\big] = 0$$
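This identity (a discrete Itô isometry) can also be sketched numerically: estimate $\mathbb E[I_n^2]$ and the diagonal sum $\sum_{k=1}^n \mathbb E[H_{k-1}^2(M_k-M_{k-1})^2]$ on the same simulated paths. As above, the random walk, the choice $H_k = \mathbb{1}\{M_k > 0\}$, and the sample sizes are arbitrary assumptions of this illustration.

```python
import random

random.seed(2)

def isometry_check(num_paths=100_000, n=10):
    """Compare E[I_n^2] with sum_k E[H_{k-1}^2 (M_k - M_{k-1})^2]
    for a simple symmetric random walk M and H_k = 1{M_k > 0}."""
    lhs = rhs = 0.0
    for _ in range(num_paths):
        m = 0
        i_n = 0.0
        diag = 0.0
        for _ in range(n):
            h = 1.0 if m > 0 else 0.0     # H_{k-1}
            step = random.choice((-1, 1))
            m += step
            i_n += h * step
            diag += (h * step) ** 2       # H_{k-1}^2 (ΔM_k)^2
        lhs += i_n ** 2
        rhs += diag
    return lhs / num_paths, rhs / num_paths

lhs, rhs = isometry_check()
print(lhs, rhs)  # the two estimates should agree up to Monte Carlo error
```

The difference between the two estimates is exactly the empirical average of the cross terms, which the conditioning argument above shows have mean zero.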