Independence in conditional expectation of martingales


I am stuck on a problem and would appreciate any guidance/hints.

We denote $\mathcal F_t$ as follows: if $X_t$ is a stochastic process, the minimal information obtained from observing the process $X_s$ up to time $t$ is denoted by $\mathcal F_t = \sigma(X_s: s \leq t)$. This is the $\sigma$-algebra generated by the events $\{\omega: X_s(\omega)\in(a,b)\}$ for any real numbers $a < b$ and $s \leq t$.

This is the question: Let $X_n, \, n\geq 0$ be a sequence of independent, integrable random variables such that $E[X_n] = 1$ for $n \geq 0$. Prove that $P_n = X_0X_1\cdots X_n$ is an $\mathcal F_n$-martingale.

I have established that $P_n$ is integrable and $\mathcal F_n$-measurable. However, I am stuck on showing the martingale property $$P_n = E[P_t\mid\mathcal F_n]\quad\forall\, n < t.$$

By any chance, is $E[X_{n+k}\mid\mathcal F_n]=E[X_{n+k}]$ for every natural number $k$? This is the part I can't quite figure out.

  • The equality $E[X_{n+k}\mid\mathcal F_n]=E[X_{n+k}]$ would hold if $X_{n+k}$ were independent of $\mathcal F_n = \sigma(X_0, X_1,\dots, X_n)$. I suspect this follows from the given independence of the sequence $(X_n)$, but I cannot see how.
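One standard way to fill that step, via the grouping property of independent families (a sketch, not the only route):

```latex
\begin{align*}
&X_{n+k} \text{ is independent of the vector } (X_0,\dots,X_n)
  && \text{(grouping property of an independent family)} \\
\implies\ &X_{n+k} \text{ is independent of } \sigma(X_0,\dots,X_n) = \mathcal F_n \\
\implies\ &E[X_{n+k}\mid\mathcal F_n] = E[X_{n+k}] = 1 \quad \text{a.s.}
  && \text{(if $Y$ is integrable and independent of $\mathcal G$, then $E[Y\mid\mathcal G]=E[Y]$)}
\end{align*}
```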

If it is, then my thought process would be (pulling $P_n$ out since it is $\mathcal F_n$-measurable, then using independence of the remaining factors): $$E[P_{n+k}|\mathcal F_n] = E[P_nX_{n+1}\cdots X_{n+k}|\mathcal F_n] = P_nE[X_{n+1}\cdots X_{n+k}|\mathcal F_n] = P_nE[X_{n+1}]\cdots E[X_{n+k}] = P_n$$
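As a numerical sanity check (not a proof), here is a small Monte Carlo sketch. The Exponential(1) distribution and the values $n=3$, $k=2$ are my own choices for illustration; any integrable law with mean 1 works. Fixing one realization of the past $X_0,\dots,X_n$ and averaging $P_{n+k}$ over many independent futures should approximately recover $P_n$:

```python
import numpy as np

rng = np.random.default_rng(0)

# One fixed realization of the past X_0, ..., X_n (here n = 3).
# Exponential(1) has mean 1, so E[X_i] = 1 as the problem requires.
n, k, trials = 3, 2, 200_000
past = rng.exponential(1.0, size=n + 1)
P_n = past.prod()

# Many independent futures X_{n+1}, ..., X_{n+k}; form P_{n+k} for each.
futures = rng.exponential(1.0, size=(trials, k))
P_nk = P_n * futures.prod(axis=1)

# Conditional on this past, the average of P_{n+k} should be close to P_n,
# since E[X_{n+1}] = ... = E[X_{n+k}] = 1.
print(P_n, P_nk.mean())
```

The two printed numbers agree up to Monte Carlo error, which is the content of $E[P_{n+k}\mid\mathcal F_n] = P_n$.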

Thanks!