Using Martingale Convergence to prove a zero-one law


This is Exercise 10 from Chapter 27 of *Probability Essentials* by Jacod & Protter.

Let $(F_n)$ be an increasing sequence of sigma-algebras and $(G_n)$ a decreasing sequence of sigma-algebras, with $G_1 \subset \sigma(\cup_{n=1}^{\infty}F_n)$. Suppose that $F_n$ and $G_n$ are independent for each $n$. Show that if $A\in \cap_{n=1}^{\infty}G_n$, then $P(A) = 0$ or $1$.

I defined $Y_n=E(I_{A}\mid F_n)$ and $Z_n=E(I_{A}\mid G_n)$, so that $(Y_n)$ is a martingale and $(Z_n)$ a backward martingale. By the martingale convergence theorems both converge, and I expect the limit to be $E(I_{A}) = P(A)$, but how do I use the independence of the $F_n$'s and $G_n$'s to complete the proof?

Thank you very much.


On BEST ANSWER

$A$ is independent of $F_n$ for each $n$, so $Y_n = E(I_A \mid F_n) = P(A)$ for every $n$. By Lévy's upward theorem, $Y_n \to E(I_A \mid F_{\infty})$ a.s., where $F_{\infty} = \sigma(\bigcup_{n=1}^{\infty} F_n)$; hence $E(I_A \mid F_{\infty}) = P(A)$ a.s. But $A \in G_1 \subset F_{\infty}$, so $E(I_A \mid F_{\infty}) = I_A$ a.s. Thus $I_A = P(A)$ a.s.; equivalently, $A$ is independent of itself, so $P(A) = P(A \cap A) = P(A)^2$, which forces $P(A) = 0$ or $1$.
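The chain of equalities can be written out compactly (a sketch; $F_{\infty}$ here abbreviates $\sigma(\bigcup_{n=1}^{\infty} F_n)$):

```latex
\begin{align*}
Y_n &= E(I_A \mid F_n) = P(A)
      && \text{($A$ independent of $F_n$)} \\
Y_n &\xrightarrow{\text{a.s.}} E(I_A \mid F_\infty)
      && \text{(L\'evy's upward theorem)} \\
E(I_A \mid F_\infty) &= I_A
      && \text{($A \in G_1 \subset F_\infty$, so $A$ is $F_\infty$-measurable)} \\
\Rightarrow\quad I_A &= P(A) \ \text{a.s.}
      && \text{(constant sequence, so the limit is $P(A)$)} \\
\Rightarrow\quad P(A) &= P(A)^2
      && \text{(take expectations of $I_A = I_A \cdot I_A$)} \\
\Rightarrow\quad P(A) &\in \{0, 1\}.
\end{align*}
```

The backward martingale $Z_n$ from the question is not actually needed: since $A \in \cap_n G_n$, one has $Z_n = I_A$ for every $n$, which carries no new information.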

I think it's okay. What do you think?