$E[X_{n+1}\mid X_1,\ldots,X_{n}] =X_{n} \implies E[X_{n+1}] = E[X_n] = \ldots = E[X_2] = E[X_1]$


I saw from https://math.dartmouth.edu/~pw/math100w13/lalonde.pdf that

$$E[X_{n+1}\mid X_1,\ldots,X_{n}] =X_{n} \implies E[X_{n+1}] = E[X_n] =\ldots = E[X_2] = E[X_1]$$

Does this implication hold in general, or only for a martingale process? How do we show $E[X_{n+1}] = E[X_n] =\ldots = E[X_2] = E[X_1]$?

After some thinking, if I'm not mistaken, the relation above follows from the law of total expectation, which holds in general, not just for martingale processes.

Here's my attempt at a proof:

We know

$$ E[X] = E[E[X|Y]] $$

We can take the expected value of both sides of the martingale property: $$ E[E[X_{n+1}\mid X_1,\ldots,X_{n}]] = E[X_{n}]. $$ From the law of total expectation, the LHS equals $E[X_{n+1}]$: $$ \therefore E[X_{n+1}] = E[X_n]. $$

The same argument shows $E[X_n] = E[X_{n-1}]$, and so on.
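As a sanity check on the claim, here is a small Monte Carlo sketch (my own hypothetical example, not from the post): a symmetric $\pm 1$ random walk is a martingale, so its empirical mean should stay near $E[X_1] = 0$ at every step.

```python
import random

# Hypothetical check: X_n = sum of n i.i.d. +/-1 steps is a martingale,
# so E[X_n] should be constant in n (here, equal to 0 for every n).
random.seed(0)  # fixed seed so the run is reproducible

n_steps, n_paths = 10, 100_000
sums = [0.0] * n_steps
for _ in range(n_paths):
    x = 0
    for t in range(n_steps):
        x += random.choice((-1, 1))
        sums[t] += x

means = [s / n_paths for s in sums]
# each entry of `means` hovers near 0, up to Monte Carlo noise
```

This does not prove anything, of course, but it illustrates the constant-expectation property numerically.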

Is this correct? Or am I arriving at the correct answer by doing something wrong?

On BEST ANSWER

If a sequence $(X_n)_{n\ge 1}$ of random variables has the property that for every $n\ge 1$, $E[X_{n+1}\mid X_1,\dots,X_n] = X_n$, then implicit in this assumption is that each $X_n$ is integrable. Then this is the definition of what it means for $(X_n)_{n\ge 1}$ to be a martingale.

To prove that for martingales, the expectation of $X_n$ is independent of $n$, we apply the definition$^\dagger$ of conditional expectation: $$ E[X_{n+1}] = E[E[X_{n+1}\mid X_1,\dots,X_n]] = E[X_n]. $$ Now, by induction if you like, or as $n\ge 1$ was arbitrary, the claim is proved.


$^\dagger$Recall that for an integrable random variable $X$, a $\sigma$-algebra $\mathscr F$, and any event $A\in\mathscr F$, the conditional expectation $E[X\mid\mathscr F]$ is an $\mathscr F$-measurable random variable satisfying $E[X;A] = E[E[X \mid \mathscr F];A]$. Applying this with $A = \Omega$ recovers the usual law of total expectation: $E[X] = E[E[X\mid \mathscr F]]$. Note that if $A$ is an event, $E[X;A]$ is not the same as $E[X\mid A]$: the first is the number $E[X\mathbf 1_A]$, an (unnormalized) weighted average of $X$ over outcomes in $A$, while the second is shorthand for the random variable $E[X\mid\sigma(A)]$, interpreted as the best guess of $X$ given knowledge of whether $A$ has occurred.
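To make the footnote's distinction concrete, here is a small discrete sketch (a fair die roll with $A = \{X \text{ even}\}$; this is my own hypothetical example, not from the answer). It computes the number $E[X;A] = E[X\mathbf 1_A]$ and the value that the random variable $E[X\mid\sigma(A)]$ takes on the event $A$, namely $E[X;A]/P(A)$.

```python
from fractions import Fraction

# Hypothetical example: X is a fair die roll, A = {X is even}.
# E[X; A] means E[X * 1_A], a plain (unnormalized) number, while the
# random variable E[X | sigma(A)] equals E[X; A] / P(A) on the event A.
outcomes = range(1, 7)
p = Fraction(1, 6)                                      # P(X = k) for each face k

E_X_on_A = sum(k * p for k in outcomes if k % 2 == 0)   # E[X; A] = (2+4+6)/6 = 2
P_A = sum(p for k in outcomes if k % 2 == 0)            # P(A) = 1/2
value_on_A = E_X_on_A / P_A                             # = 4 on the event A
```

So restricting ($E[X;A] = 2$) and conditioning (value $4$ on $A$) genuinely differ, exactly as the footnote warns.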