Let $X_n$ be the $n$-th partial sum of i.i.d. centered random variables and $\mathcal{F}_m:=\sigma(X_n,n\le m)$, then $\text{E}[X_n\mid\mathcal{F}_m]=X_m$


Let

  • $(\Omega,\mathcal{F},\text{P})$ be a probability space
  • $\left(Y_i\right)_{i\in\mathbb{N}}$ be a sequence of i.i.d. random variables $(\Omega,\mathcal{F})\to\left(\mathbb{R},\mathcal{B}\left(\mathbb{R}\right)\right)$ with $\operatorname{E}[Y_i]=0$ and $$X_n:=Y_1+\cdots+Y_n$$
  • $\mathcal{F}_m:=\sigma\left(X_n,n\le m\right)$ be the smallest $\sigma$-algebra with respect to which $X_1,\ldots,X_m$ are measurable
  • $\operatorname{E}\left[X_n\mid\mathcal{F}_m\right]$ denote the conditional expectation of $X_n$ given $\mathcal{F}_m$

Maybe it's because there are too many new concepts for me (conditional expectation, filtrations, ...), but I don't understand why we've got $$\operatorname{E}\left[X_n\mid\mathcal{F}_m\right]=X_m\;\;\;\text{for all }m<n$$

Best answer

As @aerdna91 pointed out, the identity

$$\mathbb{E}(X_n \mid \mathcal{F}_m) = X_m \tag{1}$$

holds only for $m \leq n$. For $m>n$, we have

$$\mathbb{E}(X_n \mid \mathcal{F}_m) = X_n.$$

To prove $(1)$, we consider the case $m=n-1$. Then, as $X_n = X_{n-1} +Y_n$,

$$\mathbb{E}(X_n \mid \mathcal{F}_{n-1}) = \underbrace{\mathbb{E}(X_{n-1} \mid \mathcal{F}_{n-1})}_{X_{n-1}}+ \mathbb{E}(Y_n \mid \mathcal{F}_{n-1}).$$

Now, since $\mathcal{F}_{n-1}$ and $Y_n$ are independent, the second term equals

$$\mathbb{E}(Y_n \mid \mathcal{F}_{n-1}) = \mathbb{E}(Y_n)=0.$$

Hence, we have shown that

$$\mathbb{E}(X_n \mid \mathcal{F}_{n-1}) = X_{n-1} \tag{2}.$$

Now $(1)$ follows by iterating this procedure.
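Spelled out, the iteration is an application of the tower property of conditional expectation: for $m < n-1$ we have $\mathcal{F}_m \subseteq \mathcal{F}_{n-1}$, so

$$\mathbb{E}(X_n \mid \mathcal{F}_m) = \mathbb{E}\bigl(\mathbb{E}(X_n \mid \mathcal{F}_{n-1}) \mid \mathcal{F}_m\bigr) \overset{(2)}{=} \mathbb{E}(X_{n-1} \mid \mathcal{F}_m),$$

and applying this step $n-m$ times reduces the left-hand side to $\mathbb{E}(X_m \mid \mathcal{F}_m) = X_m$.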

Remark: The proof shows that $(X_n,\mathcal{F}_n)_{n \in \mathbb{N}}$ is a martingale.
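As a concrete sanity check (not part of the proof), here is a small script for the specific choice of centered Rademacher steps $Y_i \in \{-1,+1\}$. Conditioning on $\mathcal{F}_m$ then amounts to fixing the prefix $(Y_1,\ldots,Y_m)$, so $(1)$ can be verified exactly by enumerating all equally likely paths; the parameters $n=5$, $m=3$ are an arbitrary illustration.

```python
# Exact check of E[X_n | F_m] = X_m for i.i.d. Rademacher steps Y_i in {-1, +1}.
# Conditioning on F_m fixes the prefix (Y_1, ..., Y_m); given the prefix,
# all 2^(n-m) suffixes (Y_{m+1}, ..., Y_n) are equally likely, so the
# conditional expectation is a plain average of X_n over the suffixes.
from itertools import product

n, m = 5, 3  # arbitrary illustrative choice with m < n

for prefix in product([-1, 1], repeat=m):
    x_m = sum(prefix)  # value of X_m determined by the prefix
    # X_n = X_m + (Y_{m+1} + ... + Y_n), averaged over all suffixes
    totals = [x_m + sum(suffix) for suffix in product([-1, 1], repeat=n - m)]
    cond_exp = sum(totals) / len(totals)
    assert cond_exp == x_m, (prefix, cond_exp)

print("E[X_n | F_m] = X_m holds exactly for every prefix.")
```

The suffix sums average to zero precisely because each $Y_i$ is centered, mirroring the step $\mathbb{E}(Y_n \mid \mathcal{F}_{n-1}) = \mathbb{E}(Y_n) = 0$ in the proof.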