Let $(X_i)_{i\in\mathbb{N}}$ be iid random variables with $\mathbb{E}|X_1|<\infty$ and let $S_n \stackrel{\rm{}def}{=} X_1+\cdots+X_n$ for all $n\in\mathbb{N}$. If $T$ is a stopping time with $\mathbb{E}\left[ T\right] < \infty$, show that $\mathbb{E}[S_T]=\mathbb{E}[X_1]\mathbb{E}[T]$.
Optional sampling
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
The proof uses the decomposition $S_T=\sum\limits_{n=1}^\infty X_n\mathbf 1_{T\geqslant n}$. For each $n\geqslant1$, the random variable $\mathbf 1_{T\geqslant n}=1-\mathbf 1_{T\leqslant n-1}$ is $\sigma((X_k)_{1\leqslant k\leqslant n-1})$-measurable, and in particular independent of $X_n$, hence $$\mathbb E(X_n\mathbf 1_{T\geqslant n})=\mathbb E(X_n)\,\mathbb P(T\geqslant n)=\mathbb E(X_1)\,\mathbb P(T\geqslant n)\quad\text{for each }n\geqslant1.$$ Summing over $n$ is legitimate: the interchange of $\sum$ and $\mathbb E$ is justified by dominated convergence, since $\sum\limits_{n=1}^\infty\mathbb E(|X_n|\mathbf 1_{T\geqslant n})=\mathbb E|X_1|\,\mathbb E(T)<\infty$. Finally, $\mathbb E(T)=\sum\limits_{n=1}^\infty \mathbb P(T\geqslant n)$ because $T=\sum\limits_{n=1}^\infty \mathbf 1_{T\geqslant n}$, which yields $\mathbb E(S_T)=\mathbb E(X_1)\,\mathbb E(T)$.
This result is known as Wald's equation; see, for instance, the Wikipedia page for a proof of the general version of the theorem, or Theorem 1.1 (and its proof) of these lecture notes for the tailored version you're referring to.
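As a quick numerical sanity check (not a substitute for the proof above), one can verify Wald's equation by Monte Carlo simulation. The distribution and stopping rule below are illustrative choices: $X_i\sim\mathrm{Uniform}(0,1)$ and $T$ the first index $n$ with $X_n>1/2$, so that $T$ is a stopping time with $\mathbb E(T)=2$ and Wald predicts $\mathbb E(S_T)=\mathbb E(X_1)\,\mathbb E(T)=\tfrac12\cdot 2=1$:

```python
import random

def sample_S_T():
    # X_i ~ Uniform(0,1); T = first n with X_n > 1/2.
    # {T <= n} depends only on X_1, ..., X_n, so T is a stopping time,
    # and T is Geometric(1/2), hence E[T] = 2 < infinity.
    s, n = 0.0, 0
    while True:
        n += 1
        x = random.random()
        s += x
        if x > 0.5:
            return s, n

random.seed(0)
trials = 200_000
tot_S, tot_T = 0.0, 0
for _ in range(trials):
    s, t = sample_S_T()
    tot_S += s
    tot_T += t

mean_S_T = tot_S / trials   # estimates E[S_T]
mean_T = tot_T / trials     # estimates E[T]

# Wald's equation predicts E[S_T] = E[X_1] * E[T] = 0.5 * 2 = 1.
print(mean_S_T, 0.5 * mean_T)
```

Both printed numbers should agree to within Monte Carlo error (roughly $\pm 0.01$ at this sample size).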