Proof that conditional expectation is defined uniquely almost everywhere?


Consider a probability space $(\Omega, \mathcal B, \mathbb P)$ and two random variables $X,Y:\Omega\to\mathbb R$.

Conditional expectation $E(X|Y)=E(X|\sigma(Y))$ can be defined as a function that is

  1. $\sigma(Y)$-measurable. That is, $E(X|Y)$ can be written as a measurable function of $Y$; in particular, it must be constant on any set on which $Y$ is constant — it cannot distinguish outcomes that $Y$ does not distinguish.

  2. $E(E(X|Y)\cdot 1_A)=E(X\cdot 1_A)$ for all $A\in \sigma(Y)$.
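For a discrete $Y$, one version of $E(X|Y)$ simply averages $X$ over each level set of $Y$, and property 2 can then be checked directly. A minimal numeric sketch (the particular arrays below are made up for illustration):

```python
import numpy as np

# Finite sample space: 6 equally likely outcomes.
p = np.full(6, 1 / 6)
X = np.array([1.0, 4.0, 2.0, 0.0, 3.0, 5.0])
Y = np.array([0, 0, 1, 1, 1, 2])  # Y partitions Omega into {Y=0}, {Y=1}, {Y=2}

# Candidate version of E[X|Y]: on each event {Y=y}, the
# probability-weighted average of X over that event.
Z = np.empty_like(X)
for y in np.unique(Y):
    idx = (Y == y)
    Z[idx] = (X[idx] * p[idx]).sum() / p[idx].sum()

# Z is sigma(Y)-measurable by construction (constant on each {Y=y}).
# Property 2: E[Z 1_A] == E[X 1_A] for each generating set A = {Y=y}
# (and hence, by additivity, for every A in sigma(Y)).
for y in np.unique(Y):
    A = (Y == y)
    assert np.isclose((Z * p)[A].sum(), (X * p)[A].sum())
```
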

How do we prove that this defines a function that is unique almost-everywhere?

Best answer:

For uniqueness: suppose $Z, Z'$ are two random variables satisfying 1 and 2. Set $A = \{\omega: Z(\omega) \ge Z'(\omega)\}$. Since $Z$ and $Z'$ are both $\sigma(Y)$-measurable, so is $A$, i.e. $A \in \sigma(Y)$. Hence $E[Z 1_A] = E[X 1_A] = E[Z' 1_A]$ by 2. Subtracting, we get $E[(Z-Z') 1_A] = 0$. But $(Z-Z') 1_A$ is nonnegative everywhere (by the choice of $A$), so $(Z-Z') 1_A = 0$ almost everywhere. Thus $Z = Z'$ almost everywhere on $A$, while $Z < Z'$ off $A$; together, $Z \le Z'$ almost everywhere. By symmetry, we also get $Z' \le Z$ almost everywhere, so that $Z = Z'$ almost everywhere.
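To see why uniqueness can only hold *almost* everywhere, note that changing a version of $E(X|Y)$ on a null set changes nothing in condition 2. A small numeric sketch (arrays are made up; the third outcome carries probability zero):

```python
import numpy as np

p = np.array([0.5, 0.5, 0.0])      # third outcome is a null set
X = np.array([1.0, 3.0, 7.0])
Y = np.array([0, 0, 1])            # sigma(Y) is generated by {Y=0}, {Y=1}

Z  = np.array([2.0, 2.0, 7.0])     # E[X|Y]: the average of X on {Y=0} is 2
Zp = np.array([2.0, 2.0, -100.0])  # altered only where p = 0

# Both versions are constant on the level sets of Y, hence
# sigma(Y)-measurable, and both satisfy E[. 1_A] = E[X 1_A]
# on the generators of sigma(Y):
for y in np.unique(Y):
    A = (Y == y)
    assert np.isclose((Z * p)[A].sum(), (X * p)[A].sum())
    assert np.isclose((Zp * p)[A].sum(), (X * p)[A].sum())

# Yet Z and Z' still agree with probability 1:
assert p[Z != Zp].sum() == 0.0
```
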

Existence is more difficult. The proofs I've seen either use the Radon-Nikodym theorem, or the Riesz representation theorem in Hilbert space. Any measure-theoretic probability book will have a proof.