Conditional expectation with countably many disjoint events


In his probability book Bauer gives the following prelude to the definition of conditional expectation:

Let $X$ be an integrable real random variable on a probability space $(\Omega,\mathcal{A},P)$, and let $\Omega=\bigcup_{i\in \mathbb{N}} B_i$ be a partition of $\Omega$ into countably many pairwise disjoint events $B_i\in \mathcal{A}$ having positive probability $P(B_i)>0$. Define the conditional expectation of $X$ given $B_i$ by

$$E_{B_i}(X):=\frac{1}{P(B_i)}\int_{B_i}XdP.$$

This leads to a new random variable

$$X_0:=\sum_{i\in \mathbb{N}}E_{B_i}(X)1_{B_i}.$$

This random variable is measurable with respect to the sub-$\sigma$-algebra $\mathcal{C}:=\sigma(B_i:i\in\mathbb{N})$, which consists of all sets of the form $\bigcup_{i\in J} B_i$ with $J\subseteq\mathbb{N}$. Now

$$\int_{B_i}X_0 dP=E_{B_i}(X) P(B_i)=\int_{B_i}X dP \hspace{1cm} i\in \mathbb{N},$$

and because of the special form of $\mathcal{C}$ it follows that

$$\int_{C}X_0 dP=\int_{C}X dP \hspace{1cm} \text{for all } C\in \mathcal{C}.$$
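Before turning to the question, here is a quick numerical sanity check of this identity. This is a minimal sketch on a toy finite discrete space of my own (Bauer's statement allows countably many blocks); exact rational arithmetic via `fractions.Fraction` lets us test the equality exactly rather than approximately.

```python
from fractions import Fraction
from itertools import combinations, chain

# Toy probability space: Omega = {0,...,7}, uniform weights (my own example).
omega = range(8)
P = {w: Fraction(1, 8) for w in omega}
X = {w: w - 3 for w in omega}           # a signed, integrable random variable

# Partition of Omega into blocks B_0, B_1, B_2 of positive probability.
blocks = [{0, 1}, {2, 3, 4}, {5, 6, 7}]

def integral(Y, C):
    """Integral of Y over the event C: sum of Y(w) * P({w}) for w in C."""
    return sum(Y[w] * P[w] for w in C)

def prob(C):
    return sum(P[w] for w in C)

# Conditional expectations E_{B_i}(X) and the random variable X_0.
cond_exp = [integral(X, B) / prob(B) for B in blocks]
X0 = {w: cond_exp[i] for i, B in enumerate(blocks) for w in B}

# Check the identity for every C in sigma(B_i), i.e. every union of blocks.
for r in range(len(blocks) + 1):
    for J in combinations(range(len(blocks)), r):
        C = set(chain.from_iterable(blocks[i] for i in J))
        assert integral(X0, C) == integral(X, C)
print("integral of X0 over C = integral of X over C for every C in sigma(B_i)")
```

With finitely many blocks the check is, of course, just finite additivity; the point of the question is why it survives the passage to countably many blocks.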

Question: Why does this last equality hold?

My reasoning is the following:

We see that $E_{B_i}(X)=E_{B_i}(X^+)-E_{B_i}(X^-)$, and so $X_0=\sum_{i\in \mathbb{N}}E_{B_i}(X^+)1_{B_i}-\sum_{i\in \mathbb{N}}E_{B_i}(X^-)1_{B_i}$. Now, since

$$\sum_{i=1}^n E_{B_i}(X^+)1_{B_i} \uparrow \sum_{i\in \mathbb{N}}E_{B_i}(X^+)1_{B_i} \hspace{0.5cm} \text{pointwise as } n\to \infty$$

the MCT implies that

$$\int \sum_{i\in \mathbb{N}}E_{B_i}(X^+)1_{B_i} dP = \lim_{n\to \infty} \sum_{i=1}^n E_{B_i}(X^+)P(B_i)$$

But $\sum_{i=1}^n E_{B_i}(X^+)P(B_i)=\sum_{i=1}^n \int_{B_i}X^+ dP$, and $\sum_{i=1}^n X^+ 1_{B_i} \uparrow X^+$ pointwise as $n\to \infty$, so another application of the MCT yields

$$\int \sum_{i\in \mathbb{N}}E_{B_i}(X^+)1_{B_i} dP=\int X^+ dP < \infty$$

The same argument with $X^-$ also gives

$$\int \sum_{i\in \mathbb{N}}E_{B_i}(X^-)1_{B_i} dP=\int X^- dP < \infty$$

Putting everything together we get that $X_0$ is integrable and $\int X_0 dP=\int X dP$.

This covers the case $J=\mathbb{N}$, i.e. $C=\bigcup_{i\in J} B_i=\Omega$. If $J\neq \mathbb{N}$ is infinite, then $X_0 1_C=\sum_{i\in J}E_{B_i}(X)1_{B_i}$, and the same argument (now with $\sum_{i\in J,\, i\le n} X^+ 1_{B_i} \uparrow X^+ 1_C$) gives $\int_C X_0 dP=\int_C X dP$. Finally, if $J$ is finite the result is immediate from linearity of the integral.
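To illustrate the truncation step in the MCT argument above, here is a sketch with a genuinely countable partition. The example is my own, not from Bauer: $\Omega=(0,1]$ with Lebesgue measure, $X(\omega)=\omega$, and dyadic blocks $B_i=(2^{-i},2^{-(i-1)}]$ for $i\ge 1$, so that $E_{B_i}(X)$ is simply the midpoint of $B_i$.

```python
# Countable partition of (0,1] into dyadic blocks B_i = (2^{-i}, 2^{-(i-1)}]
# with X(w) = w (a hypothetical worked example, not Bauer's).
def p(i):
    """P(B_i) = length of the dyadic interval."""
    return 2.0 ** (-i)

def cond_exp(i):
    """E_{B_i}(X) = midpoint of B_i, since X(w) = w and P is uniform on B_i."""
    return (2.0 ** (-i) + 2.0 ** (-(i - 1))) / 2

# The partial sums sum_{i=1}^n E_{B_i}(X) P(B_i) = sum_{i=1}^n int_{B_i} X dP
# increase monotonically to int_Omega X dP = 1/2, exactly as the MCT predicts.
partials = []
s = 0.0
for i in range(1, 41):
    s += cond_exp(i) * p(i)
    partials.append(s)

assert all(a <= b for a, b in zip(partials, partials[1:]))  # monotone increase
print(partials[-1])  # close to 0.5 = int_0^1 w dw
```

Each truncated sum sits below the limit and the sequence climbs to $\int_\Omega X\,dP$, which is exactly the content of the two MCT applications in the argument.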

Is this correct?

Best answer:

If $C=\bigcup_{i\in J}B_i$ for some $J\subseteq\mathbb N$, then for any nonnegative random variable $Y$, with $Y_0:=\sum_{i\in\mathbb N}E_{B_i}(Y)1_{B_i}$ built as above, countable additivity of the integral (all terms being nonnegative) gives $$\int_CY\,dP=\sum_{i\in J}\int_{B_i}Y\,dP=\sum_{i\in J}\int_{B_i}Y_0\,dP=\int_CY_0\,dP.$$

Applying this to $Y=X^+$ and $Y=X^-$, and writing $(X^\pm)_0$ for the corresponding $Y_0$, we find $$\int_CX^+\,dP=\int_C(X^+)_0\,dP\quad\text{and}\quad\int_CX^-\,dP=\int_C(X^-)_0\,dP.$$

Since $(X^+)_0-(X^-)_0=X_0$ by linearity of each $E_{B_i}$, subtracting the two identities yields $$\int_CX\,dP=\int_CX_0\,dP.$$
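A small discrete check of this decomposition, on a toy example of my own where $X$ changes sign within a block: there $(X^+)_0$ (the construction applied to $X^+$) differs pointwise from the positive part of $X_0$, yet $(X^+)_0-(X^-)_0=X_0$ holds and the integrals over $C$ agree, as the argument requires.

```python
from fractions import Fraction

# Toy space (a hypothetical example): four equally likely points, two blocks,
# with X changing sign inside each block.
omega = [0, 1, 2, 3]
P = {w: Fraction(1, 4) for w in omega}
X = {0: -2, 1: 1, 2: 3, 3: -1}
blocks = [{0, 1}, {2, 3}]

def integral(Y, C):
    return sum(Y[w] * P[w] for w in C)

def prob(C):
    return sum(P[w] for w in C)

def smooth(Y):
    """Y -> Y_0 = sum_i E_{B_i}(Y) 1_{B_i}, the construction in the answer."""
    ce = [integral(Y, B) / prob(B) for B in blocks]
    return {w: ce[i] for i, B in enumerate(blocks) for w in B}

Xpos = {w: max(X[w], 0) for w in omega}   # X^+
Xneg = {w: max(-X[w], 0) for w in omega}  # X^-
X0 = smooth(X)

# (X^+)_0 - (X^-)_0 = X_0, by linearity of each E_{B_i} ...
assert all(smooth(Xpos)[w] - smooth(Xneg)[w] == X0[w] for w in omega)
# ... even though (X^+)_0 is not the positive part of X_0 here:
assert smooth(Xpos)[0] != max(X0[0], 0)
# and the integrals over C = B_1 agree, as claimed:
C = blocks[0]
assert integral(X0, C) == integral(X, C)
```

This is why the decomposition has to be read as "apply the construction to $X^+$ and $X^-$ separately": the two operations (taking positive parts and averaging over blocks) do not commute pointwise, but their integrals over sets in $\mathcal{C}$ still match.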