Conditional expectation of a bounded almost sure random variable


Let $X$ be an integrable random variable, let $\mathscr{P}$ be a sub-$\sigma$-field, and let $Z$ be a random variable which is $\mathscr{P}$-measurable and almost surely bounded, i.e., there is a positive number $M$ such that $|Z| \leq M$ a.s. Show that the conditional expectation $$Y = \mathbb{E}[X|\mathscr{P}]$$ satisfies $$\mathbb{E}[YZ] = \mathbb{E}[XZ].$$

More generally, if $Z$ is $\mathscr{P}$-measurable and $$\mathbb{E}[|ZX|]< \infty,$$ then the equality still holds.
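As a sanity check, the identity can be verified exactly on a finite probability space, where conditioning on the $\sigma$-field generated by a partition is just block-wise weighted averaging. All the numbers below are made up for illustration:

```python
from fractions import Fraction

# Finite sample space with uniform probability; the sub-sigma-field is
# generated by the partition blocks below (illustrative values only).
omega = range(6)
p = [Fraction(1, 6)] * 6           # uniform probabilities
blocks = [[0, 1], [2, 3, 4], [5]]  # partition generating the sub-sigma-field

X = [3, -1, 4, 1, -5, 9]                      # arbitrary integrable X
Z = {0: 2, 1: 2, 2: -1, 3: -1, 4: -1, 5: 7}   # constant on blocks => measurable

# Y = E[X | partition]: on each block, the probability-weighted average of X
Y = {}
for B in blocks:
    pB = sum(p[w] for w in B)
    avg = sum(p[w] * X[w] for w in B) / pB
    for w in B:
        Y[w] = avg

E_XZ = sum(p[w] * X[w] * Z[w] for w in omega)
E_YZ = sum(p[w] * Y[w] * Z[w] for w in omega)
assert E_XZ == E_YZ  # exact equality, thanks to rational arithmetic
print(E_XZ, E_YZ)
```

Exact rational arithmetic (`Fraction`) avoids any floating-point ambiguity in the comparison.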


2 Answers

Answer 1

If $Z$ is an indicator, this is the definition of conditional expectation. A general bounded $Z$ can be approximated uniformly by linear combinations of indicators. Any non-negative $Z$ is a monotone increasing limit of bounded variables. Finally, a general $Z$ (with $|ZX|$ integrable) can be written as a difference of two non-negative variables (with the same property). It might help to first reduce to the case $X \ge 0$.
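The uniform-approximation step for a bounded non-negative $Z$ can be made concrete with the usual dyadic truncation $z_n = \lfloor 2^n Z\rfloor/2^n$, whose error is at most $2^{-n}$ everywhere. A small numerical sketch (the sample values of $Z$ are arbitrary):

```python
import math

# Dyadic simple-function approximation of a bounded nonnegative Z:
# z_n = floor(2^n Z) / 2^n takes finitely many values on [0, M] and
# satisfies 0 <= Z - z_n <= 2^{-n} pointwise.
def dyadic_approx(z_value, n):
    return math.floor(2**n * z_value) / 2**n

M = 5.0
samples = [0.0, 0.3, 1.7, math.pi, 4.99]  # arbitrary values of Z in [0, M]
for n in range(1, 6):
    err = max(z - dyadic_approx(z, n) for z in samples)
    assert 0 <= err <= 2**-n  # uniform error bound shrinks geometrically
```

Since the bound $2^{-n}$ does not depend on the sample point, the convergence is uniform, which is exactly what the argument above needs.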

Answer 2

We know that for an integrable r.v. $X$, the defining property of its conditional expectation $Y=\mathbb{E}[X|\mathscr{G}]$ given a sub-$\sigma$-field $\mathscr{G}$ is $$\mathbb{E}[X\cdot1_A]=\mathbb{E}[Y\cdot1_A],\ \forall A\in\mathscr{G}\quad(1)$$

We want to upgrade this to: $$\mathbb{E}[XZ]=\mathbb{E}[YZ],\ \forall\text{ nonnegative, bounded, }\mathscr{G}\text{-measurable r.v. }Z\quad(2)$$

Now $(2)\implies(1)$ is obvious (take $Z=1_A$). For the other direction, my proof is as follows:

Since $Z$ is $\mathscr{G}$-measurable, nonnegative and bounded, we can find an increasing sequence of simple functions $z_n\geq0$ such that $z_n\nearrow Z$ a.s. Each $z_n$ has the form $$z_n=\sum_{j=1}^{k_n}a_{nj}\cdot1_{A_{nj}},\quad\text{where } a_{nj}\geq0,\ A_{nj}\in\mathscr{G}.$$ By $(1)$ and linearity, $\mathbb{E}[X z_n]=\mathbb{E}[Y z_n]$. Moreover $|Xz_n|\leq M|X|$ and $|Yz_n|\leq M|Y|$, both dominating functions being integrable, so letting $n\to\infty$ and applying dominated convergence on both sides gives $\mathbb{E}[XZ]=\mathbb{E}[YZ]$.
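The second part of the question (assuming only $\mathbb{E}[|ZX|]<\infty$) follows the outline in the first answer; a sketch of the remaining steps (the reduction details here are mine, not from either answer):

```latex
% Reduce to X \ge 0 and Z \ge 0 by splitting X = X^+ - X^-, Z = Z^+ - Z^-
% and using linearity of (conditional) expectation.
% For X, Z \ge 0, each truncation Z \wedge n is bounded, nonnegative and
% \mathscr{G}-measurable, so the bounded case (2) gives
\mathbb{E}[X(Z \wedge n)] = \mathbb{E}[Y(Z \wedge n)].
% Since Z \wedge n \nearrow Z and X, Y \ge 0 a.s. here, monotone
% convergence on both sides yields
\mathbb{E}[XZ] = \mathbb{E}[YZ],
% the common value being finite because \mathbb{E}[|ZX|] < \infty.
```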