Conditional expectation of random function


Let $f:\mathbb{R}\times\Omega\to\mathbb{R}$ be a bounded measurable random function. Assume that for each $x$ the random variable $f(x,\cdot)$ is independent of a $\sigma$-algebra $\mathcal G$. Let $X$ be a $\mathcal G$-measurable random variable.

How can we prove that $$ \mathsf E (f(X,\omega)\mid\mathcal{G})=\mathsf E f(x,\omega)\big|_{x=X(\omega)}\,? $$

Does someone maybe know a specific reference to a book where this is proven?

Thanks!

Best answer

Consider the probability space $$[\Omega,\mathcal A, P]$$ and let $\mathcal G\subset \mathcal A$ be a $\sigma$-algebra.

Let $f$ be a simple function for any fixed $x$, that is, let $$f(x,\omega)=\sum_{i=1}^n f_i(x)\,\mathbb I_{A_i}(\omega),$$ where $\mathbb I_{A_i}$ is the indicator function of $A_i$ and the sets $A_i$ are independent of $\mathcal G$. Furthermore, let the functions $f_i:\mathbb{R}\to\mathbb{R}$ be Lebesgue measurable.

So, $$E[f(X,\cdot) \mid \mathcal G]=\sum_{i=1}^nE[f_i(X)\mathbb I_{A_i}\mid\mathcal G]=$$ $$=\sum_{i=1}^nf_i(X)E[\mathbb I_{A_i}\mid \mathcal G]= \sum_{i=1}^nf_i(X)E[\mathbb I_{A_i}]=\sum_{i=1}^nf_i(X)P(A_i)=$$ $$=\left[\sum_{i=1}^nf_i(x)P(A_i)\right]_{x=X}=E[f(x,\cdot)]\big|_{x=X},$$ because the $f_i(X)$ are $\mathcal G$-measurable and the $\mathbb I_{A_i}$ are independent of $\mathcal G$.
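As a quick numerical sanity check of the simple-function case (a hypothetical toy instance, not from any reference): take $\mathcal G=\sigma(X)$ with $X$ a fair coin, $U\sim U(0,1)$ independent of $X$, $A_1=\{U<0.3\}$, $A_2=\{U\ge 0.3\}$, $f_1(x)=x$, $f_2(x)=2x+1$, so $E[f(x,\cdot)]=0.3x+0.7(2x+1)$:

```python
import random

random.seed(0)

def f(x, u):
    # simple random function f(x, w) = f1(x) 1_{A1}(w) + f2(x) 1_{A2}(w)
    return x if u < 0.3 else 2 * x + 1

def g(x):
    # deterministic function g(x) = E[f(x, .)] = 0.3*x + 0.7*(2x+1)
    return 0.3 * x + 0.7 * (2 * x + 1)

n = 200_000
sums = {0: 0.0, 1: 0.0}
counts = {0: 0, 1: 0}
for _ in range(n):
    x = random.randint(0, 1)   # G-measurable X (here G = sigma(X))
    u = random.random()        # U, independent of G
    sums[x] += f(x, u)
    counts[x] += 1

for x in (0, 1):
    cond_mean = sums[x] / counts[x]      # Monte Carlo E[f(X,.) | X = x]
    print(x, round(cond_mean, 3), g(x))  # should agree: g(0)=0.7, g(1)=2.4
```

The per-group averages approximate $E[f(X,\cdot)\mid\sigma(X)]$ on each atom of $\sigma(X)$ and match $g(X)$, as the computation above predicts.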

Finally, consider a function $f:\mathbb{R}\times\Omega\to\mathbb{R}$ such that for every $x\in\mathbb R$ the section $f(x,\cdot)$ is a random variable, that is, a measurable function of $\omega$.

Theorem$^1$ Every (extended) real-valued measurable function $f$ is the pointwise limit of a sequence of simple functions.

Now, for a fixed $x$, consider the random variable (measurable function) $f(x,\cdot)$ and a suitable sequence of simple functions $f^{(n)}(x,\cdot)$.
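One standard such sequence is the dyadic staircase construction $f^{(n)}=\min\!\big(\lfloor 2^n f\rfloor/2^n,\,n\big)$ for nonnegative $f$ (split into positive and negative parts otherwise). A minimal sketch, with $f(t)=t^2$ as an arbitrary illustrative choice:

```python
import math

def simple_approx(f, n):
    # staircase approximation f_n = min(floor(2^n f)/2^n, n)
    # for nonnegative measurable f; f_n increases pointwise to f
    def fn(t):
        return min(math.floor((2 ** n) * f(t)) / (2 ** n), n)
    return fn

f = lambda t: t * t
for n in (1, 4, 8, 12):
    print(n, simple_approx(f, n)(1.3))  # increases towards f(1.3) = 1.69
```

Once $f^{(n)}(t)\le n$ the error at $t$ is below $2^{-n}$, which is the pointwise convergence the theorem asserts.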

EDIT begins

In what follows, it is not enough to assume that $f(x,\omega)$ is measurable for each fixed $x$, because we have to treat $f(X(\omega),\omega)$ as a random variable, i.e., a measurable function of $\omega$. For the purposes of this discussion, let us simply demand that $f(X(\omega),\omega)$ be measurable. Gikhman-Skorokhod$^2$ define the measurability of a real-valued $f:\mathbb{R}\times\Omega\to\mathbb{R}$ by characterizing $f$ independently of any specific $X$.

EDIT ends

We have proved that the claim of the OP holds for simple functions. Now consider a general $f$ of the kind described above. For any $x$,

$$\lim_{n\to\infty} f^{(n)}(x,\omega)=f(x,\omega).$$

Now,

$$E[f(X,\cdot)\mid \mathcal G]=E[\lim_{n\to \infty}f^{(n)}(X,\cdot)\mid \mathcal G]=\lim_{n\to \infty}E[f^{(n)}(X,\cdot)\mid\mathcal G]=$$ $$=\lim_{n\to \infty}\left[E[f^{(n)}(x,\cdot)]\right]_{x=X}=E[\lim_{n\to \infty}f^{(n)}(x,\cdot)]_{x=X}=E[f(x,\cdot)]_{x=X}.$$

Regarding the interchange of the expectation and the limit, see the dominated convergence theorem (also in its conditional form) and recall that $f$ is bounded.
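The same identity can be sanity-checked numerically for a non-simple bounded $f$ (a hypothetical example: $f(x,\omega)=\sin(x+Z(\omega))$ with $Z\sim N(0,1)$ independent of $\mathcal G=\sigma(X)$, for which $E[f(x,\cdot)]=e^{-1/2}\sin x$):

```python
import math
import random

random.seed(1)

def f(x, z):
    # bounded random function f(x, w) = sin(x + Z(w))
    return math.sin(x + z)

def g(x):
    # E[sin(x + Z)] = exp(-1/2) sin(x) for Z ~ N(0,1)
    return math.exp(-0.5) * math.sin(x)

xs = [0.5, 1.5]            # the two values taken by the G-measurable X
n = 100_000
sums = {x: 0.0 for x in xs}
counts = {x: 0 for x in xs}
for _ in range(n):
    x = random.choice(xs)         # X, measurable w.r.t. G = sigma(X)
    z = random.gauss(0.0, 1.0)    # Z, independent of G
    sums[x] += f(x, z)
    counts[x] += 1

for x in xs:
    print(x, round(sums[x] / counts[x], 3), round(g(x), 3))  # should match
```

The conditional averages on the atoms $\{X=x\}$ agree with $g(X)=E[f(x,\cdot)]\big|_{x=X}$, as the limiting argument above guarantees.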

References

Let $Y$ be a $\mathcal G$-measurable random variable and let $Z$ be an arbitrary one; then

$$E[YZ \mid \mathcal G]=YE[Z \mid \mathcal G].$$

(Assuming that the respective expectations exist.)

See: Gikhman-Skorokhod, The Theory of Stochastic Processes, Ch. 3.6. Theorem 2.
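This pull-out property can also be checked on a hypothetical toy instance (not from the book): take $\mathcal G=\sigma(X)$ with $X\in\{1,2\}$, $Y=X$, and $Z=X+W$ with $W\sim U(0,1)$ independent of $X$, so $E[Z\mid\mathcal G]=X+\tfrac12$:

```python
import random

random.seed(2)

n = 200_000
sums = {1: 0.0, 2: 0.0}
counts = {1: 0, 2: 0}
for _ in range(n):
    x = random.choice((1, 2))    # X; here G = sigma(X) and Y = X
    z = x + random.random()      # Z = X + W, with E[Z | G] = X + 0.5
    sums[x] += x * z             # accumulate Y * Z
    counts[x] += 1

for x in (1, 2):
    lhs = sums[x] / counts[x]    # Monte Carlo E[YZ | X = x]
    rhs = x * (x + 0.5)          # Y * E[Z | G] evaluated on {X = x}
    print(x, round(lhs, 2), rhs)
```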

Let $Y$ be independent of the $\sigma$-algebra $\mathcal G$; then

$$E[Y\mid \mathcal G]=E[Y].$$

(Assuming that the respective expectations exist.)

See: Gikhman-Skorokhod, op. cit., Ch. 3.6, Th. 4.

Ad $^1$: Halmos, Measure Theory, § 20, Th. B.

Ad $^2$: Gikhman-Skorokhod, op. cit., 4.2, 4.3.