Let $X$ and $Y$ be two independent (real-valued) random variables, both defined on the probability space $(\Omega, \mathcal{A}, P)$.
(a) Assume $E|X| < \infty$ and $E|Y| < \infty$. Let $g(\cdot): \mathbb{R} \to \mathbb{R}$ (where $\mathbb{R}$ is the set of real numbers) be given by $g(x) = x + E[Y]$. Show that $E[X + Y \mid X] = g(X)$.
Solution: $E[X + Y \mid X] = E[X \mid X] + E[Y \mid X] = X + E[Y] = g(X)$, where $E[Y \mid X] = E[Y]$ because $X$ and $Y$ are independent.
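The step $E[Y \mid X] = E[Y]$ deserves a word of justification. A sketch of the verification against the definition of conditional expectation (this is my own spelling-out, not part of the original exercise):

```latex
For any $A \in \sigma(X)$, write $A = X^{-1}(B)$ with $B \in \mathcal{B}$. Then
\[
  \int_A Y \, dP
  \;=\; E\bigl[\mathbf{1}_B(X)\, Y\bigr]
  \;=\; E\bigl[\mathbf{1}_B(X)\bigr]\, E[Y]
  \;=\; \int_A E[Y] \, dP ,
\]
where the middle equality uses the independence of $X$ and $Y$ (both
integrable). Since the constant $E[Y]$ is trivially $\sigma(X)$-measurable,
both defining properties hold and $E[Y \mid X] = E[Y]$ a.s.
```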
(b) The above situation in general. Let $\phi$ be a function such that $E|\phi(X,Y)| < \infty$, and let $g(x) = E[\phi(x,Y)]$. Use the definition of conditional expectation to show that $E[\phi(X,Y) \mid X] = g(X)$,
i.e. show that
(i) $g(X)$ is measurable with respect to the sub-$\sigma$-field $\sigma(X) = \{X^{-1}(B) : B \in \mathcal{B}\}$, where $\mathcal{B}$ is the Borel $\sigma$-field of $\mathbb{R}$;
Solution: $g(X)$ is measurable with respect to $\sigma(g(X))$, and $\sigma(g(X)) \subseteq \sigma(X)$; therefore $g(X)$ is measurable with respect to $\sigma(X)$.
(ii) For any $A \in \sigma(X)$ we have $\int\limits_A \phi(X,Y)\,dP = \int\limits_A g(X)\,dP$.
Question: 1) Are part (a) and part (b)(i) true or false?
2) How can I show part (b)(ii)?
Thank you.
Yes, you are right about (a) and (b)(i) (however, for the second part, somewhat more detailed reasoning as to why the relation "$\sigma(g(X)) \subseteq \sigma(X)$" holds would be nice).
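One way to spell out that missing measurability argument (a sketch; it presupposes that $g$ is Borel measurable, which follows from Fubini–Tonelli applied to the distributions of $X$ and $Y$):

```latex
% Claim: g(X) is sigma(X)-measurable.
Since $E|\phi(X,Y)| < \infty$ and $X$, $Y$ are independent, Fubini's theorem
applied to the product measure $P_X \otimes P_Y$ shows that
$x \mapsto g(x) = E[\phi(x,Y)]$ is Borel measurable. Then, for every Borel
set $B \in \mathcal{B}$,
\[
  (g \circ X)^{-1}(B) \;=\; X^{-1}\!\bigl(g^{-1}(B)\bigr) \;\in\; \sigma(X),
\]
because $g^{-1}(B) \in \mathcal{B}$ by the Borel measurability of $g$. Hence
$g(X)$ is $\sigma(X)$-measurable; in particular
$\sigma(g(X)) \subseteq \sigma(X)$.
```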
Concerning (b)(ii): there are at least two possibilities to prove this.
Hints (Solution I):
Hints (Solution II):
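For reference, here is a sketch of one standard route via independence and Fubini's theorem (my own version; not necessarily the exact argument intended by either hint):

```latex
Let $A \in \sigma(X)$, so $A = X^{-1}(B)$ for some $B \in \mathcal{B}$.
By independence, the joint distribution of $(X,Y)$ is the product measure
$P_X \otimes P_Y$, hence
\[
  \int_A \phi(X,Y)\,dP
  \;=\; \int_{B \times \mathbb{R}} \phi(x,y)\, d(P_X \otimes P_Y)(x,y)
  \;=\; \int_B \Bigl( \int_{\mathbb{R}} \phi(x,y)\, dP_Y(y) \Bigr) dP_X(x),
\]
where the use of Fubini's theorem is justified by $E|\phi(X,Y)| < \infty$.
The inner integral equals $g(x)$, so
\[
  \int_A \phi(X,Y)\,dP \;=\; \int_B g(x)\, dP_X(x) \;=\; \int_A g(X)\,dP .
\]
Together with the measurability from (b)(i), this verifies both defining
properties of conditional expectation, so $E[\phi(X,Y) \mid X] = g(X)$ a.s.
```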