I want to prove very simple equations, but am having some trouble because I have no clue.
The definition of conditional expectation is given as follows:
Definition: $E(Y|X)$ is a conditional expectation of $Y$ given $X$ if it is a $\sigma (X)$-measurable random variable and for any Borel set $S \subseteq \mathbb{R}$ we have $E(E(Y|X)1_{X \in S})=E(Y1_{X \in S})$.
I want to solve the following two problems based on the above definition.
- Suppose $X$ and $Y$ are independent.
(a) Prove that $E(Y|X)=E(Y)$ with probability 1.
(b) Prove that $Var(Y|X)=Var(Y)$ with probability 1.
(c) Explicitly verify the following theorem (with $\mathcal{G}=\sigma (X)$ in this case).
Theorem: Let $Y$ be a random variable, and $\mathcal{G}$ a sub-$\sigma$-algebra. If $Var(Y)<\infty$, then $Var(Y)=E(Var(Y|\mathcal{G}))+Var(E(Y|\mathcal{G}))$
- Let X and Y be jointly defined random variables.
(a) Suppose $E(Y|X)=E(Y)$ with probability 1. Prove that $E(XY)=E(X)E(Y)$
(b) Give an example where $E(XY)=E(X)E(Y)$, but it is NOT the case that $E(Y|X)=E(Y)$ with probability 1.
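Before attempting the proofs, a quick numerical sanity check may help build intuition. The following sketch (using a hypothetical finite joint pmf, not part of the question) verifies the total-variance identity from 1(c) exactly:

```python
# Exact check of Var(Y) = E(Var(Y|X)) + Var(E(Y|X)) on a small,
# hypothetical finite joint pmf; the identity holds for any choice of pmf.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
       (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.2}

px = {0: 0.0, 1: 0.0}
for (x, _), p in pmf.items():
    px[x] += p

EY = sum(y * p for (_, y), p in pmf.items())
VarY = sum(y**2 * p for (_, y), p in pmf.items()) - EY**2

def cond_moment(x0, k=1):
    """k-th conditional moment E[Y^k | X = x0]."""
    return sum(y**k * p for (x, y), p in pmf.items() if x == x0) / px[x0]

# E[Var(Y|X)] and Var(E[Y|X]), computed from the conditional moments
E_VarYX = sum(px[x] * (cond_moment(x, 2) - cond_moment(x) ** 2) for x in px)
Var_EYX = sum(px[x] * cond_moment(x) ** 2 for x in px) - EY**2

print(abs(VarY - (E_VarYX + Var_EYX)) < 1e-12)  # law of total variance
```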
1.
For this part, I think I could deal with (b) and (c) by myself once I understand how to prove (a), so please help me with just (a). In order to prove that $E(Y|X)=E(Y)$ with probability 1, I need to show that
\begin{equation} E(E(Y|X)1_{X \in S})=E(E(Y)1_{X \in S}) \end{equation}
for every Borel set $S$, if $X$ and $Y$ are independent.
From the above definition, conditional expectation $E(Y|X)$ is defined such that $E(E(Y|X)1_{X \in S})=E(Y1_{X \in S})$.
So maybe it suffices to show that $E(Y1_{X \in S})=E(E(Y)1_{X \in S})$. But that looks too obvious. My guess is that there is a more principled way of showing why $E(Y1_{X \in S})=E(E(Y)1_{X \in S})$ holds when $X$ and $Y$ are independent.
2.
I can do (b) on my own. For (a): "$E(Y|X)=E(Y)$ with probability 1" means that $E(E(Y|X)1_{X \in S})=E(E(Y)1_{X \in S})$ for every Borel set $S$. But how can I prove $E(XY)=E(X)E(Y)$ with that information? I know how to prove this equation with an integral sign or a summation sign, but how does an almost-sure equality have any implication for that equation?
$\newcommand{\PM}{\mathbb{P}}\newcommand{\E}{\mathbb{E}}$I understand from your question that you only need help with 1(a) and 2(a). Furthermore, you did not specify what $R$ is, etc., but I think you mean what I think you mean. Also, the integrability of $Y$ is implicitly assumed by talking about the conditional expectation of $Y$ given something. Let $(\Omega,\mathcal F,\PM)$ be the underlying probability space, to get started.
1.a) Let $A\in \sigma(X)$. Then $\mathbf{1}_A$ and $Y$ are independent, because $\sigma(X)$ and $\sigma(Y)$ are independent by assumption. So: \begin{align} \int_A Y\,d\PM = \int_\Omega \mathbf{1}_AY\,d\PM= \E[\mathbf{1}_AY]=\E[\mathbf{1}_A]\E[Y]=\E[Y]\int_A\,d\PM=\int_A\E[Y]\,d\PM \end{align} So we have, for all $A\in\sigma(X)$: \begin{align} \int_A\E[Y]\,d\PM =\int_A Y\,d\PM=\int_A\E[Y|X]\,d\PM \end{align} We know that $\E[Y]$ is just a number, hence it is certainly $\sigma(X)$-measurable. Since the $\sigma(X)$-measurable random variable $\E[Y|X]$ is unique up to a null set, we get $\E[Y]=\E[Y|X]$ a.s.
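The key step above, $\E[\mathbf{1}_A Y]=\E[\mathbf{1}_A]\E[Y]$ with $A=\{X\in S\}$, can also be sanity-checked numerically. A minimal Monte Carlo sketch, assuming a hypothetical independent pair (uniform $X$, exponential $Y$) and the set $S=(0,0.3)$ — none of which come from the question:

```python
import random

random.seed(0)
N = 200_000

# Hypothetical independent pair: X ~ Uniform(0,1), Y ~ Exponential(1).
samples = [(random.random(), random.expovariate(1.0)) for _ in range(N)]

in_S = lambda x: x < 0.3  # indicator of the Borel set S = (0, 0.3)

# Compare E[Y 1_{X in S}] with E[Y] * P(X in S)
lhs = sum(y for x, y in samples if in_S(x)) / N
rhs = (sum(y for _, y in samples) / N) * (sum(in_S(x) for x, _ in samples) / N)

print(abs(lhs - rhs) < 0.02)  # equal up to Monte Carlo error
```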
2.a) For this question it makes sense to assume the integrability of $X$ and $XY$; otherwise, why would someone be interested in $\E[X]$...? Now I hope you know the pull-out property, which says: \begin{align} \E[XY|X]=X\E[Y|X] \end{align} On the one hand we have: \begin{align} \E[\E[XY|X]]=\E[XY] \end{align} And on the other hand: \begin{align} \E[X\E[Y|X]]=\int_\Omega X\E[Y|X]\,d\PM=\int_\Omega X \E[Y]\,d\PM=\E[X]\E[Y] \end{align} So: \begin{align} \E[XY]=\E[X]\E[Y] \end{align} I hope you see where we have used $\E[Y|X]=\E[Y]$ a.s.
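As a sanity check on 2(a), here is a small hypothetical discrete example (not taken from the question) where $\E[Y|X]=\E[Y]$ a.s. holds without full independence, and $\E[XY]=\E[X]\E[Y]$ comes out exactly as the argument predicts:

```python
from fractions import Fraction as F

# Hypothetical example: X in {0,1} fair; given X=0, Y is +-1 equally likely;
# given X=1, Y is +-2 equally likely.  Then E[Y|X] = 0 = E[Y] almost surely,
# even though X and Y are not independent (the conditional variances differ).
pmf = {(0, -1): F(1, 4), (0, 1): F(1, 4),
       (1, -2): F(1, 4), (1, 2): F(1, 4)}

E = lambda f: sum(f(x, y) * p for (x, y), p in pmf.items())

EX, EY, EXY = E(lambda x, y: x), E(lambda x, y: y), E(lambda x, y: x * y)

# E[Y|X=x0] for each value x0 of X: both conditional means are 0 = E[Y]
for x0 in (0, 1):
    px = sum(p for (x, _), p in pmf.items() if x == x0)
    print(x0, sum(y * p for (x, y), p in pmf.items() if x == x0) / px)

print(EXY == EX * EY)  # True: E(XY) = E(X)E(Y)
```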