X and Y identically distributed: does this imply $\mathcal{G}$-measurability of X (especially for $Y=\mathbb{E}(X\mid\mathcal{G})$)?

Let $X$ and $Y$ be random variables on $(\Omega, \mathcal{F}, P)$. If $Y$ is measurable with respect to a sub-$\sigma$-algebra $\mathcal{G}\subseteq\mathcal{F}$ and $X$ and $Y$ are identically distributed, is there a way to show that $X$ is $\mathcal{G}$-measurable, too?

Edit: What I actually need to show is the following: for $X\in\mathcal{L}^2$,
$$\mathbb{E}(X\mid\mathcal{G})\stackrel{d}{=}X \;\Rightarrow\; \mathbb{E}(X\mid\mathcal{G})=X \text{ a.s.}$$
For $Y=\mathbb{E}(X\mid\mathcal{G})$ I have already shown that $\mathbb{E}(Y1_{G})=\mathbb{E}(X1_{G})$ for all $G\in\mathcal{G}$, and I supposed that I could show the second defining property of conditional expectation ($\mathcal{G}$-measurability of $X$) directly by using the identical distribution.
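The partial-averaging identity $\mathbb{E}(Y1_{G})=\mathbb{E}(X1_{G})$ is easy to sanity-check on a finite probability space, where $\mathbb{E}(X\mid\mathcal{G})$ is just a cell-wise average. A minimal sketch; the six-point space, the partition generating $\mathcal{G}$, and the values of $X$ are all invented for illustration:

```python
# Toy finite probability space (invented for illustration):
# Omega = {0,...,5} with uniform P; G is generated by the partition
# {0,1,2}, {3,4,5}; X takes arbitrary values.
omega = list(range(6))
p = {w: 1 / 6 for w in omega}
partition = [{0, 1, 2}, {3, 4, 5}]
X = {0: 1.0, 1: 4.0, 2: 1.0, 3: 2.0, 4: 5.0, 5: 2.0}

# Y = E(X | G) is constant on each cell of the partition: the P-weighted
# average of X over that cell.
Y = {}
for cell in partition:
    avg = sum(X[w] * p[w] for w in cell) / sum(p[w] for w in cell)
    for w in cell:
        Y[w] = avg

def expect_on(Z, G):
    """E(Z 1_G) for an event G: sum of Z * p over G."""
    return sum(Z[w] * p[w] for w in G)

# Partial averaging: E(Y 1_G) = E(X 1_G) for every G in sigma(partition).
for G in [set(), set(omega), partition[0], partition[1]]:
    assert abs(expect_on(Y, G) - expect_on(X, G)) < 1e-12
print("partial averaging holds on this space")
```

Note that on this toy space nothing forces $X$ and $Y$ to be identically distributed; the sketch only illustrates the property the OP has already established.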


BEST ANSWER

Suppose $X$ and $E(X\mid\mathcal G)$ have the same distribution. We prove that $X=E(X\mid\mathcal G)$ almost surely. [This is probably what the OP wanted to ask, as pointed out by user Did.]

Reduction to non-negative $X$. Let $Y=X^{+}$. We claim that $Y$ and $E(Y\mid\mathcal G)$ have the same distribution. Indeed, $Y$ has the same distribution as $(E(X\mid\mathcal G))^{+}$, and $(E(X\mid\mathcal G))^{+}\leq E(Y\mid\mathcal G)$ with both sides having the same expectation; hence $(E(X\mid\mathcal G))^{+}=E(Y\mid\mathcal G)$ a.s., which implies that $Y$ and $E(Y\mid\mathcal G)$ have the same distribution. Since $-X$ and $E(-X\mid\mathcal G)$ have the same distribution, repeating the argument just given with $X$ replaced by $-X$ shows that $(-X)^{+}=X^{-}$ and $E(X^{-}\mid\mathcal G)$ have the same distribution. So if we prove the result for non-negative random variables, we can conclude that $X^{+}=E(X^{+}\mid\mathcal G)$ a.s. and $X^{-}=E(X^{-}\mid\mathcal G)$ a.s., and hence $X=E(X\mid\mathcal G)$ a.s. From now on we assume $X\geq 0$.

Reduction to bounded $X$. Let $N$ be a positive integer and $X_{N}=\min\{X,N\}$. We claim that $X_{N}$ and $E(X_{N}\mid\mathcal G)$ have the same distribution. Note that $E(X_{N}\mid\mathcal G)\leq\min\{E(X\mid\mathcal G),N\}$, and both sides have the same expectation: by hypothesis $\min\{E(X\mid\mathcal G),N\}$ has the same distribution as $\min\{X,N\}=X_{N}$, so its expectation is $EX_{N}$. Hence equality holds a.s., and $X_{N}$ and $E(X_{N}\mid\mathcal G)$ have the same distribution. So if we prove the result for non-negative bounded random variables, we can conclude that $E(X_{N}\mid\mathcal G)=X_{N}$ a.s. This holds for every $N$, and letting $N\to\infty$ (monotone convergence on both sides) gives $E(X\mid\mathcal G)=X$ a.s.

The proof has thus been reduced to the case when $X$ is non-negative and bounded. Assume this now. By conditional Jensen, $(E(X\mid\mathcal G))^{2}\leq E(X^{2}\mid\mathcal G)$, and both sides have the same expectation: $E[(E(X\mid\mathcal G))^{2}]=E[X^{2}]$ by the equality of distributions, and $E[E(X^{2}\mid\mathcal G)]=E[X^{2}]$ by the tower property. Hence $(E(X\mid\mathcal G))^{2}=E(X^{2}\mid\mathcal G)$ a.s. Write $Z=E(X\mid\mathcal G)$; then $E[XZ]=E[E(XZ\mid\mathcal G)]=E[Z^{2}]$ by the tower property, and $E[Z^{2}]=E[X^{2}]$ since $X$ and $Z$ have the same distribution, so
$$E(X-Z)^{2}=E[X^{2}]+E[Z^{2}]-2E[XZ]=E[X^{2}]-E[Z^{2}]=0,$$
and therefore $X=E(X\mid\mathcal G)$ a.s.
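The two identities used in the final step, the tower-property computation $E[XZ]=E[Z^{2}]$ and the expansion $E(X-Z)^{2}=E[X^{2}]-E[Z^{2}]$ with $Z=E(X\mid\mathcal G)$, can be checked numerically. A minimal sketch on an invented discrete space (here $X$ and $Z$ are not identically distributed, so the check verifies the identities, not the theorem's conclusion):

```python
# Toy discrete space (invented for illustration): with Z = E(X|G),
# conditional Jensen gives Z^2 <= E(X^2|G) pointwise, the tower property
# gives E[XZ] = E[Z^2], and hence E(X - Z)^2 = E[X^2] - E[Z^2].
omega = list(range(6))
p = 1 / 6                                # uniform measure
partition = [{0, 1, 2}, {3, 4, 5}]       # generates the toy sigma-algebra G
X = {0: 1.0, 1: 4.0, 2: 1.0, 3: 2.0, 4: 5.0, 5: 2.0}

Z = {}       # Z = E(X|G): cell-wise average of X
EX2_G = {}   # E(X^2|G): cell-wise average of X^2
for cell in partition:
    z = sum(X[w] for w in cell) / len(cell)
    m2 = sum(X[w] ** 2 for w in cell) / len(cell)
    for w in cell:
        Z[w], EX2_G[w] = z, m2

# Conditional Jensen: (E(X|G))^2 <= E(X^2|G) everywhere.
assert all(Z[w] ** 2 <= EX2_G[w] + 1e-12 for w in omega)

E = lambda f: sum(f(w) * p for w in omega)  # expectation on the toy space
EXZ = E(lambda w: X[w] * Z[w])
EZ2 = E(lambda w: Z[w] ** 2)
EX2 = E(lambda w: X[w] ** 2)
Ediff2 = E(lambda w: (X[w] - Z[w]) ** 2)

assert abs(EXZ - EZ2) < 1e-12             # tower property
assert abs(Ediff2 - (EX2 - EZ2)) < 1e-12  # the expansion in the proof
```

When, in addition, $X$ and $Z$ have the same distribution, $E[X^{2}]=E[Z^{2}]$ makes the last quantity zero, which is exactly the argument above.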

ANSWER

No, this is not true. For a simple example, suppose that $X$ and $Y$ only take the values $0$ and $1$, with $A=\{\omega:X(\omega)=1\}$ and $B=\{\omega:Y(\omega)=1\}$. Then $X$ and $Y$ are identically distributed as long as $P(A)=P(B)$. But $Y$ is measurable with respect to the $\sigma$-algebra $\mathcal{G}=\{\Omega,\emptyset,B,\Omega\setminus B\}$, and $X$ will not be unless $A$ happens to be in $\mathcal{G}$.
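This counterexample can be made completely concrete. A minimal sketch; the four-point space and the particular events $A$ and $B$ are chosen only for illustration:

```python
# Concrete instance of the counterexample (invented for illustration):
# Omega = {0, 1, 2, 3} with uniform P, X = 1_A with A = {0, 1},
# Y = 1_B with B = {1, 2}. Then P(A) = P(B) = 1/2, so X and Y are
# identically distributed, but A is not in G = sigma(B), so X is not
# G-measurable.
omega = [0, 1, 2, 3]
A = {0, 1}
B = {1, 2}
Bc = {0, 3}  # complement of B

X = {w: 1 if w in A else 0 for w in omega}
Y = {w: 1 if w in B else 0 for w in omega}

# Identically distributed: P(X = 1) = P(Y = 1) under the uniform measure.
assert sum(X.values()) / 4 == sum(Y.values()) / 4 == 0.5

# G = sigma(B), listed as its four member sets.
G = [set(), set(omega), B, Bc]

# An indicator is G-measurable iff its level set {. = 1} lies in G.
preimage_X1 = {w for w in omega if X[w] == 1}   # = A
preimage_Y1 = {w for w in omega if Y[w] == 1}   # = B
print(preimage_Y1 in G)  # True: Y is G-measurable
print(preimage_X1 in G)  # False: X is not
```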