The order of conditional expectation


I've started this problem by defining two sigma-algebras, $F=\{\emptyset, \{a\}, \{a\}^c, \Omega\}$ and $G=\{\emptyset, \{b\}, \{b\}^c, \Omega\}$. The question I have is: if I condition, $E[1_a(w)\mid G]$, is $1_a(w)$ then independent of $G$? ($1$ denotes the indicator function.)

The problem:

(The problem statement was posted as an image.)

Does anyone have a solution to this problem?



Best answer:

Re the problem:

Everything works smoothly, really... Let us check that, on $\Omega=\{a,b,c\}$, taking $\mathcal F=\sigma(\{b\})$, $\mathcal G=\sigma(\{c\})$, and $X=\mathbf 1_{\{a\}}$ gives a "concrete example", most of the time.

To simplify the proof, introduce the (symmetrized) notations $$x=P(\{a\}),\qquad y=P(\{b\}),\qquad z=P(\{c\}),\qquad Y=\mathbf 1_{\{b\}},\qquad Z=\mathbf 1_{\{c\}},$$ and assume that $xyz\ne0$. Then $x+y+z=1$, $X+Y+Z=1$, and $$E(X\mid \mathcal F)=\frac{x}{1-y}(1-Y),\qquad E(Y\mid \mathcal G)=\frac{y}{1-z}(1-Z),$$ hence $$E(E(X\mid \mathcal F)\mid \mathcal G)=\frac{x}{1-y}\left(1-\frac{y}{1-z}(1-Z)\right)=\frac{x\,(x+yZ)}{(1-y)(1-z)}.$$ The RHS is symmetric with respect to the symmetry $(Y,y)\leftrightarrow(Z,z)$ if and only if $yZ=zY$, which never happens unless $yz=0$.
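The computation above can be checked numerically. Here is a sketch (not part of the original answer; the particular values of $x,y,z$ are arbitrary nonzero choices of mine) that computes both iterated conditional expectations on $\Omega=\{a,b,c\}$ with exact rational arithmetic:

```python
# Hypothetical sketch: verify on Omega = {a, b, c} that
# E(E(X|F)|G) and E(E(X|G)|F) differ, where F = sigma({b}),
# G = sigma({c}) and X = 1_{a}. The probabilities x, y, z
# below are arbitrary nonzero values summing to 1.
from fractions import Fraction

p = {'a': Fraction(1, 2), 'b': Fraction(1, 3), 'c': Fraction(1, 6)}
X = {'a': Fraction(1), 'b': Fraction(0), 'c': Fraction(0)}

def cond_exp(f, atom):
    """E[f | sigma({atom})]: f is kept on the atom {atom} and
    replaced by its conditional average on the complement."""
    comp = [w for w in p if w != atom]
    mean = sum(p[w] * f[w] for w in comp) / sum(p[w] for w in comp)
    return {w: (f[w] if w == atom else mean) for w in p}

EF_then_G = cond_exp(cond_exp(X, 'b'), 'c')  # E(E(X|F)|G)
EG_then_F = cond_exp(cond_exp(X, 'c'), 'b')  # E(E(X|G)|F)

# E(E(X|F)|G) equals 9/20 on {a, b} and 3/4 on {c};
# E(E(X|G)|F) equals 9/20 on {a, c} and 3/5 on {b}.
print(EF_then_G)
print(EG_then_F)
print(EF_then_G == EG_then_F)  # the two orders do not commute here
```

Both iterated conditional expectations still have expectation $E(X)=x$ by the tower property; they simply differ as random variables, in line with the symmetry argument above.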

Re the question:

The question I have is: if I condition, $E[1_a(w)\mid G]$, is $1_a(w)$ then independent of $G$?

Well... This is difficult to parse, to say the least.

First, if $w$ is in $\Omega$, then $1_a(w)$ is a number, not a random variable, hence $E[1_a(w)\mid G]=1_a(w)$, trivially, and $1_a(w)$ is independent of $G$, trivially, hence the answer would be obvious. But what you have in mind is probably rather the question:

If one considers $E[1_a\mid G]$, is $1_a$ independent of $G$ then?

The trouble now is that the preliminary part "if one considers $E[1_a\mid G]$" is unrelated to the question part "is $1_a$ independent of $G$?". The random variable $1_a$ is independent of $G$ or not independent of $G$ whether one "considers" $E[1_a\mid G]$ or not. Thus, we seem to be left with the question:

Is $1_a$ independent of $G$?

...Whose best answer is that this depends on the hypotheses one makes. No question here. (By contrast, the problem copied from the book, to which we answered above, makes perfect sense.)

Another answer:

If $X$ is independent of $G$, then $\mathbb{E}[X\mid G]=\mathbb{E}[X]$. In other words, having information on $G$ gives no information on $X$. You can check this against the definition of conditional expectation. Clearly the constant $\mathbb{E}[X]$ is $G$-measurable. Furthermore, by independence, for any $A\in G$ the random variables $X$ and $1_A$ are independent, so $\mathbb{E}[1_A X]=\mathbb{E}[1_A]\,\mathbb{E}[X]$. Hence $$\int_A X\,dP=\int_\Omega 1_A X\,dP=\int_\Omega 1_A\,dP \int_\Omega X\,dP= P(A)\,\mathbb{E}[X].$$ On the other hand, $$\int_A \mathbb{E}[X]\,dP=\int_\Omega 1_A\, \mathbb{E}[X]\,dP=P(A)\,\mathbb{E}[X],$$ so the constant $\mathbb{E}[X]$ satisfies the defining property of the conditional expectation $\mathbb{E}[X\mid G]$.
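This identity can also be replayed on a small discrete example. The sketch below (my own illustration; the product space and the probabilities are arbitrary choices, not from the answer) builds $X$ independent of $G$ by making $X$ depend only on the first coordinate of a product space while $G$ is generated by the second, then checks $\mathbb{E}[X\mid G]=\mathbb{E}[X]$:

```python
# Hypothetical sketch: on the product space Omega = {0,1} x {0,1}
# with a product measure, let X depend only on the first coordinate
# and G = sigma(second coordinate). Then X is independent of G and
# E[X|G] is the constant E[X].
from fractions import Fraction
from itertools import product

p1 = {0: Fraction(2, 5), 1: Fraction(3, 5)}   # law of coordinate 1
p2 = {0: Fraction(1, 4), 1: Fraction(3, 4)}   # law of coordinate 2
P = {(u, v): p1[u] * p2[v] for u, v in product(p1, p2)}

X = {w: Fraction(w[0]) for w in P}   # X((u, v)) = u
EX = sum(P[w] * X[w] for w in P)     # E[X] = 3/5

def cond_on_second(w):
    """E[X | G] evaluated at w: average X over the atom
    {second coordinate = w[1]} of G, weighted by P."""
    atom = [u for u in P if u[1] == w[1]]
    return sum(P[u] * X[u] for u in atom) / sum(P[u] for u in atom)

EXG = {w: cond_on_second(w) for w in P}
print(EX)
print(EXG)  # constant, equal to E[X] on every atom of G
```

The same computation with $X$ depending on both coordinates would generally give a non-constant $\mathbb{E}[X\mid G]$, which is exactly what independence rules out.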