Random variables related by conditional expectations


Let $X$ and $Y$ be random variables such that $E(X|Y)=\frac Y 2$ and $E(Y|X)=\frac X 2$. Does it follow that $X$ and $Y$ are $0$? If not, is there a simple example of such random variables?

Motivation: if $E(X|Y)=Y$ and $E(Y|X)=X$, then necessarily $X=Y$. This is easy to prove: if $X>0$ and $Y>0$, the tower property gives $E(\frac X Y)=E(\frac{E(X|Y)}{Y})=E(\frac Y Y)=1$ and likewise $E(\frac Y X)=1$, so $E(\frac X Y +\frac Y X)=E(\frac X Y) +E(\frac Y X)=1+1=2$. Since $x+\frac 1 x \geq 2$ with equality if and only if $x=1$, this forces $X=Y$. For the general case we can use $X^{+}+1$ and $Y^{+}+1$ in place of $X$ and $Y$ to get $X^{+}=Y^{+}$, and a similar argument works for $X^{-}$ and $Y^{-}$.
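The identity $E(\frac X Y)=1$ used above can be checked numerically. A minimal sketch (the specific toy distributions are my choice): take $Y$ uniform on $\{1,2\}$ and $X = Y + $ mean-zero noise, so that $E(X|Y)=Y$ and $X>0$. Note that only one of the two conditions holds here, and indeed $X\neq Y$; the point is just to verify the tower-property step.

```python
import random

random.seed(2)

# Toy example (my choice): Y uniform on {1, 2}, X = Y + mean-zero noise,
# so E[X | Y] = Y and X > 0.  The tower property then gives E[X/Y] = 1.
N = 200_000
total = 0.0
for _ in range(N):
    Y = random.choice([1.0, 2.0])
    X = Y + random.uniform(-0.5, 0.5)
    total += X / Y

print(round(total / N, 3))  # estimate of E[X/Y]; should be close to 1
```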


There are 2 answers below.

Best answer:

Let $A, B, C$ be independent with \begin{align*} &P[B=0]=P[B=1]=1/2\\ &P[C=0]=P[C=1]=1/2\\ &P[A=-1]=P[A=1]=1/2 \end{align*} Define: \begin{align} X &= AB\\ Y &= AC \end{align}

Then \begin{align} &E[X|Y=1] = E[AB|AC=1]=E[AB|A=1, C=1]=1/2\\ &E[X|Y=0] = E[AB|C=0] = E[A]E[B]=0 \\ &E[X|Y=-1] = E[AB|AC=-1] = E[AB|A=-1, C=1] =-1/2 \end{align} So $E[X|Y]=Y/2$. By symmetry we also get $E[Y|X]=X/2$.
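The three conditional expectations above can be verified by a quick Monte Carlo simulation of the construction $X=AB$, $Y=AC$ (variable names follow the answer; sample size is my choice):

```python
import random

random.seed(0)

# Simulate A in {-1, 1}, B in {0, 1}, C in {0, 1}, all independent,
# then group X = A*B by the observed value of Y = A*C.
N = 300_000
sums = {-1: 0.0, 0: 0.0, 1: 0.0}
counts = {-1: 0, 0: 0, 1: 0}
for _ in range(N):
    A = random.choice([-1, 1])
    B = random.choice([0, 1])
    C = random.choice([0, 1])
    X, Y = A * B, A * C
    sums[Y] += X
    counts[Y] += 1

for y in (-1, 0, 1):
    # empirical estimate of E[X | Y = y]; should be close to y/2
    print(y, round(sums[y] / counts[y], 2))
```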

Second answer:

Here is a generalized result related to the motivating example: Suppose random variables $X$ and $Y$ satisfy $E[|X|]<\infty$, $E[|Y|]<\infty$, and $$ E[X|Y]\leq Y \quad \text{and} \quad E[Y|X]\leq X. $$ Then $X=Y$ with probability 1.

Proof: Fix an integer $M>0$ (to be taken large). Define truncated random variables: \begin{align} A_M = \left\{ \begin{array}{ll} X &\mbox{ if $X \geq -M$} \\ -M & \mbox{ otherwise} \end{array} \right.\\ B_M = \left\{ \begin{array}{ll} Y &\mbox{ if $Y \geq -M$} \\ -M & \mbox{ otherwise} \end{array} \right. \end{align} Then $$ \lim_{M\rightarrow\infty} P[X\neq A_M] = \lim_{M\rightarrow\infty} P[Y\neq B_M] = 0.$$ Because of this, it can be shown that for any random variable $Z$ that satisfies $E[|Z|]<\infty$ we have $$ \lim_{M\rightarrow\infty} E[Z1_{\{X \neq A_M\}}] = \lim_{M\rightarrow\infty} E[Z1_{\{Y\neq B_M\}}] = 0. \quad (**) $$

Define $c = M+1$, so that $A_M+c\geq 1$ and $B_M+c \geq 1$, and we can apply an argument similar to that suggested by Kavi: \begin{align} E[(A_M+c)/(B_M+c)] &= E[E[(A_M+c)/(B_M+c)|Y]]\\ &= E[1/(B_M+c) E[(A_M+c)|Y]]\\ &= E[1/(B_M+c) E[(X + c) + (A_M-X)|Y]]\\ &\leq E[1/(B_M+c)(Y+c + E[A_M-X|Y])] \\ &= E[1/(B_M+c)(B_M+c + (Y-B_M) + E[A_M-X|Y])] \\ &=E\left[1 + \frac{Y-B_M}{B_M+c} + \frac{A_M-X}{B_M+c} \right]\\ &\leq 1 + E[|Y-B_M|] + E[|A_M-X|] \end{align} (the inequality steps use $E[X|Y]\leq Y$, $B_M+c\geq 1$, and the fact that $B_M$ is a function of $Y$). By symmetry we also get $$ E[(B_M+c)/(A_M+c)] \leq 1 + E[|Y-B_M|] + E[|A_M-X|]. $$

Define $f(M) = E[|Y- B_M|] + E[|A_M-X|]$. By fact (**), it can be shown that $f(M)\rightarrow 0$. Thus $$ E[(B_M+c)/(A_M+c)] + E[(A_M+c)/(B_M+c)] \leq 2 + 2f(M) \rightarrow 2.$$

On the other hand, for all $M$ and all realizations of the random variables we have $$ (B_M+c)/(A_M+c) + (A_M+c)/(B_M+c)\geq 2, $$ with "near equality" only when $A_M+c \approx B_M+c$. Thus, for any $\epsilon>0$ we get: $$ \lim_{M\rightarrow\infty} P[|B_M-A_M|>\epsilon] =0. $$

However, $$ P[|X-Y|>\epsilon] \leq P[X \neq A_M] + P[Y\neq B_M] + P[|A_M-B_M|>\epsilon]. $$ Taking the limit as $M\rightarrow\infty$ gives $$ P[|X-Y|>\epsilon] = 0.$$ This holds for all $\epsilon>0$, and so $P[X=Y]=1$. $\Box$
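The truncation fact (**) can be illustrated numerically. A sketch (the two-sided exponential, i.e. Laplace, distribution is my choice of an integrable $X$ that is unbounded below) showing that both $P[X\neq A_M]$ and $E[|X-A_M|]$ vanish as $M$ grows:

```python
import random

random.seed(1)

# X ~ two-sided (Laplace) exponential: integrable but unbounded below.
samples = [random.expovariate(1.0) * random.choice([-1, 1])
           for _ in range(100_000)]

errs = {}
for M in (1, 2, 5, 10):
    # A_M = max(X, -M) truncates X below at -M, as in the proof.
    errs[M] = sum(abs(x - max(x, -M)) for x in samples) / len(samples)
    mismatch = sum(x < -M for x in samples) / len(samples)
    # estimates of E[|X - A_M|] and P[X != A_M]; both shrink with M
    print(M, round(errs[M], 4), round(mismatch, 5))
```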