Showing that $\mathbb{E}[Xg(Y) \mid Y] = g(Y) \mathbb{E}[X\mid Y]$


I got stuck on this problem, where I have to show that, for discrete random variables $X$ and $Y$ and a function $g:\mathbb{R} \to \mathbb{R}$, we have \begin{align} \mathbb{E}[Xg(Y)\mid Y] = g(Y)\;\mathbb{E}[X\mid Y] \end{align} where the probability of every value in $\text{Im}(X)$ and $\text{Im}(Y)$ is nonzero.

My attempt:

\begin{align} \mathbb{E}[Xg(Y) \mid Y=y] &= \sum_{z \,\in \,\mathrm{Im}(Xg(Y))} z \:\mathbb{P}(Xg(Y) = z \mid Y=y)\\ &= \sum_{z \,\in \,\mathrm{Im}(Xg(Y))} z \:\frac{\mathbb{P}(Xg(Y)=z,\, Y=y)}{\mathbb{P}(Y=y)}\\ &= \sum_{z \,\in \,\mathrm{Im}(Xg(Y))} z \:\frac{\mathbb{P}(Xg(y)=z,\, Y=y)}{\mathbb{P}(Y=y)}\\ &\vdots \: (?)\\ &= g(y) \sum_{x\in \mathrm{Im}(X)} x\,\mathbb{P}(X=x\mid Y=y) \end{align}

but I don't really know where to go from there.


There are two competing definitions of the conditional expectation, and you haven't specified which one, so I will assume you mean $$ E(X \mid A) = \frac{1}{P(A)} E(X \mathbf{1}_A), $$ where $A$ is an event of positive probability. You seem to be using this definition, since you are dealing with discrete random variables.
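For concreteness, here is a quick numerical illustration of this definition on a small made-up finite distribution (the numbers and the event $A$ are just for the sketch, not part of the problem):

```python
# Illustrate E(X | A) = E(X 1_A) / P(A) on a toy discrete distribution.
outcomes = [1, 2, 3, 4]            # values of X (hypothetical)
probs    = [0.1, 0.2, 0.3, 0.4]    # P(X = x) for each value
A = {2, 4}                         # the event {X is even}

# P(A): total mass of the outcomes in A.
p_A = sum(p for x, p in zip(outcomes, probs) if x in A)

# E(X 1_A): X contributes only on the event A.
e_X_indicator = sum(x * p for x, p in zip(outcomes, probs) if x in A)

cond_exp = e_X_indicator / p_A     # E(X | A)
print(cond_exp)
```

Here $P(A) = 0.6$ and $E(X \mathbf{1}_A) = 2 \cdot 0.2 + 4 \cdot 0.4 = 2.0$, so the printed value is $2.0 / 0.6 \approx 3.33$, which matches averaging $X$ over the even outcomes with renormalized weights.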

Fix $y$ such that $Y = y$ has positive probability. Then $$ \begin{align*} E(Xg(Y) \mid Y = y) &= \frac{1}{P(Y = y)} E \left(Xg(Y) \mathbf{1}_{\{Y = y\}} \right) \\ &= g(y) \frac{E \left(X \mathbf{1}_{\{Y = y\}} \right)}{P(Y = y)} \\ &= g(y) E(X \mid Y = y), \end{align*} $$ which proves the desired result. (Note that we used that $g(Y) = g(y)$ on the event where $Y = y,$ and that this is a constant value, so it comes out of the expectation.)
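If it helps to see the identity concretely, the computation above can be checked numerically on a small discrete joint distribution (the joint probabilities and the choice of $g$ below are made up for the sketch):

```python
# Sanity check of E[X g(Y) | Y = y] = g(y) E[X | Y = y]
# on a toy joint distribution of (X, Y).
joint = {  # P(X = x, Y = y), hypothetical values summing to 1
    (1, 0): 0.1, (2, 0): 0.2,
    (1, 1): 0.3, (2, 1): 0.4,
}
g = lambda y: y**2 + 1  # any function of Y works

for y in (0, 1):
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    # Left side: E[X g(Y) | Y = y], computed from the definition.
    lhs = sum(x * g(yy) * p for (x, yy), p in joint.items() if yy == y) / p_y
    # Right side: g(y) E[X | Y = y] -- g(Y) is the constant g(y) on {Y = y}.
    rhs = g(y) * sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y
    assert abs(lhs - rhs) < 1e-12
```

The key step in the code mirrors the key step in the proof: on the event $\{Y = y\}$, the factor $g(Y)$ is the constant $g(y)$, so it factors out of the sum.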

Observation. In fact, we proved the result for any $X$ (not necessarily discrete) and on the atoms of any $Y,$ that is, the $y$ such that $P(Y = y)$ is positive.