Question in the title:
For $X, Y$ random variables and $h$ a function, show that $E(Xh(Y)\mid Y) = h(Y)E(X\mid Y)$ almost surely.
My main problem is that I don't even understand what $E(Xh(Y)\mid Y)$ means... I know what $E(Xh(Y)\mid Y=5)$, for example, would mean, but without evaluating $Y$ I do not see how to give meaning to the expression. The only thing I could think of would be to consider $E(Xh(Y)\mid Y)$ as a new random variable that is a function of $Y$. Is this correct?
Yes, your interpretation is correct: $\mathbb E[Xh(Y)\mid Y]$ is itself a random variable, measurable with respect to the $\sigma$-algebra generated by $Y$, $$\sigma(Y) = \{ Y^{-1}(B):B\in\mathcal B(\mathbb R)\}. $$ If $h:\mathbb R\to\mathbb R$ is Borel-measurable, then $h(Y)$ is also $\sigma(Y)$-measurable, so it can be "pulled out" of the conditional expectation: $$\mathbb E[X h(Y)\mid Y] = h(Y)\mathbb E[X\mid Y]$$ with probability one (assuming that $\mathbb E[|Xh(Y)|]<\infty$ and $\mathbb E[|X|]<\infty$). To prove this, check that the $\sigma(Y)$-measurable random variable $h(Y)\mathbb E[X\mid Y]$ satisfies the defining relation of $\mathbb E[Xh(Y)\mid Y]$, i.e. that it has the same integral as $Xh(Y)$ over every event in $\sigma(Y)$.
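One way that verification might be sketched is via the standard machine (indicators, then simple functions, then limits); the details below are a sketch under the integrability assumptions above, not a full proof:

```latex
\begin{proof}[Sketch]
Fix $A \in \sigma(Y)$. First suppose $h = \mathbf 1_B$ for a Borel set $B$.
Then $A' := A \cap \{Y \in B\} \in \sigma(Y)$, and applying the defining
property of $\mathbb E[X \mid Y]$ to the event $A'$ gives
\[
  \mathbb E\bigl[\mathbf 1_B(Y)\,\mathbb E[X \mid Y]\,\mathbf 1_A\bigr]
  = \mathbb E\bigl[\mathbb E[X \mid Y]\,\mathbf 1_{A'}\bigr]
  = \mathbb E\bigl[X\,\mathbf 1_{A'}\bigr]
  = \mathbb E\bigl[X\,\mathbf 1_B(Y)\,\mathbf 1_A\bigr].
\]
By linearity the same identity holds for nonnegative simple $h$; by monotone
convergence (first for $X \ge 0$, then splitting $X = X^+ - X^-$) it holds for
all Borel $h \ge 0$; and writing $h = h^+ - h^-$, using
$\mathbb E[|Xh(Y)|] < \infty$, gives the general case. Since
$h(Y)\,\mathbb E[X \mid Y]$ is $\sigma(Y)$-measurable and has the same
integral as $Xh(Y)$ over every $A \in \sigma(Y)$, it equals
$\mathbb E[Xh(Y) \mid Y]$ almost surely, by uniqueness of conditional
expectation.
\end{proof}
```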