I'm trying to prove that if $X, Y$ are random variables and $r$, $s$ are functions, then
$$\mathbb{E}(r(X)s(Y) | X) = r(X)\mathbb{E}(s(Y)|X)$$
What I do know is
$$\mathbb{E}(r(X)s(Y) \mid X=x) = \int_{-\infty}^{\infty}r(x)s(y)\frac{P_{X,Y}(x,y)}{P_{X}(x)}\,dy$$
However, I'm not even sure how to write the right-hand side of the claim, because it seems like it should be a random variable rather than a function of $x$ like the integral above. I would say
$$\mathbb{E}(s(Y)\mid X=x) = \int_{-\infty}^{\infty}s(y)\frac{P_{X,Y}(x,y)}{P_{X}(x)}\,dy$$
but I don't understand what it could mean to multiply this by $r(X)$ and get the other side of the equation; as I understand it, this integral is a function of $x$, not a random variable. Even if it is a random variable, I'm not sure how I could show the two sides are equal, since I can't see how to apply the distribution-function technique or compare moment generating functions.
I may have figured this out. The expressions on each side of the equation are not functions of $x$; they are random variables, each obtained as a transformation of $X$. The conditional expectation $\mathbb{E}(r(X)s(Y)\mid X)$ takes different values for different outcomes $X=x$, just as $X^{2}$ takes different values for different $X=x$, and in both cases the associated probability distribution is inherited from $X$.
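This pointwise picture can be sanity-checked numerically. Here is a minimal sketch with an assumed $2\times 2$ discrete joint pmf (the numbers and the choices of $r$, $s$ are mine, not from the problem): for each value $x$, it computes $\mathbb{E}(r(X)s(Y)\mid X=x)$ from the conditional pmf and compares it with $r(x)\,\mathbb{E}(s(Y)\mid X=x)$.

```python
import numpy as np

# Assumed toy example: X, Y each take values in {0, 1}.
p = np.array([[0.1, 0.2],
              [0.3, 0.4]])          # p[x, y] = P(X = x, Y = y)
r = lambda x: x + 1                 # arbitrary example r
s = lambda y: 2 * y - 1             # arbitrary example s

for x in (0, 1):
    p_x = p[x].sum()                # marginal P(X = x)
    cond = p[x] / p_x               # conditional pmf P(Y = y | X = x)
    # E(r(X) s(Y) | X = x): r(X) is frozen at r(x) inside the sum
    lhs = sum(r(x) * s(y) * cond[y] for y in (0, 1))
    # r(x) * E(s(Y) | X = x)
    rhs = r(x) * sum(s(y) * cond[y] for y in (0, 1))
    assert np.isclose(lhs, rhs)
    print(x, lhs, rhs)
```

The two columns agree for every $x$, which is exactly the pointwise equality argued for below.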
So if I can show that the left- and right-hand sides take the same value for every $X=x$, then, since both inherit the same probability distribution from $X$, I will have shown that they are the same random variable. To do this, I fix a particular value $X=x$, and for this I get
$$\mathbb{E}(r(X)s(Y)\mid X=x) = \int_{-\infty}^{\infty}r(x)s(y)P_{Y\mid X}(y\mid x)\,dy$$
$$= r(x)\int_{-\infty}^{\infty}s(y)P_{Y\mid X}(y\mid x)\,dy = r(x)\mathbb{E}(s(Y)\mid X=x)$$
and the right-hand side is exactly the value that the random variable $r(X)\mathbb{E}(s(Y)\mid X)$ takes at this same $x$, so the two sides agree for every $x$.
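As a final check, the identity can also be tested by simulation through the tower property $\mathbb{E}[r(X)s(Y)] = \mathbb{E}[r(X)\,\mathbb{E}(s(Y)\mid X)]$. The sketch below assumes a toy model not taken from the problem: $X \sim N(0,1)$, $Y = X + Z$ with $Z \sim N(0,1)$ independent, $r(x)=x$, $s(y)=\sin y$, for which $\mathbb{E}(\sin Y \mid X=x) = e^{-1/2}\sin x$ in closed form.

```python
import numpy as np

# Monte Carlo check of E(r(X)s(Y) | X) = r(X) E(s(Y) | X) via the
# tower property, under the assumed model X ~ N(0,1), Y = X + Z.
rng = np.random.default_rng(0)
n = 500_000
x = rng.standard_normal(n)
y = x + rng.standard_normal(n)

# E[r(X) s(Y)] estimated directly from the joint samples
lhs = np.mean(x * np.sin(y))
# E[r(X) E(s(Y)|X)] using the closed-form inner expectation e^{-1/2} sin(x)
rhs = np.mean(x * np.exp(-0.5) * np.sin(x))
print(lhs, rhs)   # both should be close to exp(-1) ≈ 0.368
```

Both estimates converge to $\mathbb{E}[X\sin Y] = e^{-1}$, which they could only do if pulling $r(X)$ out of the inner conditional expectation is legitimate.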