How would you show that $E[g(X,Y)|Y=y] = E[g(X,y)|Y=y]$ in the continuous case? This problem is in *Probability Models* by Ross. No solution is provided, and no information is given about the random variables, although I suppose we could assume the integrals involved are absolutely convergent or something.
The penultimate step in the discrete case is that $P(X=i, Y=j \mid Y=y) = 0$ unless $j = y$, which I can't translate into the continuous case. My real analysis skills are non-existent, but this book isn't supposed to require them anyway.
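For reference, the discrete version of that step, as I read Ross's argument, is:

```latex
E[g(X,Y)\mid Y=y]
  = \sum_i \sum_j g(i,j)\, P(X=i,\, Y=j \mid Y=y)
  = \sum_i g(i,y)\, P(X=i \mid Y=y)
  = E[g(X,y)\mid Y=y],
```

since every term with $j \neq y$ vanishes.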
I tried to prove it in the simplified case $g(X,Y) = XY$ (part (a) in Ross's book).
Let $Z = XY$. Then $$E[Z|Y=y] = \int z \cdot f_{Z|Y}(z|y)\,dz = \int xy \cdot f_{Z|Y}(xy|y)\,|y|\,dx$$ through the change of variable $z = xy$ (with $y$ fixed, so $dz = |y|\,dx$). And, for $y \neq 0$, the change-of-variables formula for densities gives $$f_{Z|Y}(z|y) = \frac{1}{|y|}\, f_{X|Y}\!\left(\frac{z}{y}\,\middle|\,y\right) = \frac{1}{|y|} \cdot \frac{f_{X,Y}(z/y,\, y)}{f_Y(y)},$$ so the Jacobian factors cancel and $$E[Z|Y=y] = y \int x \cdot f_{X|Y}(x|y)\,dx = y\,E[X|Y=y],$$ which gives the desired conclusion in this case.
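As a quick numerical sanity check on $E[XY \mid Y=y] = y\,E[X \mid Y=y]$ (this is my own made-up example, not from Ross): take $(X,Y)$ jointly Gaussian and approximate the conditioning event $\{Y = y_0\}$ by a thin slab $|Y - y_0| < \varepsilon$.

```python
import numpy as np

# Monte Carlo check that E[XY | Y ~ y0] equals y0 * E[X | Y ~ y0].
# Hypothetical joint law: Y ~ N(0,1) and X = 0.6*Y + 0.8*N(0,1),
# so that E[X | Y = y] = 0.6*y.
rng = np.random.default_rng(0)
n = 2_000_000
y0, eps = 1.0, 0.05

y = rng.standard_normal(n)
x = 0.6 * y + 0.8 * rng.standard_normal(n)

mask = np.abs(y - y0) < eps          # thin slab standing in for {Y = y0}
lhs = np.mean(x[mask] * y[mask])     # approximates E[XY | Y = y0]
rhs = y0 * np.mean(x[mask])          # approximates y0 * E[X | Y = y0]
print(lhs, rhs)                      # the two should agree closely
```

Of course this only checks the identity on one distribution at one point, but it is reassuring that the slab estimates of the two sides coincide up to Monte Carlo error.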
Is this any more rigorous than Peter Morfe's comment above? To convince ourselves of the last step, I suppose we could look at the conditional cumulative distribution functions and take derivatives.
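Spelling out what I mean by the CDF route (assuming $y > 0$ and that the conditional density exists):

```latex
F_{Z\mid Y}(z \mid y)
  = P(XY \le z \mid Y = y)
  = P\!\left(X \le \tfrac{z}{y} \,\middle|\, Y = y\right)
  = F_{X\mid Y}\!\left(\tfrac{z}{y} \,\middle|\, y\right),
```

and differentiating in $z$ gives $f_{Z\mid Y}(z \mid y) = \frac{1}{y}\, f_{X\mid Y}\!\left(\frac{z}{y} \,\middle|\, y\right)$; the case $y < 0$ flips the inequality and produces $\frac{1}{|y|}$ instead.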