If $X, Y \sim f(x,y)$ are jointly absolutely-continuous and if $\mathcal{G} = \sigma(Y)$, then
$$E[X | Y] = \frac{\int xf(x, Y) dx}{\int f(x, Y) dx}.$$
I encountered the above in lecture notes on conditional expectation while looking for resources for learning probability theory, and I was wondering whether there is a proof of this. Any help would be appreciated!
Let $\nu$ denote the marginal probability density function of $Y$, that is $$ \nu(y)=\int f(x,y)\,dx, $$ and let $$ p(x,y)=\frac{f(x,y)}{\nu(y)}1_{\{\nu(y)\neq0\}}\quad\textrm{and}\quad\varphi(y)=\int xp(x,y)\,dx. $$
Let $h$ be a nonnegative measurable map. Then, using Tonelli's theorem to interchange the order of integration (for integrable $X$, apply the argument to the positive and negative parts separately), $$ \begin{align*} E[Xh(Y)]&=\int xh(y)f(x,y)\,dx\,dy=\int xh(y)p(x,y)\nu(y)\,dx\,dy\\ &=\int h(y)\nu(y)\left(\int xp(x,y)\,dx\right)\,dy\\ &=\int h(y)\nu(y)\varphi(y)\,dy\\ &=E[\varphi(Y)h(Y)]. \end{align*} $$ (Note that the factor $1_{\{\nu(y)\neq0\}}$ is harmless: the set where $\nu(y)=0$ contributes nothing to the integral, since $f(x,y)=0$ for a.e. $x$ there.)
Since $\varphi(Y)$ is $\sigma(Y)$-measurable and the identity above holds for every such $h$, the defining property of conditional expectation gives, almost surely, $$ E[X\vert Y]=\varphi(Y)=\int xp(x,Y)\,dx=\frac{\int xf(x,Y)\,dx}{\int f(x,Y)\,dx}1_{\{\int f(x,Y)\,dx\neq0\}}, $$ where the indicator can be dropped because $P(\nu(Y)=0)=\int_{\{\nu=0\}}\nu(y)\,dy=0$.
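As a numerical sanity check (not part of the proof), one can test the ratio formula on a case where the conditional expectation is known in closed form. For a standard bivariate normal with correlation $\rho$ we have $E[X\mid Y=y]=\rho y$, so the quotient $\int x f(x,y)\,dx \big/ \int f(x,y)\,dx$, computed by a Riemann sum, should reproduce $\rho y$. The density `f` and grid bounds below are my own choices for illustration:

```python
import numpy as np

rho = 0.6  # correlation of the standard bivariate normal

def f(x, y):
    # Joint density of a standard bivariate normal with correlation rho.
    z = (x**2 - 2 * rho * x * y + y**2) / (1 - rho**2)
    return np.exp(-z / 2) / (2 * np.pi * np.sqrt(1 - rho**2))

def cond_exp(y):
    # E[X | Y=y] = (int x f(x,y) dx) / (int f(x,y) dx),
    # approximated by a Riemann sum on a uniform grid; the common
    # grid spacing cancels in the ratio.
    grid = np.linspace(-10.0, 10.0, 20001)
    fx = f(grid, y)
    return np.sum(grid * fx) / np.sum(fx)

for y in (-1.5, 0.0, 2.0):
    print(f"y = {y:5.2f}:  formula = {cond_exp(y):.6f},  rho*y = {rho * y:.6f}")
```

The two columns agree, as the derivation above predicts; the same check works for any joint density whose conditional mean is known.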