Let $X$ and $Y$ be two real-valued random variables. The conditional expectation of $X$ given that $Y$ takes the value $y$ is defined as
$$ E[X | Y=y] = \int x \, p_{X | Y}(x | y) dx. $$
Let $\sigma(Y)$ be the sigma-algebra generated by $Y$.
Is it true that
$$ E[X | Y] \triangleq E[X | \sigma(Y)] = \int x\, p_{X | Y}(x | Y) dx $$ ?
Intuitively, the answer seems to be yes: $E[X | Y]$ is measurable with respect to $\sigma(Y)$, so there must be a function $f$ with $f(Y) = E[X | Y]$. But can this $f(Y)$ be $\int x\, p_{X | Y}(x | Y)\, dx$?
By definition, $\ E\big(X|\sigma(Y)\big)\ $ is a $\ \sigma(Y)$-measurable function which satisfies the identity
$$ \int_A E\big(X|\sigma(Y)\big)\,dP = E\big(XI_A\big) $$
for all $\ A\in\sigma(Y)\ $. By the Radon-Nikodym theorem it is uniquely defined $P$-almost everywhere.

Now $\ A\in\sigma(Y)\ $ if and only if $\ A=Y^{-1}(B)\ $ for some measurable $\ B\subseteq\mathbb{R}\ $, so
\begin{align} E\big(XI_A\big)&=E\big(XI_{Y^{-1}(B)}\big)\\ &=E\big(XI_B(Y)\big)\\ &=\int_B\int x\,p_{X|Y}(x|y)\,dx\,dF_Y(y)\\ &=\int_A\int x\,p_{X|Y}(x|Y)\,dx\,dP\ . \end{align}
Therefore, by the uniqueness of the Radon-Nikodym derivative,
$$ E\big(X|\sigma(Y)\big)=\int x\,p_{X|Y}(x|Y)\,dx\ , $$
except possibly on a set of probability $0$.
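The defining identity $\int_A E\big(X|\sigma(Y)\big)\,dP = E\big(XI_A\big)$ can also be checked numerically in a case where the conditional expectation is known in closed form. Below is a Monte Carlo sketch (the jointly Gaussian model, the correlation $\rho = 0.6$, and the event $A = Y^{-1}\big((0,1)\big)$ are my illustrative choices, not from the question): if $Y \sim N(0,1)$ and $X = \rho Y + \sqrt{1-\rho^2}\,Z$ with $Z \sim N(0,1)$ independent of $Y$, then $E(X|Y) = \rho Y$, so both sides of the identity can be estimated from samples.

```python
import numpy as np

# Monte Carlo check of the identity  E[ E(X|Y) 1_A ] = E[ X 1_A ]
# for A = Y^{-1}(B) in sigma(Y), using a jointly Gaussian pair
# where E(X|Y) = rho * Y is known in closed form.
rng = np.random.default_rng(0)
n = 2_000_000
rho = 0.6  # arbitrary illustrative correlation

y = rng.standard_normal(n)
x = rho * y + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# For this model, f(y) = int x p_{X|Y}(x|y) dx = rho * y.
cond_exp = rho * y

indicator = (y > 0) & (y < 1)        # A = Y^{-1}((0, 1)), an event in sigma(Y)
lhs = np.mean(cond_exp * indicator)  # estimates  int_A E(X|sigma(Y)) dP
rhs = np.mean(x * indicator)         # estimates  E(X 1_A)

print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

With $n = 2{\times}10^6$ samples the standard error is on the order of $10^{-3}$, so the two printed values should match to roughly that accuracy.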