I have encountered a theorem saying that:
Let $X$ and $Y$ be real-valued, integrable random variables on $(\Omega, \mathcal A, P)$, and let $G(X,Y)$ be an integrable, real-valued random variable as well. Then
$$\mathbb E (G(X,Y)| Y = y) = \mathbb E (G(X,y)| Y = y), \forall y \in Y(\Omega) $$
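As a quick sanity check of the identity, here is a minimal sketch with a hypothetical discrete joint pmf and a hypothetical $G$ (both chosen only for illustration); the conditional expectations are computed directly from the pmf:

```python
from fractions import Fraction as F

# Hypothetical joint pmf of (X, Y) on {0,1} x {0,1} (sums to 1)
pmf = {(0, 0): F(1, 8), (0, 1): F(3, 8),
       (1, 0): F(1, 4), (1, 1): F(1, 4)}

def G(x, y):
    # Hypothetical integrable function of (x, y)
    return x * y + 2 * x + y

def cond_exp(H, y):
    """E(H(X, Y) | Y = y) = sum_x H(x, y) P(X = x, Y = y) / P(Y = y)."""
    p_y = sum(p for (x, yy), p in pmf.items() if yy == y)
    return sum(H(x, yy) * p for (x, yy), p in pmf.items() if yy == y) / p_y

for y in (0, 1):
    lhs = cond_exp(G, y)                     # E(G(X, Y) | Y = y)
    rhs = cond_exp(lambda x, _: G(x, y), y)  # E(G(X, y) | Y = y): second slot frozen at y
    assert lhs == rhs
```

Both sides reduce to the same sum: once we restrict to the event $\{Y = y\}$, the integrand $G(X,Y)$ agrees pointwise with $G(X,y)$, which is exactly the observation the proof turns on.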
I have attempted to prove it:
$$K := G(X,y), \qquad L := G(X,Y),$$
where $y \in Y(\Omega)$ is arbitrary but fixed.
By the factorization lemma there are measurable functions $g_1, g_2$ with
$$\mathbb E(K\mid Y)\overset{P \text{-a.s.}}{=} g_1(Y), \qquad \text{where } g_1(t) := \mathbb E (G(X,y)\mid Y = t),$$
$$\mathbb E(L\mid Y)\overset{P \text{-a.s.}}{=} g_2(Y), \qquad \text{where } g_2(t) := \mathbb E (G(X,Y)\mid Y = t).$$
$$\mathbb E(L\mid Y) - \mathbb E(K\mid Y) \overset{P \text{-a.s.}}{=} \mathbb E(L-K\mid Y)\overset{P \text{-a.s.}}{=}g_2(Y)-g_1(Y)$$
Let $A:=Y^{-1}(\{y\}) \in \sigma(Y)$. Then $L(\omega)-K(\omega) = G(X(\omega), y) - G(X(\omega), y) = 0$ for all $\omega \in A$, since $Y(\omega) = y$ on $A$.
Using the definition of a conditional expectation:
$$0=\int_{A} (L-K)\, dP=\int_A \big(g_2(Y) - g_1(Y)\big)\, d P = \big(g_2(y)-g_1(y)\big) P(A),$$
where the last equality holds because $Y \equiv y$ on $A$. So if $P(A) \ne 0$, we have proved what was intended — but what if $P(A) = 0$?
Edit
I think I got it.
$\mathbb E(K\mid Y)=g_1(Y)$ is only defined $P$-a.s., so if $P(A)=0$ we may pass to a modified function $g_1'$: set $g_1'(t):=g_1(t)$ for $t \neq y$ and $g_1'(y):=g_2(y)$.
Then $g_1'(Y) = g_1(Y)$ $P$-a.s., because they differ only on the null set $A$. Hence $g_1'(Y)$ is also a version of the conditional expectation of $K$ given $Y$, and by construction $g_1'(y) = g_2(y)$.
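The modified version can be displayed as a case definition (a LaTeX sketch of the argument above):

```latex
g_1'(t) :=
\begin{cases}
  g_1(t), & t \in Y(\Omega),\ t \neq y,\\
  g_2(y), & t = y.
\end{cases}
% Since P(A) = P(Y = y) = 0, we have g_1'(Y) = g_1(Y) P-a.s.,
% so g_1'(Y) is again a version of E(K | Y), with g_1'(y) = g_2(y).
```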