Consider a probability space $(\Omega, \mathcal B, \mathbb P)$ and two random variables $X,Y:\Omega\to\mathbb R$.
The conditional expectation $E(X|Y)=E(X|\sigma(Y))$ can be defined as any function (random variable) satisfying:

1. It is $\sigma(Y)$-measurable. Intuitively, $E(X|Y)$ is a function of $Y$: in particular, it must be constant on any set $A\subseteq\Omega$ on which $Y$ is constant.

2. $E(E(X|Y)\cdot 1_A)=E(X\cdot 1_A)$ for all $A\in \sigma(Y)$.
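On a finite probability space these two conditions can be realized concretely: averaging $X$ over each level set of $Y$ produces a $\sigma(Y)$-measurable function satisfying property 2. A minimal sketch (the space, $X$, and $Y$ below are made-up examples):

```python
# A small finite probability space: Omega = {0,...,5}, uniform measure.
omega = range(6)
p = {w: 1 / 6 for w in omega}         # uniform probability on Omega
X = {w: w ** 2 for w in omega}        # X(w) = w^2
Y = {w: w % 2 for w in omega}         # Y(w) = parity of w

def expect(f, event=None):
    """E[f 1_event]; event defaults to all of Omega."""
    ws = omega if event is None else event
    return sum(f[w] * p[w] for w in ws)

# E(X|Y): on each level set {Y = y}, average X with respect to P
# restricted to that set.  The result is sigma(Y)-measurable by
# construction, since it is constant on each level set of Y.
cond = {}
for y in set(Y.values()):
    level = [w for w in omega if Y[w] == y]
    cond_val = expect(X, level) / sum(p[w] for w in level)
    for w in level:
        cond[w] = cond_val

# Check the defining identity E[E(X|Y) 1_A] = E[X 1_A] for the
# generating events A = {Y = y}.
for y in set(Y.values()):
    A = [w for w in omega if Y[w] == y]
    assert abs(expect(cond, A) - expect(X, A)) < 1e-12
```

Checking the identity on the level sets $\{Y=y\}$ suffices here, since those sets generate $\sigma(Y)$ and the identity is additive over disjoint unions.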
How do we prove that these conditions determine a function that is unique almost everywhere?
For uniqueness: Suppose $Z, Z'$ are two functions (random variables) that satisfy 1 and 2. Set $A = \{\omega: Z(\omega) \ge Z'(\omega)\}$. Since $Z,Z'$ are both $\sigma(Y)$-measurable, so is $A$. Hence $E[Z 1_A] = E[X 1_A] = E[Z' 1_A]$ by 2. Subtracting, $E[(Z-Z') 1_A] = 0$. But $(Z-Z') 1_A$ is nonnegative everywhere, so $(Z-Z') 1_A = 0$ almost everywhere. That is, $Z = Z'$ almost everywhere on $A$, while $Z < Z'$ off $A$; together, $Z \le Z'$ almost everywhere. By symmetry, we also get $Z' \le Z$ almost everywhere, so $Z = Z'$ almost everywhere.
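The mechanics of this argument can be seen on a toy finite space: a $\sigma(Y)$-measurable $Z'$ that differs from the true conditional expectation $Z$ on a set of positive measure must violate property 2 on the set where they disagree. A small sketch (all values below are made up for illustration, with $A = \{Z' \ge Z\}$, the symmetric version of the set in the proof):

```python
# Toy finite space: Omega = {0,1,2,3}, uniform measure.
omega = range(4)
p = 1 / 4                             # uniform probability
X = {0: 1.0, 1: 3.0, 2: 5.0, 3: 7.0}
Y = {0: 0, 1: 0, 2: 1, 3: 1}          # sigma(Y) = {{}, {0,1}, {2,3}, Omega}

# Z is the conditional expectation: the average of X on each Y-level set.
Z = {0: 2.0, 1: 2.0, 2: 6.0, 3: 6.0}
# Z' is sigma(Y)-measurable but differs from Z on sets of positive measure.
Zp = {0: 2.5, 1: 2.5, 2: 5.5, 3: 5.5}

# A = {Z' >= Z} lies in sigma(Y); here it is exactly {Y = 0}.
A = [w for w in omega if Zp[w] >= Z[w]]
assert A == [0, 1]

E_X_A = sum(X[w] * p for w in A)      # E[X 1_A]  = 1.0
E_Z_A = sum(Z[w] * p for w in A)      # E[Z 1_A]  = 1.0, matches property 2
E_Zp_A = sum(Zp[w] * p for w in A)    # E[Z' 1_A] = 1.25 > E[X 1_A]

# E[(Z' - Z) 1_A] > 0, so Z' cannot also satisfy property 2 -- exactly
# the contradiction the uniqueness proof exploits.
assert E_Z_A == E_X_A and E_Zp_A > E_X_A
```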
Existence is more difficult. The proofs I've seen use either the Radon–Nikodym theorem or the Riesz representation theorem in Hilbert space. Any measure-theoretic probability book will have a proof.
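For a rough idea of the Radon–Nikodym route (stated here for integrable $X \ge 0$; the general case splits $X = X^+ - X^-$): define on the measurable space $(\Omega, \sigma(Y))$ the measure
$$\nu(A) = E(X\cdot 1_A), \qquad A \in \sigma(Y).$$
If $\mathbb P(A) = 0$ then $\nu(A) = 0$, so $\nu \ll \mathbb P|_{\sigma(Y)}$, and the Radon–Nikodym theorem yields a $\sigma(Y)$-measurable density $Z = d\nu / d(\mathbb P|_{\sigma(Y)})$ with
$$E(Z\cdot 1_A) = \nu(A) = E(X\cdot 1_A) \quad \text{for all } A \in \sigma(Y),$$
which is exactly properties 1 and 2.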