Let $(\Omega, \mathcal{A}, P)$ be a probability space, and consider a square-integrable random variable $X$ and a sub-$\sigma$-algebra $\mathcal{F} \subseteq \mathcal{A}$.
Denote $Z:=E[X \mid \mathcal{F}]$.
The conditional expectation is the orthogonal projection of $X$ onto $L^2(\Omega, \mathcal{F}, P)$, i.e. the unique minimizer of the mean squared error:
$E[(X-Z')^2] \geq E[(X-Z)^2]$ for all $Z' \in L^2(\Omega, \mathcal{F}, P)$.
We have seen the proof of this, which derives the Pythagorean identity
$E[(X-Z')^2] = E[(X-Z)^2] + E[(Z-Z')^2],$
so the inequality is an equality iff $E[(Z-Z')^2] = 0$. The text then states that equality holds iff $Z = Z'$. But why? Couldn't it be that $E[(Z-Z')^2] = 0$ but $Z \neq Z'$?
Since $(Z-Z')^2 \geq 0$, its expectation is zero if and only if $(Z-Z')^2 = 0$ almost surely, i.e. $Z = Z'$ with probability $1$: by Markov's inequality, $P(|Z-Z'| > \varepsilon) \leq E[(Z-Z')^2]/\varepsilon^2 = 0$ for every $\varepsilon > 0$. We often abbreviate this almost-sure equality as $Z = Z'$; in particular, $Z$ and $Z'$ are genuinely equal once you identify them with their equivalence classes in $L^2$ (or any $L^p$), where random variables that agree almost surely are the same element.
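A quick numerical sketch (my own illustration, not from the original discussion): take $\mathcal{F} = \sigma(G)$ for a discrete random variable $G$, so $Z = E[X \mid \mathcal{F}]$ is a known function of $G$ by construction, pick some other $\mathcal{F}$-measurable $Z'$, and check the Pythagorean identity by Monte Carlo. The particular values of $f$ and $Z'$ below are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Discrete conditioning variable G; take F = sigma(G).
g = rng.integers(0, 3, size=n)           # G uniform on {0, 1, 2}
f = np.array([1.0, -2.0, 0.5])           # by construction E[X | G = k] = f[k]
x = f[g] + rng.standard_normal(n)        # X = f(G) + independent N(0,1) noise

z = f[g]                                 # Z = E[X | F], exact here
zp = np.array([0.0, 1.0, -1.0])[g]       # Z' = some other F-measurable variable

lhs = np.mean((x - zp) ** 2)                              # E[(X - Z')^2]
rhs = np.mean((x - z) ** 2) + np.mean((z - zp) ** 2)      # Pythagorean split
cross = np.mean((x - z) * (z - zp))                       # should vanish

print(lhs, rhs, cross)
```

The cross term $E[(X-Z)(Z-Z')]$ is (up to Monte Carlo error) zero because $X - Z$ is mean-zero noise independent of $G$, which is exactly the orthogonality that makes the identity hold; and `lhs` exceeds `np.mean((x - z) ** 2)` by the nonnegative gap $E[(Z-Z')^2]$, which is zero only when $Z' = Z$ almost surely.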