I'm currently studying the properties of conditional expectation, and I came across this definition in A Probability Path (Resnick) that seems a bit confusing.
The statement is: If $X \in G$ and $X \in L_1$, then $E[X|G]=X$ almost surely.
Proof: We prove this by merely noting that $X$ is $G$-measurable and $\int_{A}X\,dP=\int_{A}X\,dP$ for all $A \in G$.
My confusion:
Previously, when I studied almost sure convergence, it was defined for a sequence of random variables: the sequence converges to a limiting random variable with probability one. Here, however, there is no sequence $(X_n)$ in sight, so I don't understand the intuition behind the statement that two random variables are equal a.s.
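If it helps to pin down my guess: I would read "$X = Y$ a.s." as a statement about a single event having probability one, with no sequence involved, i.e.

$$P\big(\{\omega \in \Omega : X(\omega) = Y(\omega)\}\big) = 1,$$

but I'm not sure whether this is the intended reading, or how it connects to the sequential definition of a.s. convergence I learned earlier.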
Also, I understand that conditional expectation is a Radon-Nikodym derivative; therefore, by definition, $\int_{A}E[X|G]\,dP=\int_{A}X\,dP$ for all $A \in G$. However, I am confused about how this observation is enough to conclude equality almost surely.
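To state what I think the defining property says (please correct me if I've garbled it): $E[X|G]$ is any $G$-measurable, integrable random variable $Y$ satisfying

$$\int_A Y \, dP = \int_A X \, dP \quad \text{for all } A \in G,$$

so my question is essentially why any two such versions $Y_1$ and $Y_2$ must satisfy $P(Y_1 = Y_2) = 1$, and why checking the display above for $Y = X$ settles the claim.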
Thanks!