I couldn't find any related questions that addressed my specific issue, so I'm hoping someone can help clarify for me.
Given two random variables, $Z$ and $W$, I have seen that $E[Z - E[Z|W]] = 0$, and I thought the derivation, using linearity of expectation followed by the law of iterated expectations, is:
\begin{aligned} E[Z - E[Z|W]] &= E[Z] - E[E[Z|W]]\\ &= E[Z] - E[Z]\\ &= 0 \end{aligned} That makes sense to me, but I have also seen that if we condition the outer expectation on $W$ as well, then $E[[Z - E[Z|W]]|W]$ should also be $0$. Intuitively this makes sense: 'if the expectation of the difference is $0$ regardless of whether $W$ is given, then it should be $0$ even when $W$ is given.' But I am trying to work out the steps for this:
\begin{aligned} E[[Z - E[Z|W]]|W] &= E[Z|W] - E[E[Z|W]|W] \\ &= E[Z|W] - E[Z|W] \\ &= 0 \end{aligned}
Here I again used linearity of expectation (now conditioned on $W$), but I am confused by the second equality, $E[E[Z|W]|W] = E[Z|W]$. I know that this equality follows from the law of iterated expectations (applied in reverse), but when I was working the problem I thought of it a different way, and I am wondering whether that reasoning is valid.
My logic is as follows: $E[Z|W]$ is a 'function' of the value of $W$, so when $W$ is 'given', $E[Z|W]$ is a constant, and the expectation of a constant is the constant itself: $E[E[Z|W]|W] = E[Z|W]$. But the result, $E[Z|W]$, is then again a 'function' of $W$, and no longer a constant. That is where my confusion lies.
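To make this concrete for myself, I ran a small simulation with a toy model of my own (not from any source): $W \sim \text{Bernoulli}(0.5)$ and $Z = 2W + \varepsilon$, so $E[Z|W] = g(W)$ with $g(w) = 2w$. Given $W = w$, the random variable $g(W)$ really is the constant $g(w)$, so averaging it over the event $\{W = w\}$ returns $g(w)$; across the different values of $w$, those constants assemble back into the function $g(W)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (my own construction): W ~ Bernoulli(0.5), Z = 2*W + noise,
# so the true conditional expectation is E[Z|W] = g(W) with g(w) = 2w.
n = 500_000
W = rng.integers(0, 2, size=n)
Z = 2 * W + rng.normal(size=n)

# Estimate g(w) = E[Z | W = w] for each value w.
g = {w: Z[W == w].mean() for w in (0, 1)}

# The random variable E[Z|W] is g applied to W -- a function of W.
gW = np.where(W == 1, g[1], g[0])

for w in (0, 1):
    # Given W = w, g(W) is the constant g(w), so its conditional mean
    # is just g(w): E[ E[Z|W] | W = w ] = g(w) = E[Z | W = w].
    print(w, gW[W == w].mean(), g[w])
```

So "constant given $W = w$" and "function of $W$ overall" are both true: the conditional mean is computed one value of $w$ at a time, and the collection of answers is again $g(W)$.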
So I am basically trying to understand the meaning of $E[E[Z|W]|W]$, whether my logic above makes sense, and if not, where I am making a misstep.
Thanks in advance!
EDIT: Some more clarification. https://math.stackexchange.com/q/1915328. That answer performs the following step: \begin{aligned} E[(Y-E[Y|X])(E[Y|X]-f(X))|X] = (E[Y|X]-f(X))E[Y-E[Y|X]|X]=0. \end{aligned} So it seems he basically treated $E[Y|X]-f(X)$ as a constant as well, in order to factor it out of the expectation. So I'm wondering how it works that a constant turns back into a function (or whether my logic is completely off).
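As a sanity check on both identities in the question, here is a simulation with another toy model of my own: the residual $Z - E[Z|W]$ averages to zero both unconditionally and within each $W$-group (where $E[Z|W]$ is replaced by the empirical within-group mean):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (my own): W ~ Bernoulli(0.5), Z = W + noise.
n = 500_000
W = rng.integers(0, 2, size=n)
Z = W + rng.normal(size=n)

# Empirical stand-in for E[Z|W]: the mean of Z within each W-group.
cond_mean = {w: Z[W == w].mean() for w in (0, 1)}
EZ_given_W = np.where(W == 1, cond_mean[1], cond_mean[0])

resid = Z - EZ_given_W

print(resid.mean())          # E[Z - E[Z|W]]          ~ 0
print(resid[W == 0].mean())  # E[Z - E[Z|W] | W = 0]  ~ 0
print(resid[W == 1].mean())  # E[Z - E[Z|W] | W = 1]  ~ 0
```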
You know that $$\mathbb E[Z\mid W]=g(W) $$ for some (measurable) function $g$.
Now you can either find the conditional distribution of $g(W)$ given $W=w$ and use it to compute $\mathbb E[g(W) \mid W]$, or use the "pulling out known factors" property to get $$\mathbb E[g(W) \mid W] = g(W)\, \mathbb E[1\mid W] = g(W). $$
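The pulling-out property is easy to see numerically. In a toy example of my own making, $E[WZ \mid W = w]$ matches $w \cdot E[Z \mid W = w]$: conditional on $W = w$, the factor $W$ is the known constant $w$ and comes out of the average:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy example (my own) of "pulling out what is known":
# conditional on W = w, the factor W is the constant w.
n = 500_000
W = rng.integers(1, 4, size=n)       # W takes values in {1, 2, 3}
Z = W**2 + rng.normal(size=n)

for w in (1, 2, 3):
    sub = Z[W == w]
    lhs = (W[W == w] * sub).mean()   # estimate of E[W*Z | W = w]
    rhs = w * sub.mean()             # w * (estimate of E[Z | W = w])
    print(w, lhs, rhs)
```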
However, if you use the measure-theoretic definition, what you want to prove follows directly from the fact that the conditional expectation $\mathbb E[Z\mid W]$ is "unique" in a suitable sense (up to almost-sure equality).