$\epsilon_i$ is mean independent of $X_i$, that is $E[\epsilon_i|X_i]=0$.
The sentence above is from my textbook. Is it true? And is the equation above the definition of mean independence?
Update: I should have given the full text here. Given $Y_i=E[Y_i|X_i]+\epsilon_i$, we can prove that $\epsilon_i$ is mean independent of $X_i$, that is, $E[\epsilon_i|X_i]=0$.
Proof: $E[\epsilon_i|X_i]=E[Y_i-E[Y_i|X_i]|X_i]=E[Y_i|X_i]-E[Y_i|X_i]=0$
My question: I can understand the second equality; it just substitutes for $\epsilon_i$ using the given condition. But how is the third equality derived? Specifically, I don't know how to interpret $E[E[Y_i|X_i]|X_i]$.
Consider first $E[Y_i - E[Y_i|X_i] | X_i = x]$.
By linearity of expectation, we have: \begin{align*} E[Y_i - E[Y_i|X_i] \mid X_i = x] &= E[Y_i|X_i=x] - E[E[Y_i|X_i] \mid X_i = x] \\ &= E[Y_i|X_i=x] - E[\underbrace{E[Y_i|X_i=x]}_{\text{a constant}} \mid X_i = x] \end{align*}
Note that $E[Y_i|X_i=x]$ is a deterministic number (once we condition on $X_i = x$, the random variable $E[Y_i|X_i]$ takes this fixed value), so that $E[\underbrace{E[Y_i|X_i=x]}_{\text{a constant}} \mid X_i = x] = E[Y_i|X_i = x]$.
This shows $E[Y_i - E[Y_i|X_i] \mid X_i = x] = 0$ for every value $x$, which is exactly the claim $E[\epsilon_i|X_i]=0$.
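You can also see this numerically. Below is a minimal sketch (my own toy setup, not from the textbook): $X_i$ is discrete, so $E[Y_i|X_i=x]$ can be estimated by the sample mean of $Y_i$ within each group of $X_i$, and the residual $\epsilon_i = Y_i - E[Y_i|X_i]$ then has mean zero within every group.

```python
import random
from statistics import mean

random.seed(0)

# Simulate a discrete X so E[Y|X=x] can be computed as a group mean.
n = 50_000
xs = [random.choice([0, 1, 2]) for _ in range(n)]
ys = [2.0 * x + random.gauss(0, 1) for x in xs]  # any joint law works

# Estimate E[Y|X=x] by the sample mean of Y within each group of X.
cond_mean = {v: mean(y for x, y in zip(xs, ys) if x == v) for v in (0, 1, 2)}

# eps = Y - E[Y|X]: its mean within each group of X is 0 by construction.
eps = [y - cond_mean[x] for x, y in zip(xs, ys)]
for v in (0, 1, 2):
    print(v, mean(e for x, e in zip(xs, eps) if x == v))
```

Each printed group mean is zero up to floating-point error, because within a group we have subtracted exactly that group's mean.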
What's the difference between $E[Y_i - E[Y_i|X_i] | X_i = x]$ and $E[Y_i - E[Y_i|X_i] | X_i]$?
The latter is a random variable - a function of $X_i$.
The former is the (deterministic) value realized by the latter random variable when $X_i$ realizes the value $X_i = x$.
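As a concrete illustration (my own example, assuming a binary $X_i$ for simplicity), write $g(x) = E[Y_i - E[Y_i|X_i] \mid X_i = x]$ for the deterministic function of $x$; the conditional expectation given $X_i$ is then the random variable obtained by plugging $X_i$ into $g$:

```latex
% g is a deterministic function of the argument x:
g(0) = E[Y_i - E[Y_i|X_i] \mid X_i = 0], \qquad
g(1) = E[Y_i - E[Y_i|X_i] \mid X_i = 1]
% The version conditioned on X_i (not on X_i = x) is the random variable
E[Y_i - E[Y_i|X_i] \mid X_i] = g(X_i) = (1 - X_i)\, g(0) + X_i\, g(1)
```

Since the proof shows $g(x) = 0$ for every $x$, the random variable $g(X_i)$ is identically zero.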