I am working on the following theoretical statistics problem:
Prove that $E(E(\phi(S)|T)-E(\phi(S)))=0$, where $E$ denotes expected value, $S$ and $T$ are statistics (functions of the data) not further specified, and $\phi$ is some function of $S$, also not further specified. I am not sure whether to use properties of the formal definition of conditional expectation, such as Proposition 8.13 from Alan Karr's "Probability", which states that $E[E[X|Y_1 \dots Y_n]]=E[X]$, or whether I should instead try to show that $E(\phi(S)|T)=E(\phi(S))$; in the latter case I do not know how to proceed, since I do not see the connection. Any help is appreciated!
A hint
The property you named is very useful, but as it stands you have a difference of random variables inside the outermost expectation rather than a single conditional expectation. If you split the outer expectation over the two terms by linearity, then the property applies to one of them. What happens to the other term?
A solution
$$\mathbb{E}[\mathbb{E}[\phi(S)|T]-\mathbb{E}[\phi(S)]] = \mathbb{E}[\mathbb{E}[\phi(S)|T]]-\mathbb{E}[\mathbb{E}[\phi(S)]]=\mathbb{E}[\phi(S)]-\mathbb{E}[\phi(S)]=0$$ where we use the tower property you named for the first term and, for the second, the fact that $\mathbb{E}[\phi(S)]$ is a constant, so its expectation is itself.
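If it helps to see the tower property in action, here is a quick numerical sanity check on a hypothetical finite example (not from your problem): two fair dice, with $S$ the sum, $T$ the first die, and $\phi(s)=s^2$. Computing $\mathbb{E}[\phi(S)|T=t]$ exactly for each $t$ and then averaging over $T$ recovers $\mathbb{E}[\phi(S)]$:

```python
from itertools import product

# Finite sample space: two fair dice, each outcome has probability 1/36.
outcomes = list(product(range(1, 7), repeat=2))

def phi(s):
    return s ** 2  # an arbitrary function of the statistic S

# E[phi(S)] with S = sum of the two dice
e_phi = sum(phi(w[0] + w[1]) for w in outcomes) / len(outcomes)

def cond_exp(t):
    """E[phi(S) | T = t], averaging over outcomes with first die T equal to t."""
    ws = [w for w in outcomes if w[0] == t]
    return sum(phi(w[0] + w[1]) for w in ws) / len(ws)

# E[E[phi(S)|T]]: average the conditional expectations over T (uniform on 1..6)
e_cond = sum(cond_exp(t) for t in range(1, 7)) / 6

print(e_phi, e_cond, e_cond - e_phi)  # the difference is 0
```

The final difference is zero (up to floating-point rounding), matching the identity $\mathbb{E}[\mathbb{E}[\phi(S)|T]]=\mathbb{E}[\phi(S)]$.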
A further comment
The expression $\mathbb{E}[\phi(S)|T]$ denotes a random variable that is a function of $T$, so trying to prove that $\mathbb{E}[\phi(S)|T]=\mathbb{E}[\phi(S)]$ is not a good idea: the equality holds in special cases (for instance, when $S$ and $T$ are independent), but not in general.
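To see concretely that $\mathbb{E}[\phi(S)|T]$ is generally a non-constant function of $T$, the same hypothetical dice setup works (two fair dice, $S$ the sum, $T$ the first die, $\phi(s)=s^2$): the conditional expectation takes a different value for each value of $T$, so it cannot equal the constant $\mathbb{E}[\phi(S)]$.

```python
from itertools import product

# Two fair dice: S = sum, T = first die, phi(s) = s^2.
outcomes = list(product(range(1, 7), repeat=2))

def phi(s):
    return s ** 2

def cond_exp(t):
    """E[phi(S) | T = t]: average phi(S) over outcomes with first die equal to t."""
    ws = [w for w in outcomes if w[0] == t]
    return sum(phi(w[0] + w[1]) for w in ws) / len(ws)

# E[phi(S)|T] as a function of T: six different values, so not a constant.
for t in range(1, 7):
    print(t, cond_exp(t))
```

Only after averaging these values over the distribution of $T$ does the constant $\mathbb{E}[\phi(S)]$ appear, which is exactly what the tower property says.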