I'm currently self-studying Andrew Gelman's book "Bayesian Data Analysis", third edition. On page 41, the authors write:
$E(\tilde{y}|y)=E(E(\tilde{y}|\theta,y)|y)$
I am comfortable with multiple conditions, and usually with the math involved in the book, but this notation with a nested condition confuses me. I tried to find a definition of nested conditions in past personal notes and books, but I did not find one. Online, I saw something similar in the "tower property" here: https://en.wikipedia.org/wiki/Conditional_expectation#Basic_properties. But that page uses notation and concepts that the book does not use and that are a bit abstract.
I have a feeling that this is the definition I'm looking for:
$Pr[(A|B)|C]:=Pr[A|B,C]$ for events, or $E(E(x|z,y)|y):=E(E(x|z,y))=E(x|y)$ for iterated expectation of random variables.
Does someone have an online reference defining nested conditions that confirms (or refutes) my guessed definition? If I'm wrong, what would be the meaning (if any) of $Pr[(A|B)|C]$ and $E(E(x|z,y)|y)$?
Thank you for your help!
With the help of @Mason and @William M., and after taking the time to review the law of total expectation in more depth, I found the source of my confusion: I was using the law of total expectation incorrectly.
The law of total expectation says $E[U]=E(E[U|Z])$, but it also requires $U$ and $Z$ to be defined on the same probability space. My mistake was to take $U=(X|Y=y)$, add the condition on $Z$, and end up with $E[E(X|Z,Y=y)]$; I then did not understand the necessity of the extra $|Y=y$. I was adding a condition while ignoring that the random variables were living in a restricted probability space. This was my mistake.
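To convince myself of the plain law of total expectation $E[U]=E(E[U|Z])$, I find a tiny exact computation helpful. This is only an illustrative sketch with a made-up discrete distribution (not from the book): $Z \sim \text{Bernoulli}(0.3)$ and, given $Z=z$, $U$ is uniform on $\{z, z+1, z+2\}$.

```python
# Exact (no Monte Carlo) check of E[U] = E[E[U|Z]] on a small discrete example.
# Hypothetical model: Z ~ Bernoulli(0.3); given Z = z, U ~ Uniform{z, z+1, z+2}.
p_z = {0: 0.7, 1: 0.3}

def cond_mean_U(z):
    # E[U | Z = z] = (z + (z+1) + (z+2)) / 3 = z + 1
    return sum(z + k for k in range(3)) / 3

# Left side: E[U] computed directly from the joint distribution p(u, z)
E_U = sum(p * (z + k) / 3 for z, p in p_z.items() for k in range(3))

# Right side: E[E[U|Z]] = sum_z E[U | Z = z] * P(Z = z)
E_E_U_given_Z = sum(cond_mean_U(z) * p for z, p in p_z.items())

print(E_U, E_E_U_given_Z)  # both equal E[Z] + 1 = 1.3
```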
$E[X|Y=y]=E[E(X|Z,Y=y)|Y=y]$ really is a direct consequence of $E[U]=E(E[U|Z])$, but with the restriction that everything happens in the probability space defined by the condition $Y=y$. We add the condition on $Z$, but we remain in the probability space restricted to $Y=y$. This is the meaning of the extra $|Y=y$ that I failed to write on my first attempt.
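The conditioned identity can also be verified exactly on a small discrete joint distribution. Here is a sketch with an arbitrary (hypothetical) pmf over three binary variables, checking that the iterated expectation, with the outer average taken over $Z$ *given* $Y=y$, reproduces $E[X|Y=y]$:

```python
from itertools import product

# A tiny arbitrary joint pmf p(x, z, y) over {0,1}^3; weights sum to 32.
pmf = {(x, z, y): (1 + x + 2 * z + 3 * y) / 32.0
       for x, z, y in product([0, 1], repeat=3)}

y0 = 1  # condition on Y = 1

# Direct computation of E[X | Y = y0]
p_y = sum(p for (x, z, y), p in pmf.items() if y == y0)
E_X_given_y = sum(x * p for (x, z, y), p in pmf.items() if y == y0) / p_y

def inner(z0):
    # Inner expectation E[X | Z = z0, Y = y0]
    p_zy = sum(p for (x, z, y), p in pmf.items() if z == z0 and y == y0)
    return sum(x * p for (x, z, y), p in pmf.items() if z == z0 and y == y0) / p_zy

def p_z_given_y(z0):
    # P(Z = z0 | Y = y0): the outer average stays inside the Y = y0 "world"
    return sum(p for (x, z, y), p in pmf.items() if z == z0 and y == y0) / p_y

# Iterated expectation E[ E[X | Z, Y = y0] | Y = y0 ]
E_iter = sum(inner(z0) * p_z_given_y(z0) for z0 in [0, 1])

print(E_X_given_y, E_iter)  # the two agree (here 12/22 = 6/11)
```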
$E[E(X|Z,Y=y)|Y=y]\ne E[E(X|Z,Y=y)]$ in general, and the Bayesian book provides a good example. In my original problem, $E[E(\tilde{Y}|\theta ,Y)|Y]=E[\theta|Y]=\mu_1$, the posterior mean, whereas $E[E(\tilde{Y}|\theta ,Y)]=E[\theta]=\mu_0$, the prior mean.
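The posterior-mean vs prior-mean contrast can be made concrete with a normal-normal conjugate model. The numbers below are hypothetical (not from the book): since $E[\tilde{Y}|\theta,Y]=\theta$, the two iterated expectations reduce to the posterior mean $E[\theta|Y=y]=\mu_1$ and the prior mean $E[\theta]=\mu_0$, which I check here against Monte Carlo draws:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normal-normal conjugate model:
mu0, tau0 = 2.0, 1.0   # prior: theta ~ N(mu0, tau0^2)
sigma = 0.5            # likelihood: Y | theta ~ N(theta, sigma^2)
y = 4.0                # one observed data point

# Closed-form posterior mean mu1 for the conjugate normal model
precision = 1 / tau0**2 + 1 / sigma**2
mu1 = (mu0 / tau0**2 + y / sigma**2) / precision  # = 3.6 here

# E[ E(Y~|theta,Y) | Y=y ] = E[theta | Y=y]: average over posterior draws
n = 1_000_000
theta_post = rng.normal(mu1, precision ** -0.5, size=n)

# E[ E(Y~|theta,Y) ] = E[theta]: average over prior draws (no conditioning)
theta_prior = rng.normal(mu0, tau0, size=n)

print(theta_post.mean())   # ~ mu1 = 3.6, the posterior mean
print(theta_prior.mean())  # ~ mu0 = 2.0, the prior mean
```

Conditioning on $Y=y$ pulls the answer toward the data; dropping the condition leaves only the prior information, which is exactly the $\mu_1$ vs $\mu_0$ distinction above.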
I don't think my digression on expectations written with subscripts is necessary anymore, so I removed it.
I hope that this answer will help if someone stumbles over the same difficulty!
PS: I think it also clarifies the question about $(A|B)|C$.