Consider two random variables $Y, X$ and a function $p(X)$. I would like to understand the relationship between these three conditional expectations: $$ E(Y|X) $$ $$ E(Y|p(X)) $$ $$ E(Y|X, p(X)) $$
My intuition is that if we condition on $X=x$, we implicitly condition on $p(X)=p(x)$ too so that $$ E(Y|X=x, p(X)=\bar{p})= \begin{cases} E(Y|X=x) & \text{if $\bar{p}=p(x)$}\\ \text{Undefined } & \text{ otherwise}\\ \end{cases} $$ Therefore, $$ E(Y|X)=E(Y|X, p(X)) $$
On the other hand, $$ E(Y|p(X))\neq E(Y|X) $$
Is this correct?
If $p:\mathbb R\to\mathbb R$ is a Borel-measurable function, then it is indeed correct that: $$\mathbb E[Y\mid X]=\mathbb E[Y\mid p(X),X]$$
This is because the smallest $\sigma$-algebra that makes $X$ measurable is the same as the smallest $\sigma$-algebra that makes both $X$ and $p(X)$ measurable, which follows from the fact that measurability of $X$ implies measurability of $p(X)$.
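A one-line sketch of the key inclusion $\sigma(p(X))\subseteq\sigma(X)$ (assuming, as above, that $p$ is Borel-measurable): for any Borel set $B$,
$$\{p(X)\in B\}=\{X\in p^{-1}(B)\}\in\sigma(X),$$
since $p^{-1}(B)$ is again a Borel set. Hence $\sigma(X,p(X))=\sigma(X)$, so conditioning on the pair $(X,p(X))$ is the same as conditioning on $X$ alone.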
However, it is wrong to state that $\mathbb E[Y\mid p(X)]\neq\mathbb E[Y\mid X]$ holds in general. For instance, $p$ might be the identity function on $\mathbb R$, in which case the two conditional expectations coincide. There are also situations where the inequality does hold: for instance, if $X=Y$, $Y$ is not degenerate, and $p$ is a constant function, then $\mathbb E[Y\mid p(X)]=\mathbb E[Y]$ is constant while $\mathbb E[Y\mid X]=Y$ is not.
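The constant-$p$ counterexample can be checked numerically. Below is a minimal sketch (the distribution of $X$, uniform on $\{0,1,2\}$, is my own choice for illustration): with $Y=X$, the conditional expectation $E[Y\mid X]$ is estimated by averaging $Y$ within each level of $X$, while a constant $p$ generates the trivial $\sigma$-algebra, so $E[Y\mid p(X)]$ is just the overall mean $E[Y]$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: Y = X, with X uniform on {0, 1, 2}; p is any constant function.
x = rng.integers(0, 3, size=100_000)
y = x.copy()

# E[Y | X]: the average of y within each level of x (here it is exactly x itself).
e_y_given_x = np.array([y[x == v].mean() for v in range(3)])

# p(X) is constant, so sigma(p(X)) is trivial and E[Y | p(X)] = E[Y].
e_y_given_p = y.mean()

print(e_y_given_x)  # varies with x: [0., 1., 2.]
print(e_y_given_p)  # a single constant, approximately 1.0
```

Since $E[Y\mid X]$ takes three different values while $E[Y\mid p(X)]$ is a single number, the two conditional expectations clearly differ here.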