I've gotten so caught up in measure-theoretic probability that I'm actually having trouble showing this simple result. Let $X$ be an integrable random variable. Then $$ \mathrm E[X \mid X=x] = \int_{\Omega} X(\omega)\, P^X(\mathrm d\omega \mid x) = \int_{X(\Omega)} t \, P_{X\mid X}(\mathrm dt \mid x) = ? $$ The first equality is the definition of the conditional expectation of a random variable given an event of zero probability, so $P^X(\cdot \mid \cdot)$ is the regular conditional probability of $P$ given $X$. I then tried to push this integral forward onto the range of $X$ using the conditional distribution of $X$ given $X$, $P_{X\mid X}(\cdot \mid \cdot)$, but it's not clear to me why either of these integrals equals $x$.
I'm clearly missing something pretty obvious and would appreciate an extra eye!
Edit: this is for a previous version of the question with no independence assumptions.
Start from the definitions. You are trying to find $E[X\mid\sigma(X)]$, which by definition must be $\sigma(X)$-measurable and satisfy $\int_A E[X\mid\sigma(X)]\,dP = \int_A X\,dP$ for all $A\in\sigma(X)$. Try $E[X\mid\sigma(X)]=X$ and verify it satisfies both requirements: $X$ is trivially $\sigma(X)$-measurable, and the integral identity holds because the integrands are equal. Now $\{X=x\} = X^{-1}(\{x\}) \in \sigma(X)$, and on this event the version $E[X\mid\sigma(X)]=X$ takes the constant value $x$; it follows that $E[X\mid X=x]=x$.
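If it helps intuition, here is a minimal numerical sanity check of the conclusion in the discrete case, where conditioning on $\{X=x\}$ is just averaging over that event (the setup below, including the choice of a uniform die, is purely illustrative):

```python
import numpy as np

# Discrete sanity check of E[X | X = x] = x:
# for each value x that X takes, the empirical average of X
# over the event {X = x} is exactly x, since every sample
# on that event equals x by definition.
rng = np.random.default_rng(0)
X = rng.integers(1, 7, size=100_000)  # X uniform on {1, ..., 6}

for x in np.unique(X):
    cond_mean = X[X == x].mean()  # empirical E[X | X = x]
    assert cond_mean == x
```

Of course this is trivial by construction, which is exactly the content of the measure-theoretic argument: $X$ itself is a version of $E[X\mid\sigma(X)]$.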