Let $X$ be a random variable and $A$ an event. I am trying to prove $$E(X)=E(X|A)P(A)+E(X|A^c)P(A^c).$$
Assuming $X$ is a continuous random variable, if $$E(X|A)=\int_{-\infty}^{\infty}x\,f_{X|A}(x)\,dx=\int_{-\infty}^{\infty}x\,\frac{f_{X,A}(x)}{P(A)}\,dx,$$ $$E(X|A^c)=\int_{-\infty}^{\infty}x\,f_{X|A^c}(x)\,dx=\int_{-\infty}^{\infty}x\,\frac{f_{X,A^c}(x)}{P(A^c)}\,dx,$$ and $f_{X,A}(x)+f_{X,A^c}(x)=f_X(x)$ all hold, then the equality follows immediately by adding the two integrals. But the problem is that I don't know whether a "joint density" $f_{X,A}(x)$ of a random variable and an event is even well defined here, or whether $f_{X,A}(x)+f_{X,A^c}(x)=f_X(x)$ holds.
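Not a proof, but a quick numerical sanity check may help: the identity is the empirical fact that an overall sample mean is the probability-weighted average of the within-group means. A minimal sketch, where $X \sim N(0,1)$ and $A = \{X > 1\}$ are both chosen arbitrarily for illustration:

```python
# Hypothetical sanity check of E(X) = E(X|A)P(A) + E(X|A^c)P(A^c),
# with X ~ Normal(0, 1) and A = {X > 1} (arbitrary choices).
import random

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(200_000)]

in_A = [x for x in samples if x > 1.0]    # samples where A occurred
in_Ac = [x for x in samples if x <= 1.0]  # samples where A^c occurred

p_A = len(in_A) / len(samples)            # empirical P(A)
p_Ac = 1.0 - p_A                          # empirical P(A^c)

e_X = sum(samples) / len(samples)         # empirical E(X)
e_X_given_A = sum(in_A) / len(in_A)       # empirical E(X|A)
e_X_given_Ac = sum(in_Ac) / len(in_Ac)    # empirical E(X|A^c)

rhs = e_X_given_A * p_A + e_X_given_Ac * p_Ac
print(abs(e_X - rhs))  # agrees up to floating-point rounding
```

For the empirical measure the identity is exact, since $\sum_i x_i = \sum_{i \in A} x_i + \sum_{i \in A^c} x_i$; that same partition of the total is what the proof needs to formalize for a general distribution.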
I've seen this conditional-expectation identity before, and intuitively it makes good sense, but I would like to see a rigorous proof. Do I have to dig into measure theory to prove it? Thanks!