Let $X$ be a nonnegative random variable on a probability space $(\Omega,\mathscr F, P)$ with $E[X]=1$, so that $$\tilde P(A) = E[1_A X]$$ defines a new probability measure for all $A\in\mathscr F$. Let $\tilde E$ denote expectation taken with respect to the new measure $\tilde{P}$.
Suppose now that $Y$ is also a random variable on $(\Omega,\mathscr F)$. Intuitively, expectations under $\tilde P$ should be computed as $$ \tilde E [1_A Y] = E[1_A YX], $$ but I'm not sure how to prove this rigorously from the definition.
This is a standard exercise in proving identities with the Lebesgue integral. First prove it for simple functions: if $\phi(\omega) = \sum_i a_i \chi_{A_i}(\omega)$, then $$ \int_A \phi(\omega)\, \tilde{P}(d\omega) = \int_A \phi(\omega)X(\omega)\, P(d\omega). $$ Next prove it for non-negative random variables: there is a sequence of non-negative simple functions $\phi_n \uparrow Y$, so apply the previous step together with the monotone convergence theorem (this uses $X \ge 0$). Finally, for a general integrable $Y$, write $Y = Y^+ - Y^-$ and apply the non-negative case to each part; the result for general random variables follows by linearity.
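For the first step, here is a sketch of the computation for a simple function $\phi = \sum_i a_i \chi_{A_i}$ with $A_i \in \mathscr F$; it is just linearity of the integral together with the definition of $\tilde P$:

$$\begin{aligned}
\int_A \phi \, d\tilde P
  &= \sum_i a_i \, \tilde P(A \cap A_i)
  && \text{(linearity)} \\
  &= \sum_i a_i \, E\bigl[ 1_{A \cap A_i}\, X \bigr]
  && \text{(definition of } \tilde P\text{)} \\
  &= E\Bigl[ 1_A \Bigl( \textstyle\sum_i a_i 1_{A_i} \Bigr) X \Bigr]
  && \text{(linearity of } E\text{)} \\
  &= \int_A \phi \, X \, dP.
\end{aligned}$$

For the monotone convergence step, note that if $\phi_n \uparrow Y$ pointwise with $Y \ge 0$, then also $\phi_n X \uparrow Y X$ pointwise (since $X \ge 0$), which is exactly what lets you pass to the limit on both sides of the identity at once.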