Suppose we have a random variable (RV) $X$ defined on a measurable space $\mathcal{M} = (\Omega, \Sigma)$. Suppose we equip the measurable space with a probability measure $P$ and associated expectation operator $E$ such that for all $\theta \in \mathbb{R}$ we have $E[e^{\theta X}] < \infty$. Then, for all bounded, continuous $f : \mathbb{R} \rightarrow \mathbb{R}$ define $F_{\theta}$ as follows:
$$F_{\theta}(f) = \dfrac{E[f(X)e^{\theta X}]}{E[e^{\theta X}]}$$
Question: Show that there exists a probability measure $P_{\theta}$ defined on $\mathcal{M}$ such that for all bounded continuous functions $f$ the following identity holds:
$$F_{\theta}(f) = E_{\theta}[f(X)]$$
where $E_{\theta}[Y]$ is the expectation of any RV $Y$ w.r.t. $P_{\theta}$.
Background: Around 45 minutes into this video, the lecturer defines an operator $E_{\theta}$ w.r.t. a moment-generating function and makes a brief argument as to why this operator is in fact the expectation of an implicitly defined probability measure $P_{\theta}$. I can't see how this conclusion can be made.
Note: If the above result is, or relies on, some known theorem I'd be delighted if someone could point me in the direction of that.
Note: in expressions such as $f(X)$ and $e^{\theta X}$ we are composing the RV $X$ with continuous functions to build new RVs. Also, in the expression $f(X)e^{\theta X}$ we use both composition with $X$ and multiplication of two RVs to get a new RV.
Note: I'm not sure if maybe we need to restrict $\theta > 0$. I'm thinking about it...
Using the definition of expectation and the transfer theorem, we have:
$$ F_{\theta}(f) := \frac{\mathbb{E}[f(X)e^{\theta X}]}{\mathbb{E}[e^{\theta X}]} = \frac{1}{\mathbb{E}[e^{\theta X}]} \int_{\omega \in \Omega} f(X(\omega))e^{\theta X(\omega)}d\mathbb{P}(\omega) \tag{1}$$
Similarly, we can compute $\mathbb{E}_{\theta}[f(X)]$:
$$ \mathbb{E}_{\theta}[f(X)] = \int_{\omega \in \Omega} f(X(\omega)) d\mathbb{P}_{\theta}(\omega) \tag{2} $$
For (1) and (2) to be equal, we need the identity:
$$\int_{\omega \in \Omega} f(X(\omega)) d\mathbb{P}_{\theta}(\omega) = \int_{\omega \in \Omega} f(X(\omega))\frac{e^{\theta X(\omega)}d\mathbb{P}(\omega)}{\mathbb{E}[e^{\theta X}]} \tag{3} $$
A sufficient condition for equality (3) to hold is to have
$$ d\mathbb{P}_{\theta}(\omega) := \frac{e^{\theta X(\omega)}}{\mathbb{E}[e^{\theta X}]}d\mathbb{P}(\omega) \quad \forall \omega \in \Omega \tag{4} $$
Or, written differently:
$$ \mathbb{P}_{\theta}(A) := \int_{\omega \in A} \frac{e^{\theta X(\omega)}}{\mathbb{E}[e^{\theta X}]}d\mathbb{P}(\omega) \quad \forall A\in \Sigma \tag{4'}$$
Now if condition (4) is fulfilled, you can check that $\mathbb{P}_{\theta}$ defines a probability measure on $(\Omega,\Sigma)$: it is non-negative, it is countably additive by the countable additivity of the integral, and $\mathbb{P}_{\theta}(\Omega) = \mathbb{E}[e^{\theta X}]/\mathbb{E}[e^{\theta X}] = 1$. By construction it then satisfies $F_{\theta}(f) = \mathbb{E}_{\theta}[f(X)]$.
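If it helps to see the identity concretely, here is a small numerical sanity check of (4') on a finite sample space (a sketch of my own; the values of $X$, the probabilities, $\theta$, and the choice of bounded continuous $f$ are arbitrary, not from the answer above):

```python
import numpy as np

theta = 0.7

# X takes finitely many values x with probabilities p (the measure P).
x = np.array([-1.0, 0.0, 0.5, 2.0])
p = np.array([0.1, 0.4, 0.3, 0.2])

f = np.tanh  # any bounded continuous f will do

# Tilted measure (4'): P_theta({omega}) = e^{theta X(omega)} P({omega}) / E[e^{theta X}]
w = np.exp(theta * x) * p
p_theta = w / w.sum()

# Left side: F_theta(f) = E[f(X) e^{theta X}] / E[e^{theta X}]
F_theta = np.sum(f(x) * np.exp(theta * x) * p) / np.sum(np.exp(theta * x) * p)
# Right side: E_theta[f(X)] under the tilted measure
E_theta = np.sum(f(x) * p_theta)

assert np.isclose(p_theta.sum(), 1.0)  # P_theta is a probability measure
assert np.isclose(F_theta, E_theta)    # F_theta(f) = E_theta[f(X)]
```

The two assertions are exactly the two things to check: $\mathbb{P}_{\theta}$ has total mass $1$, and the tilted expectation reproduces $F_{\theta}(f)$.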
Intuitively, what we want is to find a measure $\mathbb{P}_{\theta}$ under which the expectation is such that $\mathbb{E}_{\theta}[f(X)] = \frac{\mathbb{E}[f(X)e^{\theta X}]}{\mathbb{E}[e^{\theta X}]}$. The naïve way one might try to find such a measure would be to simply "pull out" the $e^{\theta X}$ in the expectation term and set $``\mathbb{P}_{\theta} = \frac{e^{\theta X}}{\mathbb{E}[e^{\theta X}]} \times \mathbb{P}"$. The law of the unconscious statistician (which has quite a fitting name here) allows us to write that intuition in a more formal way and shows that it is (somewhat) correct. Regarding your note asking for a known theorem: this construction is known as exponential tilting, and the factor $\frac{e^{\theta X}}{\mathbb{E}[e^{\theta X}]}$ in (4) is precisely the Radon–Nikodym derivative $\frac{d\mathbb{P}_{\theta}}{d\mathbb{P}}$.
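As a concrete worked instance (my own example, not from the lecture): take $X \sim \mathcal{N}(0,1)$ under $\mathbb{P}$, so $\mathbb{E}[e^{\theta X}] = e^{\theta^2/2}$. Then the density of $X$ under $\mathbb{P}_{\theta}$ is

$$\frac{e^{\theta x}}{e^{\theta^2/2}} \cdot \frac{1}{\sqrt{2\pi}} e^{-x^2/2} = \frac{1}{\sqrt{2\pi}} e^{-(x-\theta)^2/2},$$

i.e. $X \sim \mathcal{N}(\theta, 1)$ under $\mathbb{P}_{\theta}$: tilting by $e^{\theta X}$ shifts the mean by $\theta$ while leaving the variance unchanged. Note also that this example works for every $\theta \in \mathbb{R}$, which suggests no restriction to $\theta > 0$ is needed as long as the MGF is finite.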