Bounty Edit: Given the nature of the problem at hand (i.e. proving that a specific function is measurable), I think this is an easy but relevant problem. In particular, it is relevant to me because I am at a loss when dealing with this kind of problem, which moves from the abstract setting of a book to a concrete setting given by a specific example.
Here is the setup of my problem:
a measurable function $g: (X, \Sigma_X) \to (Y, \Sigma_Y)$;
the image measure $$\hat{g} : (\Delta (X), \Sigma_{\Delta(X)}) \to (\Delta (Y), \Sigma_{\Delta(Y)})\hspace{0.5cm} \text{such that} \hspace{0.5cm} \forall E \in \Sigma_Y, \ \hat{g} ( \pi) (E) = \pi \circ g^{-1} (E),$$ where $\Delta (X)$ (resp. $\Delta (Y)$) is the set of all probability measures over $X$ (resp. $Y$), and $\Sigma_{\Delta(\cdot)}$ denotes the $\sigma$-algebra of the corresponding set in the subscript, generated by sets of the form $\beta^{p} (E) := \{ \mu \in \Delta (Y) \ | \ \mu (E) \geq p \}$, or $\delta^{p} (F) := \{ \pi \in \Delta (X) \ | \ \pi (F) \geq p \} $, for arbitrary sets $E \in \Sigma_Y$ and $F \in \Sigma_X$, and arbitrary $p \in [0,1]$.
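To see the definition in action, here is a minimal sketch (in Python, with purely illustrative finite spaces and names, none of which come from the problem itself) of the image measure on finite $X$ and $Y$: the formula $\hat{g}(\pi)(E) = \pi( g^{-1} (E))$ reduces to summing the mass of the preimage of $E$.

```python
# A minimal finite-space sketch of the image (pushforward) measure
# hat{g}(pi)(E) = pi(g^{-1}(E)).  All concrete names are illustrative.

def pushforward(pi, g, E):
    """Return hat{g}(pi)(E) = pi(g^{-1}(E)) on a finite space.

    pi : dict mapping points of X to their probability masses
    g  : dict mapping points of X to points of Y
    E  : set of points of Y
    """
    preimage = {x for x in pi if g[x] in E}   # g^{-1}(E)
    return sum(pi[x] for x in preimage)

# Example: X = {0, 1, 2}, Y = {'a', 'b'}, g collapses 0 and 1 to 'a'.
pi = {0: 0.25, 1: 0.25, 2: 0.5}
g = {0: 'a', 1: 'a', 2: 'b'}

print(pushforward(pi, g, {'a'}))       # pi({0, 1}) = 0.5
print(pushforward(pi, g, {'a', 'b'}))  # total mass = 1.0
```

Note that $\hat{g}(\pi)$ is automatically a probability measure on $Y$, since $g^{-1}$ preserves unions, complements, and disjointness.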
In general, I denote measures in $\Delta (X)$ with $\pi$, and measures in $\Delta (Y)$ with $\mu$.
Proposition: $\hat{g}$ is measurable.
Attempted proof: This amounts to proving that, for every $E \in \Sigma_Y$, $\hat{g}^{-1} ( \beta^p (E) ) \in \Sigma_{\Delta(X)}$, which is the same as proving that there is an $F \in \Sigma_X$ such that $g(F) = E$ and $\hat{g} ( \delta^p (F) ) = \beta^p (E)$. Notice that the definition of the image measure implies that, for every $E \in \Sigma_Y$, there is an $F \in \Sigma_X$ such that $g(F) = E$ and $\mu (E) \equiv \hat{g} (\pi) (E) = \pi \circ g^{-1} (E) \equiv \pi (F)$ (Right?). Thus, we simply have to prove that $\hat{g} ( \delta^p (F)) = \beta^p (E)$.
To establish the result, the following chain of biconditionals works: $$ \hat{g} ( \delta^p (F)) \Longleftrightarrow \forall \pi \in \delta^p (F) \ \exists \mu \in \beta^p (g(F)) \Longleftrightarrow \exists \mu \in \beta^p (E) \Longleftrightarrow \beta^p (E). $$ $\square$
I am wondering whether the result and my line of reasoning are correct, because I have real trouble seeing exactly what I have to do concretely to prove that a specific function is measurable.
Any help or feedback is most welcome.
Thank you for your time.
In order to make sense of the question, one needs $\sigma(X)$ and $\sigma(Y)$, two $\sigma$-algebras on $X$ and $Y$, to talk about $g$ being measurable. Now assume $g\colon (X, \sigma(X)) \to (Y, \sigma(Y))$ is measurable.
Also, I believe $\Delta(X)$ and $\Delta(Y)$ are the sets of probability measures on $(X, \sigma(X))$ and $(Y, \sigma(Y))$. And $\Sigma_{\Delta(X)}$ should be generated by $$\{\pi\in\Delta(X):\pi(F)\ge p\},\text{ where }F\in\sigma(X)\text{ and }p\in\mathbb{R},$$ and $\Sigma_{\Delta(Y)}$ should be defined likewise.
In order to show the proposition, we need:
Claim. For $F\in \sigma(Y)$ and $p\in\mathbb{R}$, we have $$\bigcup\{\hat{g}^{-1}(\mu) : \mu\in\Delta(Y),\ \mu(F)\ge p\} = \{\pi \in \Delta(X):\pi(g^{-1}(F)) \ge p\}.$$
Proof of Claim
First of all, we show LHS $\subset$ RHS. Pick $\pi\in$ LHS, that is $\pi \in \hat{g}^{-1}(\mu)$ for some $\mu\in\Delta(Y)$ such that $\mu(F)\ge p$. Then $\mu = \hat{g}(\pi)$, and so $\mu(F') = \pi\circ g^{-1}(F')$ for all $F' \in \sigma(Y)$. In particular, $\pi(g^{-1}(F)) = \mu(F)\ge p$, hence $\pi\in$ RHS.
Secondly, we show RHS $\subset$ LHS. If $\pi\in$ RHS, that is, $\pi\in\Delta(X)$ and $\pi(g^{-1}(F))\ge p$, then $\mu\colon \sigma(Y)\to[0,1]$ defined by $\mu(F') = \pi\circ g^{-1}(F')$ for all $F'\in \sigma(Y)$ is indeed a probability measure on $(Y, \sigma(Y))$, namely $\mu\in\Delta(Y)$. Moreover, since $\mu(F)=\pi\circ g^{-1}(F) \ge p$ and $\mu=\hat{g}(\pi)$ (by definition of $\mu$), we know that $\pi\in$ LHS.
Consequence of Claim
In the claim, the LHS is the inverse image under $\hat{g}$ of a generating element of $\Sigma_{\Delta(Y)}$, and the RHS is a generating element of $\Sigma_{\Delta(X)}$, since $g^{-1}(F)\in\sigma(X)$. Since a map is measurable as soon as the inverse images of a generating family of the target $\sigma$-algebra are measurable, $\hat{g}$ is therefore measurable.
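As a sanity check (not a proof), the set identity in the Claim can be verified exhaustively on a small finite example. Everything below — the choice of $X$, $Y$, $g$, $F$, $p$, and the grid of measures — is an illustrative assumption, with masses kept as integer tenths so the arithmetic is exact.

```python
# Compare membership in hat{g}^{-1}(beta^p(F)) with membership in
# delta^p(g^{-1}(F)) over all probability vectors on X = {0, 1, 2}
# with masses in tenths.  The choices of g, F, p are illustrative.
from itertools import product

g = {0: 'a', 1: 'a', 2: 'b'}          # a map g : X -> Y
F = {'a'}                              # F in sigma(Y)
p = 4                                  # threshold p = 0.4, in tenths
pre_F = {x for x in g if g[x] in F}    # g^{-1}(F) = {0, 1}

def pushforward(pi):
    """The image measure hat{g}(pi), as a dict over Y."""
    mu = {}
    for x, mass in pi.items():
        mu[g[x]] = mu.get(g[x], 0) + mass
    return mu

for a, b in product(range(11), repeat=2):
    if a + b > 10:
        continue
    pi = {0: a, 1: b, 2: 10 - a - b}              # masses in tenths
    mu = pushforward(pi)
    in_beta = sum(mu.get(y, 0) for y in F) >= p   # hat{g}(pi)(F) >= p
    in_delta = sum(pi[x] for x in pre_F) >= p     # pi(g^{-1}(F)) >= p
    assert in_beta == in_delta                    # the two sets coincide
```

The two membership tests go through genuinely different computations (one builds the full pushforward measure, the other sums $\pi$ directly over $g^{-1}(F)$), which is exactly the content of the Claim.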