If $\operatorname P_x$ is a family of probability measures and $X$ is a random variable, can we show $\operatorname E_X[1_A\operatorname E_X[g\mid\mathcal F]]=\operatorname E_X[1_Ag]$?


Let

  • $(\Omega,\mathcal A)$ be a measurable space;
  • $(E,\mathcal E)$ be a measurable space;
  • $\pi$ be a Markov kernel with source $(E,\mathcal E)$ and target $(\Omega,\mathcal A)$;
  • $\operatorname P_\mu:=\mu\pi$ for every probability measure $\mu$ on $(E,\mathcal E)$ and $\operatorname P_x:=\operatorname P_{\delta_x}=\pi(x,\;\cdot\;)$ for $x\in E$;
  • $\mathcal F\subseteq\mathcal A$ be a $\sigma$-algebra on $\Omega$;
  • $g:\Omega\to\mathbb R$ be bounded and $\mathcal A$-measurable;
  • $A\in\mathcal F$.

By definition of the conditional expectation, $$\operatorname E_x\left[1_A\operatorname E_x\left[g\mid\mathcal F\right]\right]=\operatorname E_x\left[1_Ag\right]\tag1\;\;\;\text{for all }x\in E.$$ Now let $X$ be an $(E,\mathcal E)$-valued random variable on $(\Omega,\mathcal A)$. Are we able to show that $$\operatorname E_X\left[1_A\operatorname E_X\left[g\mid\mathcal F\right]\right]=\operatorname E_X\left[1_Ag\right]\tag2?$$ If not, can we impose sufficient assumptions under which $(2)$ holds? (Please see my other question for the motivation behind this one.)
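For concreteness, identity $(1)$ can be sanity-checked numerically on a small finite example. All of the specific numbers below (the kernel $\pi$, the function $g$, and the partition generating $\mathcal F$) are made-up illustrations, with $E=\{0,1\}$ and $\Omega=\{0,1,2\}$:

```python
import numpy as np

# Hypothetical finite example: E = {0, 1}, Omega = {0, 1, 2}.
pi = np.array([[0.2, 0.3, 0.5],
               [0.6, 0.1, 0.3]])  # Markov kernel: row x is P_x = pi(x, .) on Omega

g = np.array([1.0, -2.0, 3.0])   # bounded measurable g : Omega -> R

# F is the sigma-algebra generated by the partition {{0, 1}, {2}} of Omega.
blocks = [[0, 1], [2]]

def cond_exp(px, g, blocks):
    """E_x[g | F] as a function on Omega: constant on each partition block."""
    out = np.empty_like(g)
    for B in blocks:
        mass = px[B].sum()
        out[B] = (px[B] * g[B]).sum() / mass  # assumes pi(x, B) > 0
    return out

for x in range(2):
    px = pi[x]
    h = cond_exp(px, g, blocks)
    for B in blocks:  # every A in F is a union of blocks, so blocks suffice
        ind = np.zeros(3); ind[B] = 1.0
        lhs = (px * ind * h).sum()   # E_x[1_A E_x[g|F]]
        rhs = (px * ind * g).sum()   # E_x[1_A g]
        assert np.isclose(lhs, rhs)
print("identity (1) verified on all blocks")
```

By linearity it is enough to check the generating blocks rather than every $A\in\mathcal F$; the question is whether anything analogous survives when the fixed $x$ is replaced by the random $X$.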

For simplicity, I first tried assuming that $X(\Omega)$ is countable, so that $$\operatorname E_X\left[1_A\operatorname E_X\left[g\mid\mathcal F\right]\right]=\sum_{x\in X(\Omega)}\operatorname E_X\left[1_A1_{\{x\}}(X)\operatorname E_x\left[g\mid\mathcal F\right]\right]\tag3.$$ However, in order to make use of $(1)$, it seems we need $1_{\{x\}}(X)$ to be $\mathcal F$-measurable, i.e. that $X$ is $\mathcal F$-measurable ...

Answer:

Keep in mind that $E_X[g|\mathcal{F}]$ doesn't actually depend on $X$. It is an $\mathcal{F}$-measurable function from $\Omega$ to $\mathbb{R}$ that is parameterized by the distribution of $X$, so if $X(\Omega)$ is countable,

$$E_X[1_AE_X[g|\mathcal{F}]] = \sum_{x \in X(\Omega)} E_x[1_A 1_{\{x\}}(X)E_X[g|\mathcal{F}]],$$

which means that the definition of conditional expectation doesn't directly apply here, even if $X$ is $\mathcal{F}$-measurable. You can build some intuition by trying out the above calculation in the case where $\mathcal{F}$ is the trivial $\sigma$-algebra, so that all conditional expectations reduce to unconditional expectations. I'm personally not fond of this notation because it can be a little deceptive; instead, I prefer to write out all of the random variables involved.

Let everything be defined as in the problem statement. Let $Y: (\Omega,\mathcal{A}) \to (\Omega,\mathcal{A})$ be a random element whose distribution conditioned on $X$ is given by $\pi(X,\cdot)$. Let $\mathbb{P}_X$ be the law of $X$. Note that the joint distribution of $X$ and $Y$ is defined by,

$$E[h(X,Y)] = \int_{E}\int_{\Omega} h(x,y)\,\pi(x,dy)\,\mathbb{P}_X(dx),$$

for all bounded, measurable functions $h: (E\times\Omega,\mathcal{E}\otimes\mathcal{A}) \to (\mathbb{R}, \mathcal{B}(\mathbb{R}))$. In particular, for any bounded, measurable $g': (\Omega, \mathcal{A}) \to (\mathbb{R},\mathcal{B}(\mathbb{R}))$, we have $E_X[g'] = E[g'(Y)]$. Then for any $A \in \mathcal{F}$, by the definition of conditional expectation,

$$E_X[1_AE_X[g|\mathcal{F}]] = E[1_AE[g(Y)|\mathcal{F}]] = E[1_Ag(Y)] = E_X[1_Ag],$$

as desired.
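One way to sanity-check this chain of equalities numerically (reusing the made-up finite kernel, function $g$, and partition from the question's setting, and additionally assuming a law $\mu$ for $X$): the marginal law of $Y$ is then the mixture $\mu\pi=\operatorname P_\mu$, and the final identity is just the tower property under that mixture measure.

```python
import numpy as np

# Hypothetical finite setup: E = {0, 1}, Omega = {0, 1, 2}.
pi = np.array([[0.2, 0.3, 0.5],
               [0.6, 0.1, 0.3]])  # kernel: Y's conditional law given X = x is pi(x, .)
mu = np.array([0.4, 0.6])         # assumed law P_X of X on E
g = np.array([1.0, -2.0, 3.0])    # bounded measurable g : Omega -> R
blocks = [[0, 1], [2]]            # F generated by this partition of Omega

pY = mu @ pi                      # marginal law of Y on Omega: the mixture mu * pi

# E[g(Y) | F]: constant on each block under the mixture law
h = np.empty_like(g)
for B in blocks:
    h[B] = (pY[B] * g[B]).sum() / pY[B].sum()

for B in blocks:
    ind = np.zeros(3); ind[B] = 1.0
    lhs = (pY * ind * h).sum()    # E[1_A E[g(Y)|F]]
    rhs = (pY * ind * g).sum()    # E[1_A g(Y)]
    assert np.isclose(lhs, rhs)
print("chain of equalities verified under the mixture law")
```

The key design point the answer makes is visible in the code: the conditional expectation `h` is computed once from the mixture law `pY`, not separately for each value of $X$.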