Compute a conditional probability given a random parameter


Let $(\Omega,\mathcal F,P)$ be a probability space. Consider a collection of real random variables $\{X(\gamma)\}_{\gamma\in [0,1]}$ defined on this probability space. Let $Y$ be a random variable taking values in $[0,1]$ and independent of the family $\{X(\gamma)\}_{\gamma\in [0,1]}$. I am trying to show that, given $B\in\mathcal B(\mathbb R)$,

$$P[X(Y)\in B\mid \sigma(Y)]=g \circ Y \quad (1)$$

with $\gamma\mapsto g(\gamma)=P[X(\gamma)\in B]$.

By the Doob–Dynkin lemma we know there exists measurable $g$ satisfying $(1)$, so the question is how to identify this $g$ with $\gamma\mapsto P[X(\gamma)\in B]$.

Any ideas on how to proceed? Thanks a lot for your help.
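As a sanity check (not a proof), identity $(1)$ can be tested numerically in a concrete special case. The model below is an assumption chosen purely for illustration: $X(\gamma,\omega)=\gamma+Z(\omega)$ with $Z\sim N(0,1)$, $Y\sim\mathrm{Uniform}[0,1]$ independent of $Z$, and $B=(-\infty,1]$, so that $g(y)=\Phi(1-y)$.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n = 200_000

# Illustrative model (an assumption, not part of the question):
# X(gamma, omega) = gamma + Z(omega) with Z ~ N(0, 1),
# Y ~ Uniform[0, 1] independent of Z, and B = (-inf, 1].
Z = rng.standard_normal(n)
Y = rng.uniform(0.0, 1.0, n)
X_of_Y = Y + Z  # the composed variable X(Y)

def g(y):
    # For this model, g(y) = P[X(y) <= 1] = Phi(1 - y),
    # the standard normal CDF evaluated at 1 - y.
    return 0.5 * (1.0 + erf((1.0 - y) / sqrt(2.0)))

# Conditioning on Y landing in a small bin around y0 approximates
# P[X(Y) in B | Y = y0]; the Monte Carlo estimate should match g(y0).
for y0 in (0.2, 0.5, 0.8):
    mask = np.abs(Y - y0) < 0.01
    est = np.mean(X_of_Y[mask] <= 1.0)
    print(f"y0={y0}: MC estimate {est:.3f} vs g(y0)={g(y0):.3f}")
```

The agreement of the binned estimates with $g(y_0)$ is consistent with $(1)$, but of course says nothing about the general measurable family.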


On BEST ANSWER

You may use a version of the functional monotone class theorem found in Durrett (Theorem 6.1.3 on page 235), assuming that $(y,\omega)\mapsto X(y,\omega)$ is $\mathcal{B}([0,1])\otimes \mathcal{F}$-measurable.

Consider the class $\mathcal{H}$ of all bounded, nonnegative, measurable functions $\varphi$ on $[0,1]\times \Omega$ such that $$ \mathsf{E}[\varphi(Y,\,\cdot\,)\mid Y]=g(Y)\quad\text{a.s.}, $$ where $g(y)=\mathsf{E}[\varphi(y,\,\cdot\,)]$.

Indicators of measurable rectangles $A\times B$, with $A\in\mathcal{B}([0,1])$ and $B\in\mathcal{F}$ independent of $\sigma(Y)$, belong to $\mathcal{H}$ because $$ \mathsf{E}[1_A(Y)1_B\mid Y]=1_A(Y)\mathsf{P}(B). $$

Moreover, $\mathcal{H}$ is closed under addition, multiplication by constants, and increasing limits (by the conditional bounded convergence theorem). It remains to check that $$ \mathcal{P}=\{A\times B:A\in\mathcal{B}([0,1]),\ B\in\mathcal{F},\ B\perp \!\!\! \perp \sigma(Y)\} $$ is a $\pi$-system that contains $[0,1]\times \Omega$. The monotone class theorem then yields that $\mathcal{H}$ contains all bounded, nonnegative functions measurable w.r.t. $\sigma(\mathcal{P})$.


In your case, take $\varphi(y,\omega)=1\{X(y,\omega)\in B\}$ for the given fixed $B\in\mathcal{B}(\mathbb{R})$.