Regular conditional probability of sum of Bernoulli random variables given zero set


I'm reading Achim Klenke's *Probability Theory* (https://link.springer.com/book/10.1007/978-1-4471-5361-0) and trying to understand regular conditional distributions. To that end, I'm trying to verify the definition in the following example:

Let $X$ be uniformly distributed on $[0, 1]$ and, given $X=x$, let $(Y_i)_{i=1}^n$ be independent Bernoulli-distributed with parameter $x$. Klenke states that the regular conditional distribution of $Y = (Y_1, \dots, Y_n)$ given $X$ is $\kappa_{Y,X}(x,\cdot)= \mathbb{P}[Y \in \cdot \mid X=x] = (\mathrm{Ber}(x))^{\otimes n}$ for almost all $x\in [0,1]$.
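For concreteness, on a singleton $B = \{y\}$ this claimed product kernel is just a product of Bernoulli weights:

```latex
% For y = (y_1, \dots, y_n) \in \{0,1\}^n with k = \sum_{i=1}^n y_i:
(\mathrm{Ber}(x))^{\otimes n}(\{y\})
  = \prod_{i=1}^n x^{y_i} (1-x)^{1-y_i}
  = x^{k} (1-x)^{n-k}.
```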

He defines a Markov kernel $\kappa \colon E' \times \mathcal{E} \to [0,1]$ as a map such that

1. $x \mapsto \kappa(x, B)$ is $\mathcal{E}'$-measurable for every $B \in \mathcal{E}$, and
2. $B \mapsto \kappa(x, B)$ is a probability measure on $(E, \mathcal{E})$ for every $x \in E'$,

and a regular conditional distribution of $Y$ given $X$ as a Markov kernel $\kappa_{Y,X}$ satisfying $\kappa_{Y,X}(X(\omega), B) = \mathbb{P}[Y \in B \mid X](\omega)$ for $\mathbb{P}$-almost all $\omega$ and all $B \in \mathcal{E}$.

Note that in the book the sets $A$ and $B$ switch roles between the two definitions for some reason.
I think we choose $(E',\mathcal{E}')=([0,1],\mathcal{B}(\mathbb{R})|_{[0,1]})$ and $(E,\mathcal{E})=(\{0,1\}^n,\mathcal{P}(\{0,1\}^n))$, since $Y$ takes values in $\{0,1\}^n$. Then we have to show that $$\int_{\Omega} 1_{\{Y \in B\}} 1_A \, d\mathbb{P} = \int_{\Omega} \kappa_{Y,X}(X,B) \, 1_A \, d\mathbb{P} \ \ (1)$$ for all $B = \{y\}$, $y \in \{0,1\}^n$, and $A = \{X \in [0,t]\} \in \sigma(X)$, $t \in [0,1]$, by a generator argument. But I'm really confused about how to plug $(\mathrm{Ber}(x))^{\otimes n}$ and $\kappa_{Y,X}(X(\omega),B)$ together so that the domain of integration is still $\Omega$. Could someone give me a hint on how to proceed with the RHS of (1)?
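As a sanity check (not the derivation I'm after), both sides of (1) can be compared by Monte Carlo for one concrete choice of $B = \{y\}$ and $A = \{X \le t\}$, plugging in the claimed kernel $\kappa_{Y,X}(x, \{y\}) = x^{k}(1-x)^{n-k}$ with $k = \sum_i y_i$; the specific values of `n`, `y`, and `t` below are arbitrary:

```python
import numpy as np

# Monte Carlo check of (1) for B = {y}, A = {X <= t}.
# LHS: E[ 1_{Y = y} 1_{X <= t} ]
# RHS: E[ kappa(X, {y}) 1_{X <= t} ] with kappa(x, {y}) = x^k (1-x)^(n-k).
rng = np.random.default_rng(0)

n, t = 3, 0.7
y = np.array([1, 0, 1])          # an arbitrary point of {0,1}^n
k = y.sum()
N = 1_000_000

X = rng.uniform(size=N)                                   # X ~ U[0,1]
Y = (rng.uniform(size=(N, n)) < X[:, None]).astype(int)   # Y_i | X=x iid Ber(x)

lhs = np.mean((Y == y).all(axis=1) & (X <= t))            # P[Y = y, X <= t]
rhs = np.mean(X**k * (1 - X)**(n - k) * (X <= t))         # RHS of (1)

print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

Both estimates match to within simulation error, so the claimed kernel does seem to satisfy (1); I just can't produce the derivation.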