Chain rule for conditional probability from a measure-theoretic point of view


I am studying probability theory from Durrett's book. Chapter 5 begins with a discussion of conditional probability. I am having difficulty understanding the relationship between the measure-theoretic view of conditional expectation and my undergraduate-level knowledge of probability.

Assume we have three random variables $X$, $Y$, and $Z$. In the undergraduate probability course, we had the chain rule for conditional probability, which says $$ P(X\vert Y) = \int P(Z=z\vert Y)\, P(X\vert Y,Z=z)\, dz. $$ Now I am trying to understand the above equality with measure-theoretic tools. Specifically, we can write $$ P(X\in A \vert \sigma(Y)) = \mathbb{E}[1_{X\in A}\vert \sigma(Y)]. $$ I do not know how to proceed from this point to recover the chain rule for conditional probability. I would really appreciate it if you could help me with this.
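As context: the discrete analogue of this chain rule, $P(X\vert Y)=\sum_z P(Z=z\vert Y)\,P(X\vert Y,Z=z)$, can be checked numerically on a made-up finite joint pmf. The grid sizes and random pmf below are purely illustrative, not from the book:

```python
# Discrete sanity check of the chain rule:
#   p(x|y) = sum_z p(z|y) * p(x|y,z)
import numpy as np

rng = np.random.default_rng(0)
p = rng.random((3, 4, 5))   # joint pmf p(x, y, z) on a 3x4x5 grid
p /= p.sum()                # normalize so it sums to 1

p_y = p.sum(axis=(0, 2))            # p(y),     shape (4,)
p_yz = p.sum(axis=0)                # p(y, z),  shape (4, 5)
p_x_given_y = p.sum(axis=2) / p_y   # p(x|y),   shape (3, 4)
p_z_given_y = p_yz / p_y[:, None]   # p(z|y),   shape (4, 5)
p_x_given_yz = p / p_yz             # p(x|y,z), shape (3, 4, 5)

# Right-hand side of the chain rule: sum over z of p(z|y) * p(x|y,z)
rhs = np.einsum('yz,xyz->xy', p_z_given_y, p_x_given_yz)
print(np.allclose(p_x_given_y, rhs))  # True
```

The identity holds exactly here because $\sum_z \frac{p(y,z)}{p(y)}\frac{p(x,y,z)}{p(y,z)} = \frac{p(x,y)}{p(y)}$; the integral version in the question is the continuous counterpart.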

Best answer:

$$ P(X\in A \vert \sigma(Y)) = \mathbb{E}[1_{X\in A}\vert \sigma(Y)] $$

$$ \overset{\text{Tower property}}{=}\mathbb{E}\bigg(\mathbb{E}[1_{X\in A}\vert \sigma(Y,Z) ] \bigg|\sigma(Y)\bigg) $$ by the tower property, which applies since $\sigma(Y)\subset \sigma(Y,Z)$.

Writing $\mathbb{E}[\,\cdot\,\vert (Y,Z)]$ for $\mathbb{E}[\,\cdot\,\vert \sigma(Y,Z)]$, this reads $$=\mathbb{E}\bigg(\mathbb{E}[1_{X\in A}\vert (Y,Z) ] \bigg|Y\bigg) $$

$$ =\mathbb{E}\bigg(g(Y,Z) \bigg|Y \bigg) $$ where $g(y,z) := \mathbb{E}[1_{X\in A}\vert Y=y, Z=z]$.

Assuming $Z$ has a conditional density given $Y$, denoted $f(Z=t\vert Y)$, this equals $$ =\int g(Y,t)\, f(Z=t\vert Y)\, dt $$ $$ =\int \mathbb{E}\bigg(1_{X\in A}\,\bigg\vert\, Y,Z=t\bigg)\, f(Z=t\vert Y)\, dt $$ $$ =\int P\bigg\{X\in A\,\bigg\vert\, Y,Z=t \bigg\}\, f(Z=t\vert Y)\, dt. $$

So
$$ P(X\in A \vert \sigma(Y)) =\int P\bigg\{X\in A\,\bigg\vert\, Y,Z=t \bigg\}\, f(Z=t\vert Y)\, dt, $$ and in particular, taking $A = (-\infty, x]$, $$ P(X\leq x \vert \sigma(Y)) =\int P\bigg\{X\leq x\,\bigg\vert\, Y,Z=t \bigg\}\, f(Z=t\vert Y)\, dt. $$

If $X$ is continuous, you can differentiate both sides with respect to $x$ to obtain the corresponding identity for the conditional density.
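As a sanity check on the final identity, here is a numerical verification in a hand-picked Gaussian model (the specific model $Z\sim N(0,1)$, $Y\vert Z=z\sim N(z,1)$, $X\vert Y=y,Z=z\sim N(y+z,1)$ is an assumption for illustration, not part of the answer). Standard Gaussian conditioning gives $Z\vert Y=y\sim N(y/2,\,1/2)$ and $X\vert Y=y\sim N(3y/2,\,3/2)$, which supplies the closed-form left-hand side:

```python
# Check: P(X <= x | Y=y) = integral of P(X <= x | Y=y, Z=t) * f(Z=t|Y=y) dt
# in the illustrative model Z ~ N(0,1), Y|Z=z ~ N(z,1), X|Y=y,Z=z ~ N(y+z,1).
import numpy as np
from math import erf, sqrt, pi

def Phi(u):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

def normal_pdf(t, mu, var):
    return np.exp(-(t - mu) ** 2 / (2 * var)) / np.sqrt(2 * pi * var)

x, y = 1.0, 0.5

# LHS: closed form, since X | Y=y ~ N(1.5*y, 1.5) in this model.
lhs = Phi((x - 1.5 * y) / sqrt(1.5))

# RHS: grid approximation of the integral over t, using
# P(X <= x | Y=y, Z=t) = Phi(x - y - t) and Z | Y=y ~ N(y/2, 1/2).
t = np.linspace(-10, 10, 20001)
dt = t[1] - t[0]
integrand = np.array([Phi(x - y - ti) for ti in t]) * normal_pdf(t, y / 2, 0.5)
rhs = integrand.sum() * dt

print(abs(lhs - rhs) < 1e-6)  # True
```

The agreement is limited only by the grid resolution; any other joint distribution with a conditional density for $Z$ given $Y$ would work the same way.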