A Markov chain conditional probability question from first principles


I am stuck proving the following result using properties of conditional expectations.

Let random variables $X$, $Y$, and $Z$ be such that for all $A\in\sigma(X)$ and $C\in\sigma(Z)$ almost surely,

$$P(A\cap C|\sigma(Y))=P(A|\sigma(Y))P(C|\sigma(Y))$$

Then show that for any $B\in\sigma(Y)$ with $P(B)>0$, $$P(A\cap C|B)=P(A|B)P(C|B).$$

Here conditional probabilities given a $\sigma$-algebra are interpreted as conditional expectations of indicator random variables, and $P(A|C)=\dfrac{P(A\cap C)}{P(C)}$ whenever $P(C)>0$.

I started by writing out the following: $$ P(A\cap B\cap C)=\int_{B} 1_{A\cap C}\,dP \\ =\int_{B}\mathbb{E}[1_{A\cap C}\mid\sigma(Y)]\,dP \\ =\int_{B}\mathbb{E}[1_{A}\mid\sigma(Y)]\,\mathbb{E}[1_{C}\mid\sigma(Y)]\,dP, $$ but I am unable to manipulate this further.


There are 2 answers below.

BEST ANSWER

Consider the sub-$\sigma$-algebra $\sigma(1_B)\subset \sigma(Y)$ (the inclusion holds because $B\in\sigma(Y)$). Then

$$P(A\cap C|B)=\frac{P(A\cap B\cap C)}{P(B)}=\frac{1}{P(B)}\,\mathbb E\big[\mathbb{E}[1_A 1_C 1_B\mid\sigma(1_B)]\big]$$ $$=\frac{1}{P(B)}\,\mathbb E\big[1_B\,\mathbb{E}[1_A 1_C\mid\sigma(1_B)]\big]=\frac{1}{P(B)}\,\mathbb E\big[1_B\,\mathbb E[1_A\mid\sigma(1_B)]\,\mathbb E[1_C\mid\sigma(1_B)]\big]$$ $$=\frac{1}{P(B)}\int_B P(A\mid\sigma(1_B))\,P(C\mid\sigma(1_B))\,dP=\cdots$$ where, on the event $B$, $$\big[P(A\mid\sigma(1_B))\,P(C\mid\sigma(1_B))\big](\omega)=P(A|B)\,P(C|B).$$

So, $$\cdots=\frac{1}{P(B)}\int_B P(A\mid\sigma(1_B))\,P(C\mid\sigma(1_B))\,dP=\frac{1}{P(B)}\int_B P(A|B)\,P(C|B)\,dP=P(A|B)\,P(C|B).$$
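For reference, the conditional probability given the two-atom $\sigma$-algebra $\sigma(1_B)$ has an explicit form; a quick sketch, assuming $0<P(B)<1$:

$$P(A\mid\sigma(1_B))=P(A\mid B)\,1_B+P(A\mid B^c)\,1_{B^c}\quad\text{a.s.}$$

So on the event $B$ both factors in the last integrand are the constants $P(A|B)$ and $P(C|B)$, which is exactly what the final integral uses.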

ANSWER

We'll assume $Y$ is discrete for simplicity at first (the general case is discussed later).

Consider the following three conditions:

  1. For all $A\in\sigma(X)$ and $C\in\sigma(Z)$, $P(A\cap C|\sigma(Y))=P(A|\sigma(Y))P(C|\sigma(Y))$.

  2. For all $A\in\sigma(X)$ and $C\in\sigma(Z)$ and each atom $B$ of $\sigma(Y)$ with $P(B)>0$ (i.e. for each event $B$ of the form $\{Y=y\}$ with positive probability), $P(A\cap C|B)=P(A|B)P(C|B)$.

  3. For all $A\in\sigma(X)$ and $C\in\sigma(Z)$ and all $B\in\sigma(Y)$ with $P(B)>0$, $P(A\cap C|B)=P(A|B)P(C|B)$.

It seems you are asking how to prove $1\implies 3$, which in fact fails: there is a counterexample, discussed below.

The random variable $P(A\cap C|\sigma(Y))$ is just the random variable that takes the value $P(A\cap C|Y=y)$ on the event $\{Y=y\}$. So conditions 1 and 2 are equivalent, and this common property is called conditional independence.
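In the discrete case this identification can be made explicit; a quick sketch:

$$P(A\cap C\mid\sigma(Y))=\sum_{y}P(A\cap C\mid Y=y)\,1_{\{Y=y\}}\quad\text{a.s.}$$

Expanding $P(A\mid\sigma(Y))$ and $P(C\mid\sigma(Y))$ in the same way and multiplying term by term (the atoms $\{Y=y\}$ are disjoint, so cross terms vanish) shows that the a.s. identity in condition 1 holds exactly when the per-atom identities in condition 2 hold for every atom of positive probability.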

On the other hand, condition 3 is too strong: taking $B=\Omega$ shows that condition 3 implies (unconditional) independence of $X$ and $Z$. For a counterexample to $1\implies 3$, consider a simple random walk on $\mathbb Z$: the walker's position at time $n=2015$ is not independent of its position at time $n=2015+2$. Indeed, knowing the position at time 2015 restricts the possible positions at time 2015+2 to just three values. Nonetheless, the positions at times 2015, 2016, and 2017 together satisfy condition 1 (with $X$, $Y$, $Z$ the positions at times 2015, 2016, 2017 respectively).
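To make the counterexample concrete, here is a minimal exact-enumeration sketch (my own illustration, not part of the original answer) that uses times 2, 3, 4 of a four-step walk as small stand-ins for 2015, 2016, 2017:

```python
from itertools import product
from fractions import Fraction

# Exact distribution of a simple random walk with 4 fair +/-1 steps.
# X, Y, Z are the walker's positions at times 2, 3, 4
# (small stand-ins for the times 2015, 2016, 2017 in the text).
half = Fraction(1, 2)
dist = {}  # (x, y, z) -> exact probability
for steps in product([-1, 1], repeat=4):
    x = steps[0] + steps[1]  # position at time 2
    y = x + steps[2]         # position at time 3
    z = y + steps[3]         # position at time 4
    dist[(x, y, z)] = dist.get((x, y, z), Fraction(0)) + half ** 4

def p(event):
    """Probability of an event given as a predicate on (x, y, z)."""
    return sum((q for xyz, q in dist.items() if event(*xyz)), Fraction(0))

A = lambda x, y, z: x == 0   # an event in sigma(X)
C = lambda x, y, z: z == 0   # an event in sigma(Z)

# Unconditionally, A and C are NOT independent (condition 3 fails for B = Omega):
assert p(lambda x, y, z: A(x, y, z) and C(x, y, z)) != p(A) * p(C)

# But on every atom {Y = y0}, the product rule holds (conditions 1 and 2):
for y0 in (-3, -1, 1, 3):
    pB = p(lambda x, y, z: y == y0)
    lhs = p(lambda x, y, z: A(x, y, z) and C(x, y, z) and y == y0) / pB
    rhs = (p(lambda x, y, z: A(x, y, z) and y == y0) / pB) \
        * (p(lambda x, y, z: C(x, y, z) and y == y0) / pB)
    assert lhs == rhs
print("independence fails unconditionally, holds on every atom of sigma(Y)")
```

Replacing the small times with 2015, 2016, 2017 only changes the amount of enumeration; the Markov property behind the per-atom factorization is the same.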

For general (not necessarily discrete or continuous) real-valued random variables, the analogue of condition 2 is a bit more involved and requires the disintegration of measures, which is usually considered an advanced topic. Condition 1, on the other hand, transfers to the general case without modification, which is part of why condition 1 is usually taken as the definition of conditional independence in general.