How to reason about marginal and joint PMFs in more than two variables?


For a problem asking to generalize the law of iterated expectations, i.e. to show that

$$\mathbf E\,[Z\mid X]=\mathbf E\,[\mathbf E\,[Z\mid X,Y]\mid X],$$

the solution proceeds as follows:

[image of the solution's derivation, with a product of two conditional PMFs underlined and equated to a single underlined conditional PMF]

I can't quite grasp why the first underlined product of PMFs equals the second underlined PMF. With two variables, one can think visually about marginal and joint PDFs/PMFs as surfaces and slices, but that picture breaks down with three or more variables.

Is there a good intuitive way to think about these relationships, or can they only be reasoned about analytically with formulas once we go beyond two variables? Any help or advice would be much appreciated.
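One way to keep intuition beyond two variables is to think of a joint PMF as an array: marginalizing is summing out an axis, and conditioning is taking a slice and renormalizing. Below is a minimal sketch of that picture, using an arbitrary randomly generated joint PMF (the supports, shapes, and variable names are my own choices), which numerically checks the identity $p_{Y\mid X}(y\mid x)\,p_{Z\mid X,Y}(z\mid x,y)=p_{Y,Z\mid X}(y,z\mid x)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A joint PMF of (X, Y, Z) is just a 3-D array of nonnegative numbers
# summing to 1: axis 0 indexes x, axis 1 indexes y, axis 2 indexes z.
p_xyz = rng.random((3, 4, 5))
p_xyz /= p_xyz.sum()

# Marginalizing = summing out axes; conditioning on X=x = slicing
# at x and renormalizing by p_X(x).
p_x = p_xyz.sum(axis=(1, 2))                 # p_X(x)
p_xy = p_xyz.sum(axis=2)                     # p_{X,Y}(x, y)

x = 1
p_yz_given_x = p_xyz[x] / p_x[x]             # p_{Y,Z|X}(y, z | x), a 2-D array
p_y_given_x = p_xy[x] / p_x[x]               # p_{Y|X}(y | x)
p_z_given_xy = p_xyz[x] / p_xy[x][:, None]   # p_{Z|X,Y}(z | x, y)

# The identity in question: the product of the two conditionals
# reassembles the joint conditional on X=x.
assert np.allclose(p_y_given_x[:, None] * p_z_given_xy, p_yz_given_x)
```

Nothing here depends on the number of variables: an $n$-variable PMF is an $n$-dimensional array, and the same sum/slice/renormalize operations apply.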

Accepted answer:

The intuition is that we can imagine we are working in the subspace $\{X=x\}$. Probabilities then relativize to conditional probabilities given $X=x$. In particular, a probability $P(A\mid Y=y)$ conditional on $\{Y=y\}$ will relativize to $P(A\mid Y=y, X=x)$. Similarly, the definition of conditional probability, $P(A\mid B) = \frac{P(A,B)}{P(B)}$, becomes $P(A\mid B,C)= \frac{P(A,B\mid C)}{P(B\mid C)}$.

More formally, if $A,B$ and $C$ are events then $$P(A\mid B\cap C) = \frac{P(A\cap B\cap C)}{P(B\cap C)} = \frac{P(A\cap B\mid C)P(C)}{P(B\mid C)P(C)} = \frac{P(A\cap B\mid C)}{P(B\mid C)},$$ so $$P(A\mid B\cap C)\, P(B\mid C) = P(A\cap B \mid C).$$ So, just let $C=\{X=x\},$ $B = \{Y=y\}$ and $A=\{Z=z\}.$
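This identity is exactly what makes the generalized tower property go through, and it can be sanity-checked numerically. The sketch below (with an arbitrary made-up joint PMF and value grid; all names are mine) computes $\mathbf E\,[Z\mid X=x]$ directly and via $\mathbf E\,[\mathbf E\,[Z\mid X,Y]\mid X=x]$, and confirms they agree:

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary joint PMF of (X, Y, Z); Z takes the values in z_vals.
p_xyz = rng.random((3, 4, 5))
p_xyz /= p_xyz.sum()
z_vals = np.arange(5.0)

p_x = p_xyz.sum(axis=(1, 2))                 # p_X(x)
p_xy = p_xyz.sum(axis=2)                     # p_{X,Y}(x, y)

# Left side: E[Z | X=x] = sum_z z * p_{Z|X}(z|x), one value per x.
p_z_given_x = p_xyz.sum(axis=1) / p_x[:, None]
lhs = p_z_given_x @ z_vals                   # shape (3,)

# Inner expectation: E[Z | X=x, Y=y], one value per (x, y).
p_z_given_xy = p_xyz / p_xy[:, :, None]
inner = p_z_given_xy @ z_vals                # shape (3, 4)

# Outer expectation: average the inner one over p_{Y|X}(y|x).
p_y_given_x = p_xy / p_x[:, None]
rhs = (p_y_given_x * inner).sum(axis=1)      # shape (3,)

# The generalized tower property: E[Z|X] == E[E[Z|X,Y]|X].
assert np.allclose(lhs, rhs)
```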