Proof of $P(X|Y) = \sum_z P(X,z | Y)$ (check my solution)


Let $X, Y, Z$ be random variables. Prove that $P(X= x|Y = y) = \sum_z P(X= x,Z = z | Y = y)$.

First of all, on the LHS we know that $P(X = x|Y = y) = \frac{P(X = x,Y = y)}{P(Y = y)}$ (assuming $P(Y = y) > 0$).

We can do the same on the RHS: $\sum_z P(X = x,Z = z | Y = y) = \frac{\sum_z P(X = x,Z = z , Y = y)}{P(Y = y)}$

We get: $\frac{P(X = x,Y = y)}{P(Y = y)} = \frac{\sum_z P(X = x,Z = z , Y = y)}{P(Y = y)}$

Cancelling the common denominator $P(Y = y)$, it remains to show: $P(X = x,Y =y) = \sum_z P(X = x ,Z = z , Y = y)$

Then on the RHS we use the chain rule of probability:

$P(X = x,Y = y) = \sum_z P(X = x|Z = z,Y = y)\, P(Y =y|Z = z)\, P(Z = z)$

Summing over $z$ we have $\sum_z P(Z = z) = 1$, and $P(X = x|Z= z,Y = y) = P(X =x|Y = y)$.

I am not sure about $P(Y = y|Z = z)$, but I suppose it is equal to $P(Y = y)$ as well.

Hence we get $P(X = x,Y = y) = P(X = x|Y = y)\, P(Y = y)$, which is correct by definition.
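The chain-rule factorization used above can be sanity-checked numerically. A minimal sketch, assuming a hypothetical joint pmf over $\{0,1\}^3$ (the table values are made up purely for illustration):

```python
import itertools

# Hypothetical joint pmf P(X=x, Z=z, Y=y) over {0,1}^3; the values are
# illustrative only -- any valid joint distribution would do.
joint = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.05,
    (0, 1, 0): 0.20, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20,
    (1, 1, 0): 0.15, (1, 1, 1): 0.10,
}

def p_z(z):
    # Marginal P(Z=z), summing out X and Y.
    return sum(p for (xx, zz, yy), p in joint.items() if zz == z)

def p_y_given_z(y, z):
    # P(Y=y | Z=z) = P(Z=z, Y=y) / P(Z=z)
    return sum(joint[(x, z, y)] for x in (0, 1)) / p_z(z)

def p_x_given_zy(x, z, y):
    # P(X=x | Z=z, Y=y) = P(X=x, Z=z, Y=y) / P(Z=z, Y=y)
    return joint[(x, z, y)] / sum(joint[(xx, z, y)] for xx in (0, 1))

# Chain rule: P(X=x, Z=z, Y=y) = P(X=x | Z=z, Y=y) P(Y=y | Z=z) P(Z=z)
for x, z, y in itertools.product((0, 1), repeat=3):
    rhs = p_x_given_zy(x, z, y) * p_y_given_z(y, z) * p_z(z)
    assert abs(joint[(x, z, y)] - rhs) < 1e-12
```

Note that this only checks the chain-rule factorization itself; the independence assumptions made afterwards ($P(X=x|Z=z,Y=y) = P(X=x|Y=y)$ etc.) do not hold for an arbitrary joint distribution.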



Best answer

We have: $$P(X=x\mid Y=y)P(Y=y)=P(X=x, Y=y)$$ and: $$P(X=x, Z=z\mid Y=y)P(Y=y)=P(X=x, Z=z, Y=y)$$ So if $P(Y=y)>0$ then proving that: $$P(X=x\mid Y=y)=\sum_zP(X=x,Z=z\mid Y=y)$$ comes to the same as proving that: $$P(X=x, Y=y)=\sum_zP(X=x, Z=z, Y=y)\tag1$$

Here $(1)$ is an immediate consequence of the disjoint union: $$\{X=x,Y=y\}=\bigcup_z\{X=x,Z=z,Y=y\}$$ (assuming that $Z$ is a discrete random variable and that $z$ above ranges over a countable set that serves as the support of $Z$).
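The identity itself can also be checked numerically, comparing the two computation orders (sum out $Z$ then divide, versus divide each term then sum). A minimal sketch, again assuming a hypothetical joint pmf over $\{0,1\}^3$ with made-up values:

```python
import itertools

# Hypothetical joint pmf P(X=x, Z=z, Y=y) over {0,1}^3; the values are
# illustrative only -- any valid joint distribution would do.
joint = {
    (0, 0, 0): 0.10, (0, 0, 1): 0.05,
    (0, 1, 0): 0.20, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20,
    (1, 1, 0): 0.15, (1, 1, 1): 0.10,
}

def p_y(y):
    # Marginal P(Y=y), summing out X and Z.
    return sum(p for (xx, zz, yy), p in joint.items() if yy == y)

def p_x_given_y(x, y):
    # LHS: P(X=x | Y=y) -- sum out Z first, then divide by P(Y=y).
    return sum(joint[(x, z, y)] for z in (0, 1)) / p_y(y)

def sum_p_xz_given_y(x, y):
    # RHS: sum_z P(X=x, Z=z | Y=y) -- condition each term first, then sum.
    return sum(joint[(x, z, y)] / p_y(y) for z in (0, 1))

# The two sides agree for every (x, y) with P(Y=y) > 0.
for x, y in itertools.product((0, 1), repeat=2):
    assert abs(p_x_given_y(x, y) - sum_p_xz_given_y(x, y)) < 1e-12
```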