Conditional expectation and Bayes' rule


I saw this equation from a book:

$P(y \ | \ x) \ = \ \displaystyle \frac{\sum_s P(y, x, s)}{\sum_{y,s} P(y, x, s)}$

The author said this conditional expectation is based on Bayes' rule, but I just couldn't get it. Would you please explain why the LHS equals the RHS?

Best answer:

Recall the definition of conditional probability; for two events $A,B$, we have

$$P(A\mid B)=\frac{P(A\cap B)}{P(B)}=\frac{P(A\cap B)}{P(A\cap B)+P(A^c\cap B)}$$

where the equality in the denominator is due to the law of total probability.

For random variables $X,Y$, the same expression would be

$$P(Y=y\mid X=x)=\frac{P(X=x,Y=y)}{P(X=x)}=\frac{P(X=x,Y=y)}{\sum_yP(X=x,Y=y)}$$
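To see this formula in action, here is a small numerical sketch: it conditions on $X=x$ using a hypothetical joint distribution table (the specific probabilities are made up for illustration).

```python
# Hypothetical joint distribution P(X = x, Y = y) over x in {0, 1}
# and y in {0, 1, 2}; the values are an assumption for illustration.
joint_xy = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.25, (1, 1): 0.15, (1, 2): 0.20,
}

x, y = 0, 1
# P(X = x) = sum_y P(X = x, Y = y)   (law of total probability)
p_x = sum(p for (xi, yi), p in joint_xy.items() if xi == x)
# P(Y = y | X = x) = P(X = x, Y = y) / P(X = x)
p_y_given_x = joint_xy[(x, y)] / p_x
print(p_y_given_x)  # approximately 0.5 (= 0.2 / 0.4)
```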

where $P(X=x,Y=y)$ is the joint probability that $X=x$ and $Y=y$.

Your question seems to consider a third random variable $S$, in which case you would expand each of the probabilities above over all possible values of $S$ (again by the law of total probability), so that

$$P(Y=y\mid X=x)=\frac{\sum_sP(X=x,Y=y,S=s)}{\sum_{y,s}P(X=x,Y=y,S=s)}$$
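The same identity can be checked numerically: build a hypothetical joint $P(X,Y,S)$ (the weights below are an arbitrary choice for illustration), compute the two sums from the formula, and compare against conditioning after first marginalizing $S$ out.

```python
import itertools

# Hypothetical joint P(X, Y, S) over small finite ranges; the
# weights are an assumption for illustration only.
X_VALS, Y_VALS, S_VALS = range(2), range(3), range(2)
weights = {(xi, yi, si): xi + 2 * yi + si + 1
           for xi, yi, si in itertools.product(X_VALS, Y_VALS, S_VALS)}
total = sum(weights.values())
joint = {k: w / total for k, w in weights.items()}  # normalize to sum to 1

x, y = 1, 2
# Numerator: sum_s P(X = x, Y = y, S = s)
num = sum(joint[(x, y, si)] for si in S_VALS)
# Denominator: sum_{y,s} P(X = x, Y = y, S = s) = P(X = x)
den = sum(joint[(x, yi, si)] for yi in Y_VALS for si in S_VALS)
p_y_given_x = num / den

# Cross-check: marginalize S out first, then condition on X = x directly.
joint_xy = {(xi, yi): sum(joint[(xi, yi, si)] for si in S_VALS)
            for xi in X_VALS for yi in Y_VALS}
check = joint_xy[(x, y)] / sum(joint_xy[(x, yi)] for yi in Y_VALS)
assert abs(p_y_given_x - check) < 1e-12
```

Both routes give the same answer because summing over $s$ and then over $y$ is the same as summing over both at once.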