Conditional Probability of Discrete Independent Events


Let $Y$ and $Z$ be discrete, independent random variables. Then

$$P(Y = i | Y < Z) = P(Y = i)$$

Right? Because $Y$ and $Z$ are independent, the fact that $Y < Z$ doesn't tell us anything about $Y$, right?

On the other hand, using the conditional probability formula:

$$P(Y = i | Y < Z) = \frac{P(Y = i \wedge Y < Z)}{P(Y < Z)}$$

$$P(Y = i | Y < Z) = \frac{P(i < Z)}{P(Y < Z)}$$

$$P(Y = i | Y < Z) = \frac{P(i < Z)}{\sum_{y=0}^{\infty}P(Z > y) \cdot P(Y=y)}$$

Which seems to give a completely different answer that depends on $Z$...

What am I doing wrong here?
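As a quick sanity check that the two answers really do differ, here is a small enumeration with two independent fair dice (my own choice of example, not from the question): the conditional probability $P(Y=1 \mid Y<Z)$ comes out as $1/3$, not the unconditional $P(Y=1)=1/6$.

```python
from fractions import Fraction

# Two independent fair dice Y, Z on {1,...,6} (an illustrative assumption;
# any pair of independent discrete distributions would do).
outcomes = [(y, z) for y in range(1, 7) for z in range(1, 7)]

# Unconditional P(Y = 1): each of the 36 equally likely pairs has mass 1/36.
p_y1 = Fraction(sum(1 for y, z in outcomes if y == 1), 36)

# Conditional P(Y = 1 | Y < Z): ratio of favorable pairs to conditioning pairs.
p_cond = Fraction(sum(1 for y, z in outcomes if y == 1 and y < z),
                  sum(1 for y, z in outcomes if y < z))

print(p_y1)    # 1/6
print(p_cond)  # 1/3
```

So conditioning on $Y < Z$ genuinely changes the distribution of $Y$, which already rules out the first intuition.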

Best answer

The Bernoulli random variable ${\bf 1}_{Y<Z}$ is not independent of $Y$ as noted in the comments.

I don't agree with either of your formulas. It should be the following for $Y$ independent of $Z$:

$$ P(Y=i|Y<Z)=\frac{P(Y=i,Y<Z)}{P(Y<Z)}=\frac{P(Y=i,i<Z)}{P(Y<Z)}=\frac{P(Y=i)P(i<Z)}{P(Y<Z)},$$
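This identity is easy to verify numerically. Below is a sketch using two independent fair dice (an assumed concrete example, not from the post): the joint probability $P(Y=i,\,Y<Z)$ computed by enumerating pairs agrees with the factored form $P(Y=i)\,P(i<Z)$ for every $i$, and the resulting conditional probabilities sum to $1$.

```python
from fractions import Fraction

# Marginals of two independent fair dice (illustrative choice of distribution).
pY = {i: Fraction(1, 6) for i in range(1, 7)}
pZ = {j: Fraction(1, 6) for j in range(1, 7)}

# Denominator P(Y < Z), enumerated over the product of the marginals.
p_y_lt_z = sum(pY[i] * pZ[j] for i in pY for j in pZ if i < j)

cond = {}
for i in pY:
    # P(Y = i, Y < Z): enumerate pairs of the joint distribution directly.
    joint = sum(pY[y] * pZ[j] for y in pY for j in pZ if y == i and y < j)
    # P(Y = i) * P(i < Z): the factored form from the identity above.
    formula = pY[i] * sum(pZ[j] for j in pZ if j > i)
    assert joint == formula
    cond[i] = formula / p_y_lt_z  # P(Y = i | Y < Z)

print(cond[1])  # 1/3
```

Note in particular that `cond[1]` is $1/3$, not $P(Y=1)=1/6$: the conditional distribution is reweighted by the factor $P(i<Z)/P(Y<Z)$, which depends on $Z$.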

where independence was used in the last step. Assuming $Y,Z$ are integer-valued, we have the convolution

$$P(Z-Y=j)=\sum_{k=-\infty}^\infty P(Y=-k)P(Z=j-k),$$

so that the denominator can be expressed in terms of the marginals as

$$P(Z-Y>0)=\sum_{j=1}^\infty \sum_{k=-\infty}^\infty P(Y=-k)P(Z=j-k).$$
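The convolution form of the denominator can also be checked numerically. A minimal sketch, again assuming two independent fair dice as the integer-valued $Y$ and $Z$ (my example, not the answerer's): summing $P(Z-Y=j)$ over $j \ge 1$ via the convolution matches the direct enumeration of $P(Y<Z)$.

```python
from fractions import Fraction

# Marginals of two independent fair dice (illustrative assumption).
pY = {i: Fraction(1, 6) for i in range(1, 7)}
pZ = {j: Fraction(1, 6) for j in range(1, 7)}

def p_diff(j):
    # Convolution: P(Z - Y = j) = sum_k P(Y = -k) P(Z = j - k).
    # Substituting i = -k, this is sum_i P(Y = i) P(Z = j + i).
    return sum(pY[i] * pZ.get(j + i, Fraction(0)) for i in pY)

# For dice, Z - Y > 0 means j runs over 1..5, so the infinite sum is finite.
conv = sum(p_diff(j) for j in range(1, 6))

# Direct enumeration of P(Y < Z) over the product of the marginals.
direct = sum(pY[i] * pZ[j] for i in pY for j in pZ if i < j)

assert conv == direct
print(direct)  # 5/12
```

For general integer-valued $Y$ and $Z$ the outer sum over $j$ would be infinite, but truncating it where the marginals vanish gives the same finite computation.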