Joint and conditional p.d.f.'s


Suppose we have a joint p.d.f. $f(x,y)$ of a random vector $(X,Y)$, and let $f(x)$ and $f(y)$ be the marginal p.d.f.'s. Assume $(X,Y)$ is continuous. I want to ask what the difference is between the random variables $X|Y=y$ and $X|Y\ge y$.

For the former we conveniently define the conditional p.d.f. $f(x|y)=\dfrac{f(x,y)}{f(y)}$, so to get a probability such as $\mathbb{P}(a<X<b|Y=y_1)$ we compute $\int_{a}^{b}f(x|y_1)\,dx$. Why can't we calculate this as $\dfrac{\mathbb{P}(a<X<b \cap Y=y_1)}{\mathbb{P}(Y=y_1)}$? The reason we cannot is that the denominator is $0$, since $Y$ is continuous and so $\mathbb{P}(Y=y_1)=0$. To bypass this, we define the conditional p.d.f. and obtain the required probability by integrating between the appropriate limits. So what does plugging $y_1$ into $f(x|y_1)$ actually mean?

Second, to get $\mathbb{P}(a<X<b|Y\ge y_1)$, why don't we define its p.d.f. and integrate within limits to get the probability? Why do we instead compute $\dfrac{\mathbb{P}(a<X<b \cap Y\ge y_1)}{\mathbb{P}(Y\ge y_1)}$?

Can someone help me with this?
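To make the distinction concrete, here is a minimal Monte Carlo sketch (my own illustrative example, not from the question) using a hypothetical bivariate normal $(X,Y)$ with correlation $0.8$. The event $\{Y \ge y_1\}$ has positive probability, so the elementary ratio definition of conditional probability applies directly; the event $\{Y = y_1\}$ has probability zero, so it is approximated by conditioning on a thin band $|Y - y_1| < \varepsilon$, which is exactly the limiting idea behind the conditional p.d.f. $f(x|y_1)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: (X, Y) bivariate standard normal, correlation 0.8.
n = 2_000_000
y = rng.standard_normal(n)
x = 0.8 * y + np.sqrt(1 - 0.8**2) * rng.standard_normal(n)

a, b, y1 = 0.0, 1.0, 0.5

# P(a < X < b | Y >= y1): {Y >= y1} has positive probability,
# so the elementary ratio P(A ∩ B) / P(B) works directly.
tail = y >= y1
p_tail = np.mean((x > a) & (x < b) & tail) / np.mean(tail)

# P(a < X < b | Y = y1): {Y = y1} has probability 0, so instead we
# condition on a thin band |Y - y1| < eps; as eps -> 0 this ratio
# converges to the integral of f(x|y1) over (a, b).
eps = 0.01
band = np.abs(y - y1) < eps
p_slice = np.mean((x > a) & (x < b) & band) / np.mean(band)

print(p_tail, p_slice)  # the two conditional probabilities differ
```

The two numbers disagree because conditioning on $Y \ge y_1$ averages the slice probabilities $\mathbb{P}(a<X<b|Y=t)$ over all $t \ge y_1$, weighted by $f(t)$, whereas conditioning on $Y = y_1$ uses only the single slice at $y_1$.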