Bayes' Theorem and Law of Total Probability for CDF


The calculation of conditional probability takes the same form for the conditional PDF and the conditional CDF (according to a couple of questionable sources: first, second). I will use rough notation, with just $x$ and $y$:

$$F(x \ | \ y) = \frac{F(x,y)}{F(y)}, \ \ f(x \ | \ y) = \frac{f(x,y)}{f(y)} $$
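One way to sanity-check the CDF version of this definition: $F(x,y)/F(y)$ is exactly $P(X \le x \mid Y \le y)$, i.e. conditioning on the *event* $\{Y \le y\}$ rather than on $Y = y$. A minimal Monte Carlo sketch (the correlated bivariate normal sample is my own illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated bivariate normal sample (illustrative choice)
n = 1_000_000
x = rng.standard_normal(n)
y = 0.6 * x + 0.8 * rng.standard_normal(n)

x0, y0 = 0.5, 0.3  # evaluation point

# Empirical joint and marginal CDFs
F_xy = np.mean((x <= x0) & (y <= y0))  # F(x0, y0)
F_y = np.mean(y <= y0)                 # F(y0)

# Conditional CDF as P(X <= x0 | Y <= y0), two ways
lhs = np.mean(x[y <= y0] <= x0)  # direct conditioning on the event
rhs = F_xy / F_y                 # ratio formula

print(lhs, rhs)
```

Both quantities reduce to the same ratio of counts, so they agree to floating-point precision; the subtlety is only in what "conditional" means (event vs. point conditioning).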

The Bayes' Theorem for probability density functions looks like

$$f(x \ | \ y) = \frac{f(y \ | \ x)f(x)}{f(y)}$$

and can be derived from the second definition above. It looks like the same can be postulated for the cumulative distribution function:

$$F(x \ | \ y) = \frac{F(y \ | \ x)F(x)}{F(y)}$$

Indeed:

$$\frac{F(x,y)}{F(y)} = F(x \ | \ y) = \frac{F(y \ | \ x)F(x)}{F(y)},$$

$$\frac{F(x , y)}{F(x)} = F(y \ | \ x).$$

Now the question: the "law of total probability" for PDFs reads

$$f(x \ | \ y) = \frac{f(y \ | \ x)f(x)}{\int_{\Omega}f(y \ | \ x)f(x)dx}$$

Can it be expressed in terms of CDFs somehow (I don't have $F(y)$)?
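For concreteness, the denominator $\int_{\Omega} f(y \,|\, x) f(x) \, dx = f(y)$ can be evaluated numerically whenever the densities themselves are available. A sketch under an assumed toy model (my choice: $X \sim N(0,1)$, $Y \,|\, X = x \sim N(x,1)$, so marginally $Y \sim N(0,2)$):

```python
import numpy as np
from math import sqrt, pi, exp

# Toy model (my own choice, for illustration only):
# X ~ N(0,1), Y | X = x ~ N(x,1)  =>  marginally Y ~ N(0,2).
def f_x(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def f_y_given_x(y, x):
    return np.exp(-(y - x)**2 / 2) / np.sqrt(2 * np.pi)

y0 = 0.7

# Marginalize over x on a grid: f(y0) = ∫ f(y0|x) f(x) dx
xs = np.linspace(-10.0, 10.0, 4001)
dx = xs[1] - xs[0]
f_y0 = float(np.sum(f_y_given_x(y0, xs) * f_x(xs)) * dx)

# Analytic check: density of N(0, 2) at y0
exact = exp(-y0**2 / 4) / sqrt(4 * pi)
print(f_y0, exact)
```

This only works because the model's densities are known in closed form; it does not answer the CDF question by itself, but it shows what the denominator is doing.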


EDIT: maybe some computational approach would help. I know that I can plug a very large $x$ into the joint CDF, $\lim_{x \rightarrow \infty} F(x,y) = F(y)$, with $F(x,y)=F(y\ | \ x)F(x)$, but I can't obtain the joint CDF from my data. I also know that I can estimate the derivative numerically, $f(x) \approx \frac{F(x+\Delta)-F(x-\Delta)}{2\Delta}$, but it would be a poor estimate.
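The finite-difference idea can be sketched as follows; with a central difference the denominator is $2\Delta$, and for a smooth CDF the truncation error is $O(\Delta^2)$. Here a standard normal is used as an illustrative known distribution so the estimate can be compared to the true density (my own choice, not from the data above):

```python
from math import erf, sqrt, exp, pi

def F(x):
    # Standard normal CDF (known closed form, used for illustration)
    return 0.5 * (1 + erf(x / sqrt(2)))

def f_hat(x, delta=1e-4):
    # Central difference: note the 2*delta in the denominator
    return (F(x + delta) - F(x - delta)) / (2 * delta)

x0 = 0.3
true_pdf = exp(-x0**2 / 2) / sqrt(2 * pi)
print(f_hat(x0), true_pdf)
```

With an analytic CDF this is quite accurate; the "poor estimate" concern applies when $F$ is itself an empirical (step-function) estimate, where $\Delta$ must be large relative to the sampling noise.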

What I have: estimates of $F(y \ | \ x)$ and $F(x)$, both as functions of $x$ ($y$ is the actual observed data).