If $X$ is a discrete random variable and $B$ is an event, the law of total probability says that whenever $A_i, i \in \Lambda$ form a partition of $\Omega$, we have $$\mathbb{P}(B)=\sum_{i \in \Lambda}\mathbb{P}(B|A_i)\mathbb{P}(A_i)$$ and we can deduce that $$\mathbb{E}(X)=\sum_{i \in \Lambda}\mathbb{E}(X|A_i)\mathbb{P}(A_i)$$
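As a quick numerical sanity check of the discrete version (the die example here is my own, for illustration): take $X$ to be a fair die roll, partitioned by $A_1 = \{X \text{ even}\}$ and $A_2 = \{X \text{ odd}\}$.

```python
# Sanity check of E[X] = sum_i E[X|A_i] P(A_i) for a fair die,
# partitioned into even and odd outcomes (a hypothetical example).
faces = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # probability of each face

# Direct expectation: E[X]
e_direct = sum(x * p for x in faces)

# Via the partition: E[X|even] P(even) + E[X|odd] P(odd)
even = [x for x in faces if x % 2 == 0]
odd = [x for x in faces if x % 2 == 1]
e_partition = (sum(even) / len(even)) * 0.5 + (sum(odd) / len(odd)) * 0.5

print(e_direct, e_partition)  # both equal 3.5
```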
I know that for continuous random variables $T$ and $X$ (with $T$ supported on $[0,\infty)$) we have $$\mathbb{E}(X)=\int_0^{\infty}\mathbb{E}(X|T=s)f_T(s)ds$$
which can be proven by a limiting argument.
Is there something similar for the probability? Something along the lines of
$$\mathbb{P}(X<x)=\int_0^{\infty}\mathbb{P}(X<x|T=s)f_T(s)ds$$
The short answer is to apply the expectation result to the indicator random variable $$1_{\{X<x\}}.$$ Since $\mathbb{E}(1_{\{X<x\}})=\mathbb{P}(X<x)$ and $\mathbb{E}(1_{\{X<x\}}\mid T=s)=\mathbb{P}(X<x\mid T=s)$, this yields exactly the expression you have formulated.
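To see the continuous identity in action, here is a Monte Carlo sketch under an assumed model (my choice, purely for illustration): $T \sim \mathrm{Exp}(1)$ and $X \mid T=s \sim \mathcal{N}(s, 1)$, so $\mathbb{P}(X<x\mid T=s) = \Phi(x-s)$ with $\Phi$ the standard normal CDF.

```python
import math
import random

random.seed(0)

# Assumed model for illustration: T ~ Exp(1), X | T=s ~ Normal(s, 1).
# Then P(X < x) = integral_0^inf Phi(x - s) e^{-s} ds.

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

x = 1.5

# Right-hand side: Riemann-sum approximation of the integral
# of P(X < x | T = s) f_T(s) over s in [0, 30].
ds = 1e-4
rhs = sum(phi(x - i * ds) * math.exp(-i * ds) * ds
          for i in range(int(30 / ds)))

# Left-hand side: Monte Carlo estimate of P(X < x)
n = 200_000
hits = 0
for _ in range(n):
    t = random.expovariate(1.0)      # draw T
    x_sample = random.gauss(t, 1.0)  # draw X given T = t
    hits += x_sample < x
lhs = hits / n

print(lhs, rhs)  # the two estimates should agree to roughly 2 decimals
```

The Monte Carlo error is on the order of $1/\sqrt{n}$, so the two numbers match to a couple of decimal places rather than exactly.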