Suppose the random variables $X_1, X_2, X_3$ are IID with support contained in $[0, R]$.
Why is it obvious that
$$ P(X_1 < X_3 | X_1 < X_2) = \int_0^R f_{X_1}(x_1 | X_1 < X_2)P(x_1 < X_3) dx_1 $$ ?
I think I could get to this formula by using Bayes rule, but in my notes, it seems this result was readily known, and it's not clear to me how that is.
Is it some variant of the law of total probability applied to a conditional probability?
Let the events be defined by $A=(X_1\le X_3)$, $B=(X_1\le X_2)$, and $C=(X_2\le X_3)$. Then $P(A\mid B)=\frac{P(A\cap B)}{P(B)}$.
The numerator is
$$P(A\cap B)=P(A\cap B\cap C)+P(A\cap B\cap C')=P(X_1\le X_2\le X_3)+P(X_1\le X_3 < X_2)$$
$$=\int_0^R\int_0^{x_3}\int_0^{x_2}f_1(x_1)f_2(x_2)f_3(x_3)\,dx_1\,dx_2\,dx_3+\int_0^R\int_0^{x_2}\int_0^{x_3}f_1(x_1)f_2(x_2)f_3(x_3)\,dx_1\,dx_3\,dx_2.$$
The denominator is
$$P(B)=P(X_1\le X_2)=\int_0^R\int_0^{x_2}f_1(x_1)f_2(x_2)\,dx_1\,dx_2.$$
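Since the variables are IID, symmetry gives a concrete value to check the derivation against: each of the $3! = 6$ orderings of $(X_1, X_2, X_3)$ is equally likely (for continuous distributions, ties have probability zero), so $P(A\cap B) = P(X_1\le X_2\le X_3) + P(X_1\le X_3<X_2) = 1/6 + 1/6 = 1/3$, $P(B) = 1/2$, and the conditional probability is $2/3$. A minimal Monte Carlo sketch of this check, my own illustration assuming the uniform distribution on $[0, R]$ as a concrete case:

```python
import random

# Sanity check: estimate P(X1 <= X3 | X1 <= X2) for IID uniform
# variables on [0, R]. By the symmetry argument above, the exact
# value is (1/6 + 1/6) / (1/2) = 2/3, independent of R.
random.seed(0)
R = 5.0
trials = 200_000

count_b = 0    # occurrences of B = (X1 <= X2)
count_ab = 0   # occurrences of A and B = (X1 <= X3 and X1 <= X2)
for _ in range(trials):
    x1, x2, x3 = (random.uniform(0, R) for _ in range(3))
    if x1 <= x2:
        count_b += 1
        if x1 <= x3:
            count_ab += 1

estimate = count_ab / count_b
print(estimate)  # should be close to 2/3
```

The same estimate can also be read as a discretization of the formula in the question: averaging $P(x_1\le X_3)$ over draws of $X_1$ that satisfy $X_1\le X_2$ is exactly integrating $P(x_1\le X_3)$ against the conditional density $f_{X_1}(x_1 \mid X_1\le X_2)$.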