The problem is as follows: let $S_X = (X_1, \dots, X_n)$ and $S_Y = (Y_1, \dots, Y_n)$ be two iid samples whose probability distributions are compactly supported on $(0,1)$ and absolutely continuous w.r.t. the Lebesgue measure, with densities $f_X$ and $f_Y$.
We now order the pooled sample $S = (X_1, \dots, X_n, Y_1, \dots, Y_n)$ by increasing values, and we define $S_k$ as the number of observations coming from the first sample $S_X$ among the $k$ smallest values of $S$.
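For concreteness, here is a minimal sketch of this construction in NumPy (the Beta densities are only an illustrative choice of $f_X$ and $f_Y$, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10

# Two independent samples on (0, 1); the Beta parameters are an arbitrary
# illustrative choice, the question allows any densities on (0, 1).
X = rng.beta(2.0, 5.0, size=n)
Y = rng.beta(5.0, 2.0, size=n)

# Pool the samples, remember the origin of each value, and sort.
values = np.concatenate([X, Y])
from_X = np.concatenate([np.ones(n, dtype=bool), np.zeros(n, dtype=bool)])
order = np.argsort(values)

# S_k = number of values coming from S_X among the k smallest of S,
# so S[k-1] below is S_k for k = 1, ..., 2n.
S = np.cumsum(from_X[order])
print(S)
```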
Is it true that, for any integer $k$ less than $n$, the map $l \mapsto P(S_{k+1}=l+1 \mid S_k=l)$ is decreasing?
The result seems intuitive, but I have not been able to come up with a rigorous proof...
No, if you define the function $$ f_k(l) = P(S_{k+1}=l+1 \mid S_k = l), $$ I am asking whether $f_k(l) \geq f_k(l+1)$ for all $l$.
I have run many simulations, which seem to validate this conjecture. The extremal case is when $X$ and $Y$ share the same distribution: there, given $S_k = l$, each of the $2n-k$ remaining values is equally likely to be the $(k+1)$-th smallest, and $n-l$ of them come from $S_X$, so $$ f_k(l)=\frac{n-l}{2n-k}, $$ which is of course decreasing in $l$.
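For reference, here is a minimal sketch of the kind of simulation one can run (not my exact setup; the Beta densities and the helper name `estimate_f_k` are illustrative choices): it estimates $f_k(l)$ by Monte Carlo and checks empirically that it is non-increasing in $l$.

```python
import numpy as np

def estimate_f_k(k, n, n_sims=100_000, seed=0):
    """Monte Carlo estimate of f_k(l) = P(S_{k+1} = l+1 | S_k = l).

    Samples X ~ Beta(2, 5) and Y ~ Beta(5, 2) (illustrative choice only).
    Returns a dict mapping l to the estimated conditional probability.
    """
    rng = np.random.default_rng(seed)
    hits = np.zeros(n + 1)    # times S_{k+1} = l + 1 occurred when S_k = l
    counts = np.zeros(n + 1)  # times S_k = l occurred
    for _ in range(n_sims):
        X = rng.beta(2.0, 5.0, size=n)
        Y = rng.beta(5.0, 2.0, size=n)
        values = np.concatenate([X, Y])
        from_X = np.concatenate([np.ones(n, dtype=bool),
                                 np.zeros(n, dtype=bool)])
        S = np.cumsum(from_X[np.argsort(values)])  # S[j-1] = S_j
        l = S[k - 1]
        counts[l] += 1
        hits[l] += (S[k] == l + 1)
    return {l: hits[l] / counts[l] for l in range(n + 1) if counts[l] > 0}

# Example: estimate f_5 for n = 10 and check it is (empirically) non-increasing.
f = estimate_f_k(k=5, n=10)
print(f)
print(all(f[a] >= f[b] for a, b in zip(sorted(f), sorted(f)[1:])))
```

Since these are Monte Carlo estimates, the monotonicity check can occasionally fail from sampling noise at rarely visited values of $l$, but with enough simulations the pattern is consistent with the conjecture.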