Surprisingly (to me), the relation defined by $(A,B) \in \leq_q$ whenever $p(A \leq B) \geq q$ is not in general a partial order.
I would ideally like to know the weakest conditions sufficient for it to be a partial order. I don't mind whether the definitions use $\lt$ instead, or whether the conditions secure a strict partial order. As just one example: is it true when the underlying variables are jointly Gaussian with arbitrary covariance? The relation I have in mind is, of course, based on the (possibly dependent) joint distribution.
For a fairly trivial example that does induce a strict partial order, consider independent Gaussians $A, B, C$ with means $0, 1, 2$ and unit variances: then $p(A<B) > 0.5$, $p(B<C) > 0.5$ and $p(A<C) > 0.5$. Moreover, $p(A<C)$ exceeds both $p(A<B)$ and $p(B<C)$. The resulting relation is transitive, irreflexive, and antisymmetric, hence a strict partial order.
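The probabilities in this example can be computed exactly, since each pairwise difference is itself Gaussian (e.g. $B - A \sim N(1, 2)$). A minimal stdlib sketch:

```python
from math import erf, sqrt

def Phi(x):
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# A, B, C independent N(0,1), N(1,1), N(2,1).  Each difference is Gaussian:
# B - A ~ N(1, 2), so P(A < B) = P(B - A > 0) = Phi(1 / sqrt(2)), etc.
p_AB = Phi(1 / sqrt(2))  # ~ 0.760
p_BC = Phi(1 / sqrt(2))  # ~ 0.760
p_AC = Phi(2 / sqrt(2))  # ~ 0.921, larger than both, as claimed

print(p_AB, p_BC, p_AC)
```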
Here are some conditions under which we can form a poset in this way, although there may be a more general solution.
Let $\mathcal{F}$ be a set of random variables. For any $X, Y \in \mathcal{F}$ with $X \neq Y$, set $Z = Y - X$ and assume that
(i) $P(Z > E[Z]) = 0.5$, and
(ii) the cdf $f(x) = P(Z < x)$ is strictly increasing on $(E[Z] - \epsilon, E[Z] + \epsilon)$ for some $\epsilon > 0$.
Again setting $Z = Y - X$, we claim that $P(Z > 0) > 0.5$ if and only if $E[Z] > 0$. Suppose $E[Z] > 0$; then $P(Z > 0) = P(Z > E[Z]) + P(0 < Z \leq E[Z]) > P(Z > E[Z]) = 0.5$, since (ii) makes the middle term strictly positive and (i) gives the last equality. Conversely, if $E[Z] \leq 0$ then $P(Z > 0) \leq P(Z > E[Z]) = 0.5$ by (i).
Defining $X <_P Y$ if $P(X < Y) > 0.5$, it follows that $X <_P Y$ if and only if $E[X] < E[Y]$, so $<_P$ is a strict partial order on $\mathcal{F}$, isomorphic to the multiset $\{E[X] \mid X \in \mathcal{F}\}$ under the usual ordering of $\mathbb{R}$.
In particular, any set of Gaussian random variables, regardless of covariance, satisfies these criteria (provided no difference $Y - X$ is degenerate, i.e. has zero variance, since then (i) fails). It's possible that (ii) could be weakened or removed; I'm not sure how to proceed in that case.
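For jointly Gaussian $X, Y$ the claim can be checked in closed form: $Z = Y - X$ is Gaussian with mean $E[Y] - E[X]$ and variance $\operatorname{Var}(X) + \operatorname{Var}(Y) - 2\operatorname{Cov}(X,Y)$, so the sign of the mean gap alone decides $P(X < Y) \gtrless 0.5$. A sketch (the correlation value is just an illustrative choice):

```python
from math import erf, sqrt

def Phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def p_less_gaussian(mu_x, mu_y, var_x, var_y, cov_xy):
    """P(X < Y) for jointly Gaussian X, Y: Z = Y - X is Gaussian with
    mean mu_y - mu_x and variance var_x + var_y - 2*cov_xy (assumed > 0)."""
    var_z = var_x + var_y - 2.0 * cov_xy
    return Phi((mu_y - mu_x) / sqrt(var_z))

# Even strong positive correlation cannot flip the direction:
# P(X < Y) > 0.5 exactly when E[X] < E[Y].
print(p_less_gaussian(0.0, 0.1, 1.0, 1.0, 0.9))  # > 0.5
print(p_less_gaussian(0.1, 0.0, 1.0, 1.0, 0.9))  # < 0.5
```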
Regarding your other comment: if we define $X < Y$ to mean $P(X < Y) > p$ for some fixed $p > 0.5$, this forms a strict partial order under further restrictions, for example if the variables are independent Gaussians with common variance $\sigma^2$. In that case $Y - X \sim N(E[Y] - E[X],\, 2\sigma^2)$, so $P(X < Y) > p$ is equivalent to $E[Y] - E[X] > f(p)$, where $f(p)$ is defined by $P(N < f(p)) = p$ for $N \sim N(0, 2\sigma^2)$; transitivity follows since $f(p) > 0$ when $p > 0.5$. There are probably less restrictive hypotheses that would also work.
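The threshold $f(p)$ is just the $p$-quantile of $N(0, 2\sigma^2)$, which can be computed and sanity-checked with the stdlib (the values of $\sigma$ and $p$ below are illustrative choices, not from the discussion):

```python
from math import sqrt
from statistics import NormalDist

sigma = 1.0
p = 0.8  # illustrative choices

# With X, Y independent N(mu_X, sigma^2), N(mu_Y, sigma^2), the difference
# Y - X ~ N(mu_Y - mu_X, 2*sigma^2).  P(X < Y) > p is equivalent to
# mu_Y - mu_X > f(p), where f(p) is the p-quantile of N(0, 2*sigma^2).
f_p = NormalDist(0.0, sigma * sqrt(2)).inv_cdf(p)

def p_X_less_Y(gap):
    """P(X < Y) when E[Y] - E[X] = gap."""
    return 1.0 - NormalDist(gap, sigma * sqrt(2)).cdf(0.0)

print(f_p)                     # ~ 1.19 for p = 0.8
print(p_X_less_Y(f_p + 0.01))  # just above 0.8
print(p_X_less_Y(f_p - 0.01))  # just below 0.8
```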