Given two discrete random variables $X_1$ and $X_2,$ if we want to calculate $\mathbb{E}[\min(X_1,X_2)],$ it is useful to consider the identity $$\mathbb{E}(X) = \sum_{x=1}^\infty P(X\geq x)$$ as $$P(\min(X_1,X_2)\geq x) = P(X_1\geq x) P(X_2\geq x).$$
Question: Can we express $\mathbb{E}[\max(X_1,X_2)]$ in terms of $P(\max(X_1,X_2)\leq x)?$
Since we know that $$P(\max(X_1,X_2)\leq x) = P(X_1\leq x) P(X_2\leq x),$$ it might simplify calculations greatly.
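(As a minimal numerical sketch of the min identity, assuming purely for illustration that $X_1, X_2$ are independent geometric variables on $\{1,2,\dots\}$ with success probabilities $0.3$ and $0.5$, so that $P(X_i\geq x) = (1-p_i)^{x-1}$:)

```python
# Hypothetical example: X1, X2 independent geometric on {1, 2, ...}
# with success probabilities p1, p2, so P(Xi >= x) = (1 - pi)**(x - 1).
p1, p2 = 0.3, 0.5
q1, q2 = 1 - p1, 1 - p2

# Tail-sum formula: E[min(X1, X2)] = sum_x P(X1 >= x) * P(X2 >= x)
tail_sum = sum(q1 ** (x - 1) * q2 ** (x - 1) for x in range(1, 1000))

# Closed form: the min of independent geometrics is geometric with
# success probability 1 - q1*q2, hence mean 1 / (1 - q1*q2).
closed_form = 1 / (1 - q1 * q2)

assert abs(tail_sum - closed_form) < 1e-9
```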
I don't know how to answer your question in a way that's useful, but here are three different thoughts that may be relevant.
For shorthand, let $U = \min(X_1, X_2)$ and $V = \max(X_1, X_2)$.
First: As @SandeepSilwal pointed out,
$$E[V] = \sum_{v \ge 1} P(V \ge v) = \sum_{v \ge 1} (1 - P(V < v))$$
So that technically answers your question: by your product formula, $P(V < v) = P(V \le v-1) = P(X_1 \le v-1)\,P(X_2 \le v-1)$, so each term is directly computable. The "usefulness" is questionable, though. Note that $P(V < v) \to 1$ as $v \to \infty$, which is why we cannot split the last sum into $\sum_{v \ge 1} 1 - \sum_{v \ge 1} P(V < v)$: both pieces diverge, a kind of "$\infty - \infty$".
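(A hedged numerical check of this tail-sum formula, again assuming for illustration independent geometrics with success probabilities $0.3$ and $0.5$; the cross-check uses the closed form $1/p_1 + 1/p_2 - 1/(1 - q_1 q_2)$, which follows from the min/max sum identity:)

```python
# Hypothetical example: X1, X2 independent geometric on {1, 2, ...},
# so P(Xi <= v - 1) = 1 - (1 - pi)**(v - 1).
p1, p2 = 0.3, 0.5
q1, q2 = 1 - p1, 1 - p2

# E[V] = sum_{v >= 1} (1 - P(V < v)), with
# P(V < v) = P(X1 <= v-1) * P(X2 <= v-1).
e_max = sum(1 - (1 - q1 ** (v - 1)) * (1 - q2 ** (v - 1))
            for v in range(1, 2000))

# Cross-check: E[max] = E[X1] + E[X2] - E[min]
#            = 1/p1 + 1/p2 - 1/(1 - q1*q2) for independent geometrics.
e_max_closed = 1 / p1 + 1 / p2 - 1 / (1 - q1 * q2)
assert abs(e_max - e_max_closed) < 1e-9
```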
Second: If what you really want is "easy" calculation, and you consider the formula for $E[U]$ to be "easy", then this extra step should make $E[V]$ also easy:
$$X_1 + X_2 = U + V \implies E[V] = E[X_1] + E[X_2] - E[U]$$
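(The identity $X_1 + X_2 = U + V$ holds pointwise, so the expectation version follows by linearity; a quick Monte Carlo illustration, with two fair dice as an assumed toy example:)

```python
import random

random.seed(0)

# Assumed toy example: X1, X2 are two independent fair dice.
samples = [(random.randint(1, 6), random.randint(1, 6))
           for _ in range(100_000)]

# x1 + x2 = min + max holds for every outcome, not just in expectation.
for x1, x2 in samples:
    assert x1 + x2 == min(x1, x2) + max(x1, x2)

# Hence the sample means of max(X1, X2) and X1 + X2 - min(X1, X2)
# agree exactly, term by term.
e_v = sum(max(x1, x2) for x1, x2 in samples) / len(samples)
e_rhs = sum(x1 + x2 - min(x1, x2) for x1, x2 in samples) / len(samples)
assert e_v == e_rhs
```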
Third: You never mentioned this, but the formula $E[Z] = \sum_{z=1}^\infty P(Z \ge z)$ only works if $Z \in \mathbb{N} \cup \{0\}$, i.e. $Z$ is a non-negative integer. It does not work, e.g., if $Z$ can take negative integer values. This immediately implies that, if your original $X_1, X_2$ take only non-positive integer values, then the formula for the min applies to $-X_1, -X_2$, and hence also to $\max(X_1, X_2) = -\min(-X_1, -X_2)$:
$$E[V] = -E[\min(-X_1,-X_2)] = -\sum_{v \ge 1} P(\min(-X_1,-X_2) \ge v) \\ = -\sum_{v \ge 1} P(-X_1 \ge v)P(-X_2 \ge v) = -\sum_{v \ge 1} P(X_1 \le -v) P(X_2 \le -v)$$
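(A sketch of this non-positive case, assuming for illustration $X_i = -G_i$ with $G_1, G_2$ independent geometric on $\{1,2,\dots\}$, so that $P(X_i \le -v) = P(G_i \ge v) = (1-p_i)^{v-1}$:)

```python
# Hypothetical non-positive example: Xi = -Gi, Gi independent geometric
# on {1, 2, ...} with success probabilities p1, p2.
p1, p2 = 0.3, 0.5
q1, q2 = 1 - p1, 1 - p2

# Displayed formula: E[max(X1, X2)] = -sum_{v>=1} P(X1 <= -v) P(X2 <= -v),
# and P(Xi <= -v) = P(Gi >= v) = qi**(v - 1).
e_max = -sum(q1 ** (v - 1) * q2 ** (v - 1) for v in range(1, 1000))

# Direct check: max(-G1, -G2) = -min(G1, G2), and min of independent
# geometrics has mean 1 / (1 - q1*q2).
assert abs(e_max - (-1 / (1 - q1 * q2))) < 1e-9
```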