Integrals are defined in terms of series, so why is the treatment of definite integrals different from the treatment of series when the lower limit is greater than the upper limit?
For a definite integral whose lower limit $a$ is greater than its upper limit $b$, we generally accept that
$$\int_a^b f(x)\,dx := - \int_b^a f(x)\,dx, \qquad (1)$$
whereas for a sum whose lower limit $a$ is greater than its upper limit $b$, we generally consider it an empty sum equal to $0$:
$$\sum_{i=a}^b x_i := 0 \qquad (2)$$
Since integrals are defined in terms of series (AFAIK), where does this difference arise? Is it just a matter of convention? If so, what makes each choice useful?
My best guess is
Because we define definite integrals in terms of their antiderivative
$$\int_a^b f(x)\,dx = F(b) - F(a)$$
it is necessary to have the property
$$\int_a^b f(x)\,dx = -\int_b^a f(x)\,dx$$
so that other desirable properties, such as additivity over adjacent intervals, are satisfied.
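Indeed, with that definition the sign flip falls out immediately:
$$\int_a^b f(x)\,dx = F(b) - F(a) = -\bigl(F(a) - F(b)\bigr) = -\int_b^a f(x)\,dx.$$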
In contrast, series are not defined in terms of some antisummand, so it is not necessary to have an analogous property.
However, from the point of view that integrals are defined in terms of series, this attempt is not fully satisfactory.
I disagree with certain remarks and premises of your question, namely:
We don't define integrals in terms of antiderivatives. As per your initial characterization, we define them in terms of sums (Riemann, Darboux, etc.). However, the fundamental theorem of calculus links the two concepts together.
That is not the "general" convention (though it may be adopted in some particular situations). A more convincing and consistent convention that is analogous to that adopted for integrals is explained below. Moreover, as you will see, the fact that a sum of the form $\sum_{k=c+1}^{c}$ is assigned a value of zero (as per the question in the link you share, where $c=-1$) is a special case of this more consistent convention, and is directly analogous to integrals of the form $\int_c^c$ being assigned a value of zero.
Recall the property that for $a< b< c$,
$$\int_a^c f(x)dx=\int_a^b f(x)dx+\int_b^c f(x)dx \quad (1)\\ \implies \int_a^b f(x)dx=\int_a^c f(x)dx-\int_b^c f(x)dx.\quad (2)$$
Note that
$$\int_c^b f(x)dx:=-\int_b^c f(x)dx\quad (3)$$
is a convention. This convention is useful since $(3)$ allows $(2)$ to be written as
$$\int_a^b f(x)dx=\int_a^c f(x)dx+\int_c^b f(x)dx,$$
so the property $(1)$ is preserved regardless of the ordering of $a,b,c.$ Of course, the convention is also intuitive in terms of interpreting the integral as representing signed area.
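As a quick numerical sanity check (a sketch of my own; the midpoint-rule `integral` helper and the test values are not from the question), one can define the integral as a Riemann sum, build convention $(3)$ in for reversed limits, and confirm that additivity $(1)$ then holds for every ordering of $a,b,c$:

```python
# Sketch only: midpoint Riemann sum with the sign convention (3) built in.
from itertools import permutations

def integral(f, lo, hi, n=10_000):
    """Approximate the integral of f from lo to hi by the midpoint rule;
    reversed limits are handled by the sign flip of convention (3)."""
    if lo > hi:
        return -integral(f, hi, lo, n)
    h = (hi - lo) / n
    return h * sum(f(lo + (k + 0.5) * h) for k in range(n))

f = lambda x: x * x
# Additivity (1) holds (up to discretization error) for every ordering:
for a, b, c in permutations([-1.0, 0.5, 2.0]):
    lhs = integral(f, a, c)
    rhs = integral(f, a, b) + integral(f, b, c)
    assert abs(lhs - rhs) < 1e-6, (a, b, c)
```

If one instead assigned $0$ to reversed limits (the empty-sum style of convention), the assertion would fail whenever $b$ lies outside the interval between $a$ and $c$, which is exactly why the sign flip is the convention that preserves $(1)$.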
It's not hard to see that an analogous convention can be adopted for sums.
We know for integers $a< b< c,$
$$\sum_{k=a}^c x_k=\sum_{k=a}^b x_k+\sum_{k=b+1}^c x_k\quad (1')\\ \implies \sum_{k=a}^b x_k=\sum_{k=a}^c x_k-\sum_{k=b+1}^c x_k.\quad (2')$$
Adopting the convention
$$\sum_{k=c+1}^b x_k:=-\sum_{k=b+1}^c x_k\quad (3')$$
allows $(2')$ to be written as
$$\sum_{k=a}^b x_k=\sum_{k=a}^c x_k+\sum_{k=c+1}^b x_k,$$
thus preserving $(1')$ regardless of the ordering of $a,b,c.$
You can see that applying the convention $(3')$ to the case $b=c$ implies any sum of the form $\sum_{k={c+1}}^c$ is assigned the value zero. But this convention does not assign zero to all sums with an "upper bound" that is less than the "lower bound." The case where the upper bound is exactly one less than the lower bound is assigned zero for consistency.
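The convention $(3')$ is likewise easy to check numerically; the following is an illustrative sketch of my own, and the helper name `ssum` is not standard:

```python
# Sketch only: finite sums extended by convention (3').
def ssum(x, lo, hi):
    """Sum of x[k] for k = lo..hi; bounds with lo > hi + 1 flip the sign
    per (3'), and lo == hi + 1 gives the empty sum 0."""
    if lo > hi + 1:
        return -ssum(x, hi + 1, lo - 1)          # convention (3')
    return sum(x[k] for k in range(lo, hi + 1))  # empty when lo == hi + 1

x = {k: k * k for k in range(-5, 11)}  # arbitrary terms indexed by k
assert ssum(x, 3, 2) == 0              # the special case sum_{k=c+1}^{c} = 0
# Additivity (1') holds exactly for any ordering of the bounds:
for a, b, c in [(0, 3, 7), (0, 7, 3), (7, 0, 3), (5, 5, 2)]:
    assert ssum(x, a, c) == ssum(x, a, b) + ssum(x, b + 1, c)
```

Note that a sum such as $\sum_{k=7}^{3}$ comes out as $-\sum_{k=4}^{6}$ rather than $0$: under this convention only the case where the lower bound exceeds the upper bound by exactly one is an empty sum.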