Let $x = (x_1,\ldots,x_n)$ and $y = (y_1,\ldots, y_n)$ be two probability distributions. The Shannon entropy of $x$ is then $$S(x) = -\sum_i x_i\ln x_i.$$
The entropy is maximal when $x$ is the uniform distribution $\bot_n$, defined by $(\bot_n)_i=\frac{1}{n}$. Denote the depolarizing channel by $$\Lambda_t(x) = (1-t)x + t\bot_n.$$
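For concreteness, here is a minimal numerical sketch of these two definitions (the helper names `shannon_entropy` and `depolarize` are mine, not standard):

```python
import numpy as np

def shannon_entropy(p):
    """S(p) = -sum_i p_i ln p_i (natural log), with the convention 0 ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def depolarize(p, t):
    """Depolarizing channel: (1 - t) p + t * uniform distribution."""
    p = np.asarray(p, dtype=float)
    return (1 - t) * p + t / len(p)

x = np.array([0.7, 0.2, 0.1])
for t in (0.0, 0.5, 1.0):
    print(t, shannon_entropy(depolarize(x, t)))  # tends to ln(3) as t -> 1
```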
Intuitively it would make sense that $S(x)\geq S(y)$ implies $S(\Lambda_t(x))\geq S(\Lambda_t(y))$, but this is in general not true. For suppose it held for every pair; then for a pair $x$ and $y$ with $S(x) = S(y)$, applying the implication in both directions would give $S(\Lambda_t(x))= S(\Lambda_t(y))$ for all $t$. By [1] this forces $x=Py$ for some permutation $P$, which is in general not true for arbitrary $x$ and $y$ with the same entropy.
My question now is: under what conditions does $S(x)\geq S(y)$ imply $S(\Lambda_t(x))\geq S(\Lambda_t(y))$ for all $t\in[0,1]$? Specifically, is this true when we assume that $x_i\geq x_{i+1}$ and $y_i\geq y_{i+1}$ (both ordered from high to low) and $x_1 \geq y_1$?
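(Not part of the argument above, but a brute-force search along these lines could probe the conjectured condition numerically; all helper names below are mine, and it prints any violating pair it finds.)

```python
import numpy as np

rng = np.random.default_rng(0)

def S(p):
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def depolarize(p, t):
    return (1 - t) * p + t / len(p)

n = 4
for _ in range(100_000):
    # Two random distributions, each sorted from high to low.
    x = np.sort(rng.dirichlet(np.ones(n)))[::-1]
    y = np.sort(rng.dirichlet(np.ones(n)))[::-1]
    # Enforce the conjectured hypotheses: x_1 >= y_1 and S(x) >= S(y).
    if x[0] < y[0] or S(x) < S(y):
        continue
    for t in np.linspace(0.01, 0.99, 25):
        if S(depolarize(x, t)) < S(depolarize(y, t)) - 1e-12:
            print("violation at t =", t, ":", x, y)
```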
[1] K. He, J.-C. Hou, M. Li, A von Neumann entropy condition of unitary equivalence of quantum states, Appl. Math. Lett. 25 (2012) 1153–1156.
I don't know whether a general condition can be found, and I don't think it can be captured by a condition on the ordering alone.
I doubt this helps, but:
The distribution $\Lambda_t(x)$ corresponds to "mixing" the distribution $x$ with the uniform distribution $\bot_n$: let $T$ be an indicator variable with $P(T=1)=t$, and let $Z$ be drawn from $\bot_n$ when $T=1$ and from $x$ when $T=0$, so that $Z\sim\Lambda_t(x)$. Expanding the joint entropy $H(Z,T)$ both as $H(T)+H(Z|T)$ and as $H(Z)+H(T|Z)$, we can write
$$H(\Lambda_t(x))= (1-t) H(x) + t \log n + h(t) - H(T|\Lambda_t(x))$$ Hence
$$H(\Lambda_t(x))-H(\Lambda_t(y))= (1-t) (H(x)-H(y)) - H(T|\Lambda_t(x))+ H(T|\Lambda_t(y))$$
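As a sanity check, the first identity can be verified numerically by computing $H(T|\Lambda_t(x))$ via Bayes' rule (a sketch with my own helper names, entropies in bits):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits, with 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def h(t):
    """Binary entropy in bits."""
    return H([t, 1 - t])

x = np.array([0.6, 0.3, 0.1])
n, t = len(x), 0.3

z = (1 - t) * x + t / n                 # distribution of Z ~ Lambda_t(x)
post = (t / n) / z                      # P(T = 1 | Z = i) by Bayes' rule
H_T_given_Z = sum(zi * h(pi) for zi, pi in zip(z, post))

lhs = H(z)
rhs = (1 - t) * H(x) + t * np.log2(n) + h(t) - H_T_given_Z
print(lhs, rhs)  # should agree up to floating-point error
```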
The only thing we can say in general is $0 \le H(T|\cdot)\le h(t)\le 1$, where $h(t)$ is the binary entropy function (entropies in bits): conditioning cannot increase entropy, so $H(T|\cdot)\le H(T)=h(t)$. Then
$$H(\Lambda_t(x))-H(\Lambda_t(y)) \ge (1-t) (H(x)-H(y)) - h(t)\ge (1-t) (H(x)-H(y)) - 1$$
Hence a sufficient (not necessary) condition for $H(\Lambda_t(x))-H(\Lambda_t(y))\ge 0$ would be
$$ H(x)-H(y) \ge \frac{h(t)}{1-t}$$
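One can confirm numerically that this sufficient condition behaves as claimed; in the sketch below (helper names mine, entropies in bits) the assertion should never fire:

```python
import numpy as np

def H(p):
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def h(t):
    return H([t, 1 - t])

def depolarize(p, t):
    return (1 - t) * p + t / len(p)

rng = np.random.default_rng(1)
n = 5
for _ in range(10_000):
    x, y = rng.dirichlet(np.ones(n)), rng.dirichlet(np.ones(n))
    gap = H(x) - H(y)
    for t in np.linspace(0.01, 0.99, 50):
        if gap >= h(t) / (1 - t):  # the sufficient condition above
            assert H(depolarize(x, t)) >= H(depolarize(y, t)) - 1e-12
print("no violations found")
```

Note that $h(t)/(1-t)\to\infty$ as $t\to 1$, so for any fixed entropy gap the condition only certifies the inequality on an initial range of $t$.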