Consider two distinct probability distributions $P(X)$ and $Q(Y)$---defined on the same domain---with (Shannon) entropies $H(X)$ and $H(Y)$. I would like to prove (or disprove) that $$ H(X) \leq H(Y) \implies \sum_{x}\sum_{x'} P(x)P(x')|x-x'| \leq \sum_{y}\sum_{y'} Q(y)Q(y')|y-y'| $$
Any answer, comments, or directions would be appreciated.
It's not clearly stated what type of domain you assume, or what the distance measure is, so I'll pick the simple case of distributions on $\mathbb{R}$ with the usual distance to provide a counter-example to your hypothesis.
For $u<v$, define the probability distribution $P_{u,v}$ with point probabilities $1/2$ at $u$ and $v$: i.e., probability $1/2$ of being either $u$ or $v$. This has $1$ bit of entropy ($\ln 2$ in natural units), while the expected distance between two independent draws is $(v-u)/2$.
Let $X\sim P_{0,a}$ and $Y\sim P_{0,b}$ with $0<b<a$. Then $H(X)=H(Y)=1$ bit, so $H(X)\leq H(Y)$, yet the expected distances are $a/2 > b/2$, giving a counter-example.
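If it helps to see the numbers, here is a small sketch that checks this counter-example directly; the distributions are represented as `{value: probability}` dictionaries (the helper names are my own, not from any particular library):

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution {value: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def expected_distance(p):
    """E|X - X'| for two independent draws X, X' ~ p."""
    return sum(p[x] * p[y] * abs(x - y) for x in p for y in p)

a, b = 4.0, 1.0           # any 0 < b < a works
P = {0.0: 0.5, a: 0.5}    # X ~ P_{0,a}
Q = {0.0: 0.5, b: 0.5}    # Y ~ P_{0,b}

print(entropy_bits(P), entropy_bits(Q))          # both 1.0, so H(X) <= H(Y)
print(expected_distance(P), expected_distance(Q))  # a/2 = 2.0 > b/2 = 0.5
```

The entropies agree exactly while the expected distances differ by the factor $a/b$, which can be made arbitrarily large.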
You can modify this example in all sorts of ways, e.g. replace the point probabilities with short intervals, or add a background distribution to make the measure positive throughout.
Basically, there is no link between the distance measure $|x-x'|$ and the entropy, which depends only on local probabilities (point probabilities or probability densities).