Let $\left\|x \right\| = \sum_{i=1}^{n}\left|x^i\right|$ and $d\left(x\right)=\sum_{i=1}^{n}x^i\ln x^i$, where $x\in \mathbb{R}^n$, $x\ge 0$, and $\sum_{i=1}^{n}x^i=1$.
How can one prove that, for all such $x, x'$, $$\left| d\left(x\right)-d(x') \right|\leq \frac{1}{2}\left\|x-x' \right\|?$$
I would appreciate any information about this. I have searched hard but found nothing.
Thanks.
By the way, the above question is equivalent to the one below. Let $\left\|x \right\| = \sum_{i=1}^{n}\left|x^i\right|$ and $d\left(x\right)=\sum_{i=1}^{n}x^i\ln x^i$, where $x\in \left\{x\in\mathbb{R}^n \mid x\ge0,\ \sum_{i=1}^{n}x^i=1\right\}$. The question is how to prove $d_1\left(x\right)\ge \frac{1}{2}\left\|x-x_0\right\|$, where $x_0= \operatorname{argmin}_x d_1\left(x\right)$. In fact $x_0^i=\frac{1}{n}$, $i=1,2,\ldots,n$.
This does not seem like it can possibly be true: the Fannes–Audenaert bound can be saturated (see the example on that page showing that the bound is optimal), which provides pairs of states for which no finite Lipschitz constant is possible. Note that the binary entropy function $H(\{T,1-T\})$ has infinite derivative at $T=0$.
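The blow-up near the boundary of the simplex is easy to check numerically. A minimal sketch, using the definition $d(x)=\sum_i x^i\ln x^i$ from the question with the convention $0\ln 0=0$:

```python
import math

def d(x):
    # negative entropy: sum_i x_i * ln(x_i), with the convention 0 * ln 0 = 0
    return sum(xi * math.log(xi) for xi in x if xi > 0)

def l1(x, y):
    # the 1-norm ||x - y|| used in the question
    return sum(abs(a - b) for a, b in zip(x, y))

# Two distributions on {1, 2} approaching the vertex (0, 1) of the simplex.
eps = 1e-6
x, xp = (eps, 1 - eps), (0.0, 1.0)

# If the claimed bound held, this ratio could never exceed 1/2.
ratio = abs(d(x) - d(xp)) / l1(x, xp)
print(ratio)
```

As `eps` shrinks, the ratio grows like $\ln(1/\varepsilon)/2$ and so exceeds any fixed constant, confirming that no Lipschitz bound of this form can hold.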
dohmatob's answer seems to be flawed: I don't see a justification for the claim that $|\log(p_i)+1| \leq \log(n)+1$, since the probabilities $p_i$ can be arbitrarily close to zero, which makes $|\log(p_i)+1|$ diverge. This is essentially the key property that prevents the entropy from being Lipschitz continuous.
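To illustrate the point, a quick check with a hypothetical small probability shows that $|\log(p_i)+1|$ can exceed $\log(n)+1$:

```python
import math

n = 10
bound = math.log(n) + 1            # the claimed bound, about 3.30 for n = 10

p_small = 1e-5                     # a probability close to zero
term = abs(math.log(p_small) + 1)  # about 10.5, already above the bound
print(term > bound)  # True
```

Pushing `p_small` toward zero makes `term` arbitrarily large, so no bound depending only on $n$ can work.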