Consider the function $f:[0,1]\rightarrow \mathbb{R}$ given by
$$f(x)= x\log(x).$$
This function is not Lipschitz continuous at zero, but it appears to be Hölder continuous for every $\alpha\in(0,1)$, i.e. there is a constant $C_{\alpha}$ such that
$$\vert f(x)-f(y)\vert \le C_{\alpha} \vert x-y \vert^{\alpha},$$
but how can one show the Hölder continuity?
Please let me know if you have any questions or remarks.
The plan of attack is the same as for showing that $x^\alpha$ is $\alpha$-Hölder continuous, and it should be reminiscent of proofs using e.g. the mean value theorem and the like.
Note $-x\log x\ge 0$ for $x\in[0,1]$ (with the obvious redefinition $0\log 0:=0$). Let $0\le x_0<x\le 1$ and consider $g:[x_0,1]\to\mathbb R$ defined by $$g(x) = -x\log x + x_0 \log x_0 + (x-x_0)\log(x-x_0).$$ Then $$g'(x) = (-\log x - 1) - (-\log(x-x_0) - 1) = \log (x-x_0) - \log x\le 0,$$ which follows from the monotonicity of $\log$ and $0<x-x_0\le x$. Thus $g$ is decreasing, and since $g(x_0)=0$, we get $g\le 0$ on $[x_0,1]$, i.e. $$-x\log x + x_0\log x_0 \le -(x-x_0)\log(x-x_0).$$ This implies for any $0\le x<y\le e^{-1}$ (note $-t\log t$ is increasing on $[0,e^{-1}]$, so the numerator below is nonnegative, and $0<y-x\le e^{-1}$ gives $\log(y-x)\le -1<0$)
$$ \frac{(-y\log y) - (-x\log x)}{-(y-x)\log(y-x)} = \frac{|y\log y - x\log x|}{|y-x|\lvert\log \lvert y-x \rvert \rvert} \le 1.$$ Therefore, $$ \sup_{\substack{y\neq x\\x,y\in[0,e^{-1}]}} \frac{|y\log y - x\log x|}{|y-x|\lvert\log \lvert y-x \rvert \rvert} \le 1.$$ For points away from the origin, it is easy to check that the function is Lipschitz (in fact smooth). Therefore, it follows after some routine case checking (see end of answer) that for some $C>0$, $$ \sup_{\substack{y\neq x\\x,y\in[0,1]}} \frac{|y\log y - x\log x|}{|y-x|\max(\lvert\log \lvert y-x \rvert \rvert,1)} \le C.$$ (The maximum is just to avoid 'irrelevant' considerations when $x$ and $y$ are far apart i.e. $|x-y|\to 1$. A function satisfying this type of inequality is said to be 'Log-Lipschitz'.)
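As a quick numerical sanity check of the bound on $[0,e^{-1}]$ (a sketch; the grid size is an arbitrary choice, not part of the argument):

```python
import math

def f(x):
    """x*log(x) on [0,1], extended by continuity with f(0) = 0."""
    return 0.0 if x == 0 else x * math.log(x)

# Sample the quotient |f(y)-f(x)| / (|y-x| * |log|y-x||) over 0 <= x < y <= 1/e.
n = 300
pts = [i * math.exp(-1) / n for i in range(n + 1)]
sup = 0.0
for i, x in enumerate(pts):
    for y in pts[i + 1:]:
        r = y - x  # 0 < r <= 1/e, so |log r| >= 1 and there is no division by zero
        q = abs(f(y) - f(x)) / (r * abs(math.log(r)))
        sup = max(sup, q)

print(sup)  # the supremum 1 is attained along x = 0, where the quotient equals 1
```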
This is in fact stronger than being $\alpha$-Hölder continuous: since $ |r|\max(|\log |r||,1)\le C_\alpha |r|^\alpha$ for any $0<\alpha<1$ and $0<r\le 1$, we immediately have $$ \sup_{\substack{y\neq x\\x,y\in[0,1]}} \frac{|y\log y - x\log x|}{|y-x|^\alpha} \le C_\alpha \sup_{\substack{y\neq x\\x,y\in[0,1]}} \frac{|y\log y - x\log x|}{|y-x|\max(\lvert\log \lvert y-x \rvert \rvert,1)} \le CC_\alpha. $$ Indeed, setting $r=e^{-t}$ for $t\ge 0$, the claimed inequality is the statement $\max(t,1)e^{-t} \le C_\alpha e^{-\alpha t} \iff \max(t,1) \le C_\alpha e^{(1-\alpha)t}$, so it holds with $C_\alpha = \frac1{1-\alpha}$.
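The inequality $\max(t,1)\le \frac1{1-\alpha}e^{(1-\alpha)t}$ is easy to test numerically (a sketch; the sampled values of $\alpha$ and the range of $t$ are arbitrary choices):

```python
import math

# Verify max(t, 1) <= exp((1-alpha)*t) / (1-alpha) on a grid of t >= 0.
for alpha in (0.1, 0.5, 0.9, 0.99):
    C_alpha = 1.0 / (1.0 - alpha)
    for k in range(2001):
        t = 0.01 * k  # t in [0, 20]
        assert max(t, 1.0) <= C_alpha * math.exp((1.0 - alpha) * t)
print("ok")
```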
(Routine case checks) For any $0<\epsilon<e^{-1}$, the function $x\log x$ is Lipschitz on $[\epsilon,1]$ with constant $\sup_{x\in [\epsilon,1]}| \log x +1|=\max(1,-\log\epsilon-1)=:C_1.$ (The $1$ appears because the function has Lipschitz constant $1$ on $[e^{-1},1]$.) Since we already have control for the region $0\le x<y\le e^{-1}$, this gives the required control for all $0\le x<y\le 1$ with $|x-y|<e^{-1}-\epsilon=:C_2.$ For the remaining regions of $(x,y)\in[0,1]^2$, it suffices to obtain control where $|x-y|\ge C_2$, and there we can crudely bound $$\frac{|x\log x-y\log y|}{|x-y|\max(\lvert\log\lvert x-y\rvert\rvert,1)}\le \frac{\max_{x\in[0,1]}|x\log x|}{|x-y| }\le\frac{e^{-1}}{C_2}=\frac1{eC_2}.$$ Thus, for some fixed sufficiently small $\epsilon$, we can take $$C=\max\Big(1,C_1,\frac1{eC_2}\Big)=\max\Big(1,-\log\epsilon-1,\frac1{1-e\epsilon}\Big).$$
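The two competing constants $-\log\epsilon-1$ and $\frac1{1-e\epsilon}$ can be balanced by a simple scan (a sketch; the grid resolution is an arbitrary choice):

```python
import math

# C(eps) = max(1, -log(eps) - 1, 1/(1 - e*eps)), valid for 0 < eps < 1/e.
def C(eps):
    return max(1.0, -math.log(eps) - 1.0, 1.0 / (1.0 - math.e * eps))

best = min(C(k / 10000) for k in range(1, 3679))  # scan eps over (0, 1/e)
print(round(C(math.exp(-2)), 3), round(best, 3))  # ~1.582 and ~1.35
```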
Setting $\epsilon=e^{-2}$ gives $C= 1/(1-e^{-1})\approx 1.58$. Numerically, the optimal choice of $\epsilon$ gives $C\approx 1.35$, but $C=1$ appears to hold even on the whole interval $[0,1]$.
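Finally, a brute-force search over the whole square supports the guess $C=1$ (a sketch; the grid is coarse, so this is evidence rather than a proof):

```python
import math

def f(x):
    """x*log(x) on [0,1], extended by continuity with f(0) = 0."""
    return 0.0 if x == 0 else x * math.log(x)

def quotient(x, y):
    """Log-Lipschitz quotient for a pair x < y in [0,1]."""
    r = y - x
    return abs(f(y) - f(x)) / (r * max(abs(math.log(r)), 1.0))

n = 400
pts = [i / n for i in range(n + 1)]
sup = max(quotient(x, y) for i, x in enumerate(pts) for y in pts[i + 1:])
print(round(sup, 6))  # ~1.0 on this grid, attained along x = 0
```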