For $a,b\in [0,1]$ and $\epsilon\geq 0$, does the following inequality hold?
$a^{1+\epsilon}+b^{1+\epsilon}\geq |a-b|^{1+\epsilon}$
All I can think to do so far is: \begin{align*} |a-b|^{1+\epsilon} = ((a-b)^2)^{(\epsilon+1)/2} \end{align*}
I want to know whether I can make use of this. I know such a result holds for $\epsilon=1$, i.e. $a^2+b^2\geq (a-b)^2$.
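(The $\epsilon=1$ case follows by expanding the square:

$$a^2 + b^2 - (a-b)^2 = 2ab \ge 0 \quad \text{for } a,b \ge 0.)$$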
Without loss of generality consider $a\ge b \ge 0$, then
$$a^{1+\epsilon} + b^{1+\epsilon} \ge a^{1+\epsilon} \ge \lvert a-b\rvert^{1+\epsilon}.$$
Here the second inequality uses the fact that $x\mapsto x^{1+\epsilon}$ is increasing for $x\ge 0$:
$$\begin{align*} b &\ge 0 &&\implies & b^{1+\epsilon} &\ge 0\\ a &\ge a-b \ge 0 &&\implies & a^{1+\epsilon} &\ge (a-b)^{1+\epsilon} \end{align*}$$
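As a sanity check (not a proof), one can verify the inequality numerically over a grid of sample points in $[0,1]^2$ and several values of $\epsilon$; the grid spacing and the tolerance below are arbitrary choices:

```python
import itertools

def holds(a, b, eps, tol=1e-12):
    # Check a^(1+eps) + b^(1+eps) >= |a-b|^(1+eps), up to floating-point tolerance.
    return a**(1 + eps) + b**(1 + eps) >= abs(a - b)**(1 + eps) - tol

# Sample a, b on a uniform grid in [0,1] and try a few epsilon values.
samples = [i / 20 for i in range(21)]
epsilons = [0.0, 0.5, 1.0, 2.0, 5.0]

violations = [(a, b, e)
              for a, b in itertools.product(samples, samples)
              for e in epsilons
              if not holds(a, b, e)]
print(len(violations))  # → 0
```

Every grid point satisfies the inequality, consistent with the proof above.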