Denote $\lVert \cdot\rVert=\lVert \cdot\rVert_\infty$. Suppose $f: (\mathbb{R}^d,\lVert \cdot\rVert)\to(\mathbb{R},\lvert \cdot\rvert)$ is bounded and absolutely integrable. Assume there exist constants $\Delta,L<\infty$ such that:
(a) $f(u)=0$ for $\lVert u\rVert>L$ (compact support) and $\forall u,u'\in \mathbb{R}^d$, we have $\lvert f(u)-f(u')\rvert\leq \Delta \lVert u-u'\rVert$ (Lipschitz)
or
(b) $f$ is differentiable with $\lVert \nabla f(u)\rVert\leq\Delta$ (bounded gradient), and for some $v>1$, $\lVert \nabla f(u)\rVert\leq\Delta\lVert u\rVert^{-v}$ for $\lVert u\rVert>L$ (the gradient decays as $\lVert u\rVert\to\infty$).
Then for any $x_1,x_2$ with $\lVert x_1-x_2\rVert\leq \delta\leq L$, $$\lvert f(x_1)-f(x_2)\rvert\leq \delta \Delta I(\lVert x_1\rVert\leq 2L)\text{, if (a) holds};$$ and $$\lvert f(x_1)-f(x_2)\rvert\leq \delta \Delta [I(\lVert x_1\rVert\leq 2L)+ \lVert x_1-L\rVert^{-v} I(\lVert x_1\rVert> 2L)]\text{, if (b) holds}.$$
My attempt
If (a) holds, then for any $0<\delta\leq L$, $\lVert x_1-x_2\rVert\leq \delta$ implies \begin{align} \lvert f(x_1)-f(x_2)\rvert&\leq \Delta \lVert x_1-x_2\rVert \\ &\leq \delta \Delta I(\lVert x_1\rVert\leq 2L), \end{align} since if $x_1$ lies outside the closed ball $B_{2L}(0)$ of center $0$ and radius $2L$, then $\lVert x_2\rVert\geq \lVert x_1\rVert-\delta>2L-L=L$, so both points lie outside $B_L(0)$ and $f(x_1)=f(x_2)=0$.
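As a quick numerical sanity check of case (a) (not part of the proof), one can pick a concrete compactly supported Lipschitz function — here the hat function $f(u)=\max(0,\Delta(L-\lVert u\rVert_\infty))$, a choice of mine purely for illustration — and verify the claimed bound on random pairs:

```python
import random

# Illustrative parameters; any 0 < delta <= L works
L, Delta, d, delta = 1.0, 1.0, 3, 0.5

def norm_inf(u):
    return max(abs(c) for c in u)

def f(u):
    # Hat function: Lipschitz constant Delta, supported on ||u||_inf <= L
    return max(0.0, Delta * (L - norm_inf(u)))

random.seed(0)
violations = 0
for _ in range(10_000):
    x1 = [random.uniform(-3.0, 3.0) for _ in range(d)]
    # Perturb each coordinate by at most delta, so ||x1 - x2||_inf <= delta
    x2 = [c + random.uniform(-delta, delta) for c in x1]
    indicator = 1.0 if norm_inf(x1) <= 2 * L else 0.0
    if abs(f(x1) - f(x2)) > delta * Delta * indicator + 1e-12:
        violations += 1
print(violations)  # expect 0
```

No violations occur: when $\lVert x_1\rVert>2L$ the bound is $0$, but then both points fall outside the support and $f$ vanishes at both, exactly as argued above.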
How can I obtain the result for (b)? I suspect the Mean Value Theorem is the right tool.
Update
To justify the term $\lVert x_1-L\rVert^{-v} I(\lVert x_1\rVert> 2L)$ under (b): by the Mean Value Theorem, for some $z=(1-c)x_1+cx_2$ with $c\in(0,1)$, $$\lvert f(x_1)-f(x_2)\rvert\leq \lVert\nabla f(z)\rVert\, \lVert x_1-x_2\rVert.$$ If $\lVert x_1\rVert> 2L$, then $\lVert z\rVert\geq \lVert x_1\rVert-c\lVert x_1-x_2\rVert>2L-L=L$, so the decay bound in (b) applies and $$\lvert f(x_1)-f(x_2)\rvert\leq \Delta \lVert z\rVert^{-v} \delta.$$ I'm struggling to show $\lVert x_1-L \rVert \leq \lVert z \rVert$ to conclude. For $d=1$ it is clear, but not so clear in higher dimensions.
If you replace $\|x_{1} - L\|$ by $\|x_{1}\| - L$, it works. Following the argument in your update, we have to prove: given $z = (1-c)x_{1} + cx_{2}$ with $c \in (0, 1)$, then $\|x_{1}\| - L \leq \|z\|$. Note that $$ \|x_{1} - z\| = c\|x_{1} - x_{2}\| \leq L,$$ so $$ - L \leq - \|x_{1} - z\|,$$ and then, by the triangle inequality, $$\|x_{1}\| - L \leq \|x_{1}\| - \|x_{1} - z\| \leq \|x_{1} - (x_{1} - z)\| = \|z\|. $$ The expression $\|x_{1} - L\|$ as stated is confusing, since $x_{1}\in\mathbb{R}^d$ while $L$ is a scalar. I hope this helps.
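The corrected inequality $\|x_1\| - L \leq \|z\|$ is easy to spot-check numerically; the sampling scheme below is my own choice for illustration:

```python
import random

def norm_inf(u):
    return max(abs(c) for c in u)

random.seed(1)
L, d = 1.0, 3
ok = True
for _ in range(10_000):
    x1 = [random.uniform(-5.0, 5.0) for _ in range(d)]
    # Perturb each coordinate by at most L, so ||x1 - x2||_inf <= L,
    # matching the hypothesis delta <= L
    x2 = [c + random.uniform(-L, L) for c in x1]
    t = random.uniform(0.0, 1.0)  # the weight c from the Mean Value Theorem
    z = [(1 - t) * a + t * b for a, b in zip(x1, x2)]
    if norm_inf(x1) - L > norm_inf(z) + 1e-12:
        ok = False
print(ok)  # expect True
```

Every trial satisfies the inequality, as the triangle-inequality argument above guarantees.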