Let $x, y$ be vectors in $\mathbb{R}^n$, and let $\textrm{softmax}: \mathbb{R}^n \to \mathbb{R}^n$ be the function defined by $$\textrm{softmax}(x)_i = \frac{e^{x_i}}{\sum^n_{j=1} e^{x_j}}.$$
What is the smallest constant $L$ such that $$\forall x, y: \|\textrm{softmax} (x) - \textrm{softmax}(y)\|_{\infty} \leq L \|x - y \|_1?$$
I have tried to invert the softmax function, but I am not sure what to do with the result, or even whether inverting it is useful here.
I would be grateful for any help with this problem.
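To get a feel for the answer, here is a quick numerical probe (my own NumPy sketch, not part of the question): it samples random nearby pairs $x, y$ and tracks the largest observed ratio $\|\textrm{softmax}(x) - \textrm{softmax}(y)\|_{\infty} / \|x - y\|_1$.

```python
import numpy as np

def softmax(x):
    # subtract the max for numerical stability; softmax is shift-invariant
    z = np.exp(x - x.max())
    return z / z.sum()

rng = np.random.default_rng(0)
best = 0.0
for _ in range(100_000):
    n = int(rng.integers(2, 6))
    x = rng.normal(size=n)
    y = x + 1e-3 * rng.normal(size=n)   # nearby pairs probe the local slope
    ratio = np.abs(softmax(x) - softmax(y)).max() / np.abs(x - y).sum()
    best = max(best, ratio)

print(best)
```

In my runs the sampled ratios seem to stay below $1/4$, and they get close to it when two coordinates of $x$ are nearly equal, which at least suggests where to look for the extremal case.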
Edit: There is the property that
$$\forall c \in \mathbb{R}: \textrm{softmax}(x + c) = \textrm{softmax}(x),$$
where $x + c = (x_1 + c, \dots, x_n + c)$.
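For concreteness, this invariance is easy to verify numerically (a throwaway NumPy check, names mine):

```python
import numpy as np

def softmax(x):
    z = np.exp(x)
    return z / z.sum()

x = np.array([0.3, -1.2, 2.0])
# e^{x_i + c} = e^c e^{x_i}, and the common factor e^c cancels in the ratio
print(np.allclose(softmax(x), softmax(x + 5.0)))  # True
```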
Then it suffices to find $L$ such that $$\|\textrm{softmax}(x) - \textrm{softmax}(y)\|_{\infty} \leq L \|(x + c_x) - (y + c_y) \|_1 = L \|x - y + c\|_1$$ for all $x, y \in \mathbb{R}^n$ and $c \in \mathbb{R}$, where $c = c_x - c_y$.
The right-hand side is minimal when $c = -\textrm{median} \left(x_1 - y_1, \dots, x_n - y_n \right)$.
I am not sure, but perhaps this reformulation makes the derivation more convenient?
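On the last point, here is a quick brute-force check (my own sketch) of the minimizer of $c \mapsto \|d + c\|_1$, writing $d_i = x_i - y_i$; note the sign of the median in the result:

```python
import numpy as np

rng = np.random.default_rng(1)
d = rng.normal(size=7)            # stands in for x - y (odd n, unique minimizer)

# brute-force minimize ||d + c||_1 over a fine grid of candidate shifts c
grid = np.linspace(d.min() - 1.0, d.max() + 1.0, 200_001)
c_star = grid[np.argmin(np.abs(d[:, None] + grid[None, :]).sum(axis=0))]

print(c_star, -np.median(d))      # the two values agree up to grid spacing
```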