Let $n$ be a large positive integer and define $\Delta:\mathbb R^n \to \mathbb R$ by $\Delta(x) := \max_{i,j} (x_j - x_i)$. Let $X_1,\ldots,X_n$ be iid $N(0,1)$ random variables and set $X := (X_1,\ldots,X_n)$.
Question. What is a good concentration inequality for $\Delta(X)$?
Observation
Let $x'$ and $x$ be vectors in $\mathbb R^n$ which differ only in their $k$th coordinate. Then
$$ \begin{split} |\Delta(x') - \Delta(x)| &= \left|\max_{i,j} (x'_j-x'_i) - \max_{i,j} (x_j-x_i)\right| \le \max_{i,j} \left|(x'_j-x'_i)-(x_j-x_i)\right|\\ &= \max_{i,j}|(x'_j-x_j)-(x'_i-x_i)|, \quad\text{where}\quad |(x'_j-x_j)-(x'_i-x_i)| = \begin{cases}0, &\mbox{ if }k \not\in\{i,j\},\\ |x'_i-x_i|,&\mbox{ if }i = k \ne j,\\ |x'_j-x_j|,&\mbox{ if }j = k \ne i.\end{cases} \end{split} $$ Therefore $|\Delta(x') - \Delta(x)| \le |x'_k-x_k|$, so $\Delta$ is $1$-Lipschitz in every coordinate, and it follows from Corollary 2.1 of this monograph that
$$ \mathbb P(|\Delta(X) - \mathbb E[\Delta(X)]| \ge t) \le 2e^{-t^2/(2n)}. $$
The concentration radius here is of order $\sqrt n$; I hope for an inequality with a better dependence on the dimension $n$.
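As an empirical sanity check (a minimal Monte Carlo sketch; the helper `range_std`, the sample sizes, and the choice of seed are my own, not from the question), one can estimate the spread of $\Delta(X)$ for growing $n$. The fluctuations appear to stay of constant order, far smaller than the $\sqrt n$ radius above suggests:

```python
import numpy as np

rng = np.random.default_rng(0)

def range_std(n, n_trials=2_000):
    """Monte Carlo estimate of the standard deviation of Delta(X) = max - min
    for n iid N(0,1) coordinates, using n_trials independent samples."""
    x = rng.standard_normal((n_trials, n))
    delta = x.max(axis=1) - x.min(axis=1)
    return delta.std()

for n in [10, 100, 1_000, 10_000]:
    # The estimated spread stays bounded (in fact slowly decreasing) in n.
    print(n, range_std(n))
```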
The functions $v \mapsto \max_i v_i$ and $v \mapsto \min_i v_i$ are $1$-Lipschitz with respect to the Euclidean metric. Indeed, take any $x,y \in \mathbb{R}^n$. If $\max_i x_i \ge \max_j y_j$, then $$ \left| \max_i x_i - \max_j y_j\right| = \max_i x_i - \max_j y_j \le \max_i (x_i - y_i) \le \max_i |x_i - y_i| \le \|x-y\|,$$ and the same argument with the roles of $x$ and $y$ exchanged handles the case $\max_j y_j > \max_i x_i$. The claim follows for $\min$ because $\min_i v_i = - \max_i (-v_i)$.
The upshot of this is that $\Delta(x) = \max_{i,j} (x_j - x_i) = \max_j x_j - \min_i x_i$ is $2$-Lipschitz with respect to the Euclidean metric: $$|\Delta(x) - \Delta(y)| \le \left|\max_j x_j - \max_j y_j\right| + \left|\min_i x_i - \min_i y_i\right| \le 2\|x-y\|.$$ Now using Theorem 2.2 from the notes linked to in the question gives a concentration radius independent of $n$; in the standard form of Gaussian concentration for Lipschitz functions, $$ \mathbb P(|\Delta(X) - \mathbb E[\Delta(X)]| \ge t) \le 2e^{-t^2/8}. $$
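To illustrate the dimension-free radius, here is a minimal simulation sketch. The comparison bound $2e^{-t^2/8}$ is the standard Gaussian concentration inequality for a $2$-Lipschitz function (my own choice of explicit form, not quoted from the linked notes), and the sample sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample Delta(X) = max - min for n iid N(0,1) coordinates, many times.
n, n_trials = 1_000, 20_000
x = rng.standard_normal((n_trials, n))
delta = x.max(axis=1) - x.min(axis=1)
mean = delta.mean()

# Compare empirical tails of Delta(X) around its mean with the
# dimension-free bound 2*exp(-t^2/8) for a 2-Lipschitz function.
for t in [0.5, 1.0, 2.0]:
    emp = np.mean(np.abs(delta - mean) >= t)
    bound = 2 * np.exp(-t**2 / 8)
    print(f"t={t}: empirical tail {emp:.4f} <= bound {bound:.4f}")
```

Repeating this with larger $n$ leaves the empirical tails essentially unchanged, consistent with a radius independent of the dimension.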