I can prove that the function $$f(\tau, x_1, x_2, \ldots, x_N) = -\tau \log \frac{1}{N} \sum_{i=1}^{N} \exp{\left(-\frac{x_i}{\tau}\right)} $$ converges to $\min(x_1, x_2, \ldots, x_N)$ for $x_i \geq 0$ as $\tau \to 0^+$ using L'Hospital's rule (substituting $\tau=\frac{1}{\rho}$ and taking the limit $\rho \to +\infty$).
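As a quick numeric sanity check of this limit (a Python sketch; `smooth_min` and the sample values are my own, and I evaluate $f$ in the equivalent shifted form $f = z - \tau\log\frac{1}{N}\sum_i e^{-(x_i - z)/\tau}$ with $z=\min_i x_i$, so the exponentials do not underflow for small $\tau$):

```python
import math

def smooth_min(tau, xs):
    # f(tau, x) = -tau * log( (1/N) * sum_i exp(-x_i / tau) ),
    # computed in the equivalent shifted form
    #   f = z - tau * log( (1/N) * sum_i exp(-(x_i - z) / tau) ),  z = min(xs),
    # so that no exponential underflows when tau is small.
    z = min(xs)
    s = sum(math.exp(-(x - z) / tau) for x in xs)
    return z - tau * math.log(s / len(xs))

xs = [3.0, 1.0, 4.0, 1.5]
for tau in (1.0, 0.1, 0.01, 0.001):
    print(tau, smooth_min(tau, xs))  # approaches min(xs) = 1.0 as tau -> 0+
```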
However, I also need to find an upper bound on the approximation error: $$\left| f(\tau, x_1, x_2, \ldots, x_N) - z \right| \leq h(\tau, x_1, x_2, \ldots, x_N)$$ where $z=\min(x_1, x_2, \ldots, x_N)$.
Using Jensen's inequality and the fact that $-\log(y)$ is convex, I can show that \begin{equation} \begin{split} \left| f(\tau, x_1, x_2, \ldots, x_N) - z \right| &= -\tau \log \frac{1}{N} \sum_{i=1}^{N} \exp \left( -\frac{x_i - z}{\tau} \right) \leq \\ &\leq -\frac{\tau}{N} \sum_{i=1}^{N} \log \exp \left( -\frac{x_i - z}{\tau} \right) = \bar{x} - z, \end{split} \end{equation} where $\bar{x}=\frac{1}{N}\sum_i x_i$. However, this bound does not depend on the parameter $\tau$.
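The bound $|f - z| \leq \bar{x} - z$ does hold numerically (a Python sketch; `smooth_min` is a helper name of my own, evaluated in a shifted log-sum-exp form for stability):

```python
import math
import random

def smooth_min(tau, xs):
    # f evaluated in the shifted (underflow-safe) form
    #   f = z - tau * log( (1/N) * sum_i exp(-(x_i - z) / tau) ),  z = min(xs)
    z = min(xs)
    s = sum(math.exp(-(x - z) / tau) for x in xs)
    return z - tau * math.log(s / len(xs))

random.seed(0)
for _ in range(1000):
    xs = [random.uniform(0.0, 10.0) for _ in range(5)]
    tau = random.uniform(0.01, 5.0)
    z, xbar = min(xs), sum(xs) / len(xs)
    err = abs(smooth_min(tau, xs) - z)
    assert err <= xbar - z + 1e-12, (tau, xs)
print("Jensen bound |f - z| <= xbar - z held in all 1000 trials")
```

(Note that $f \geq z$ always, since $\frac{1}{N}\sum_i e^{-(x_i-z)/\tau} \leq 1$, so the absolute value is just $f - z$.)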
Is there a sharper upper bound, one that depends on $\tau$?
We may rearrange the $x_i$ so that they are sorted $x_1 \leq x_2 \leq \cdots \leq x_N$. Then \begin{align*} f(\tau, x_1, \dots, x_N) &= -\tau \ln \left( \frac{1}{N}\sum_{i=1}^N \mathrm{e}^{-x_i/\tau} \right) \\ &= -\tau \left( \ln\left( \frac{1}{N}\mathrm{e}^{-x_1/\tau} \right) + \ln \left( 1 + \sum_{i=2}^N \mathrm{e}^{(x_1-x_i)/\tau} \right) \right) \\ &= -\tau \left( -\ln(N) - \frac{x_1}{\tau} + \ln \left( 1 + \sum_{i=2}^N \mathrm{e}^{(x_1-x_i)/\tau} \right) \right) \\ &= \tau \ln(N) + x_1 - \tau \ln \left( 1 + \sum_{i=2}^N \mathrm{e}^{(x_1-x_i)/\tau} \right) \text{.} \end{align*} Therefore, $$ |f(\tau, x_1, \dots, x_N) - x_1| = \left| \tau \ln(N) - \tau \ln \left( 1 + \sum_{i=2}^N \mathrm{e}^{(x_1-x_i)/\tau} \right) \right| \text{.} $$ For $\tau$ sufficiently small (and assuming the minimum is unique, i.e. $x_1 < x_2$, so that every exponent $(x_1 - x_i)/\tau$ is large and negative), the sum in the parentheses is $\varepsilon \ll 1$, so \begin{align*} |f(\tau, x_1, \dots, x_N) - x_1| &= \left| \tau \ln(N) - \tau \ln \left( 1 + \varepsilon \right) \right| \\ &= \tau \left| \ln(N) - \left(\varepsilon + O(\varepsilon^2) \right) \right| \\ &\approx \tau \ln N \text{.} \end{align*}
Visual inspection of graphs of $f$ versus $\tau$, for several choices of $N$ and $x_i$, confirms that this captures the behaviour near $\tau = 0$.
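A direct numeric check agrees (a Python sketch; the particular $x_i$ are arbitrary, chosen with a unique minimum, and `smooth_min` is my own helper in a shifted, underflow-safe form):

```python
import math

def smooth_min(tau, xs):
    # f in the shifted (underflow-safe) log-sum-exp form:
    #   f = z - tau * log( (1/N) * sum_i exp(-(x_i - z) / tau) ),  z = min(xs)
    z = min(xs)
    s = sum(math.exp(-(x - z) / tau) for x in xs)
    return z - tau * math.log(s / len(xs))

xs = [1.0, 2.0, 3.5, 7.0]  # distinct values; x_1 = 1.0 is the unique minimum
N = len(xs)
for tau in (0.5, 0.1, 0.02):
    err = abs(smooth_min(tau, xs) - min(xs))
    ratio = err / (tau * math.log(N))
    print(f"tau={tau}: |f - x_1| = {err:.6f}, err / (tau ln N) = {ratio:.4f}")
# the ratio err / (tau ln N) tends to 1 as tau -> 0+
```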