This post appears to answer my question. However, it seems that I misunderstand either the answer or the problem it solves. (I try to keep the notation consistent with the post I am referring to.)
Suppose I have two polytopes in $\mathbb{R}^n$: $X=\{x\mid Ax\leq b\}$ and a slightly enlarged one, inflated by $\delta$, $X_\delta=\{x\mid Ax\leq b + \delta u\}$, where $u$ is the vector of ones and $A\in\mathbb{R}^{m\times n}$.
Then, the result bounds the Hausdorff distance by: $$\frac{\delta\sqrt{n}}{\sigma_{\max}(A)} \leq d_H(X,X_\delta)\leq \frac{\delta\sqrt{m}}{\sigma_{\min}(A)}.$$
However, this does not work for my example with: $$ A = \begin{bmatrix}1 & 0 \\ -0.01 & 1\\ -0.01 &-1\end{bmatrix},\ b = \begin{bmatrix}1\\1\\1\end{bmatrix}.$$ Here $\sigma_{\min}(A)\approx 1.0001$, so the upper bound is $\frac{\delta\sqrt{3}}{1.0001}$, and the vertices of $X$ are $\{(1,1.01),\,(1,-1.01),\,(-100,0)\}$. For $X_\delta$ with $\delta = 0.1$, the vertices are approximately $\{(1.1, 1.111),\,(1.1,-1.111),\,(-110,0)\}$. Clearly, the vertices $(-100,0)$ and $(-110,0)$ are separated by a distance of $10$, far more than the $\frac{0.1\sqrt{3}}{1.0001}\approx 0.173$ that the upper bound would suggest.
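For what it's worth, here is a quick numerical sanity check of the numbers above (a sketch using NumPy; the brute-force vertex enumeration over pairs of active constraints is my own, not taken from the referenced post):

```python
import itertools
import numpy as np

A = np.array([[1.0, 0.0],
              [-0.01, 1.0],
              [-0.01, -1.0]])
b = np.ones(3)
delta = 0.1

# Singular values of A, as used in the quoted bound.
sigma = np.linalg.svd(A, compute_uv=False)
print("sigma_min(A) =", sigma.min())                        # ~1.0001
print("upper bound  =", delta * np.sqrt(3) / sigma.min())   # ~0.173

def vertices(b_vec):
    """Enumerate vertices of {x : Ax <= b_vec} as intersections of
    constraint pairs that satisfy all remaining inequalities."""
    verts = []
    for i, j in itertools.combinations(range(A.shape[0]), 2):
        M = A[[i, j]]
        if abs(np.linalg.det(M)) < 1e-12:
            continue  # parallel constraints, no intersection point
        v = np.linalg.solve(M, b_vec[[i, j]])
        if np.all(A @ v <= b_vec + 1e-9):
            verts.append(v)
    return np.array(verts)

V  = vertices(b)              # rows ~ (1, 1.01), (1, -1.01), (-100, 0)
Vd = vertices(b + delta)      # rows ~ (1.1, 1.111), (1.1, -1.111), (-110, 0)
print(V)
print(Vd)
```

The vertex $(-100,0)$ of $X$ indeed moves to $(-110,0)$ in $X_\delta$, a displacement of $10$.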
What am I doing wrong here?