Any $m\times n$ matrix $A$ has a singular value decomposition, $$A = USV^T\text{,}$$ with singular values $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_p \geq 0$, where $p = \min(m,n)$. There is a method to judge how close to singular $A$ is by computing the ratio of the largest singular value to the smallest, $$c = \frac{\sigma_{\max}}{\sigma_{\min}}\text{.}$$ AFAIK, if $A$ is a square matrix, then $A$ is singular if and only if $\det(A) = 0$, which means it is not invertible; and if $A$ is not square, $A$ is singular if it has zero singular values.
So, how can the ratio $c$ be used to judge how singular $A$ is? If $c = \infty$, is $A$ singular?
I'll use $\| \cdot \|$ to denote a vector norm and also the corresponding induced matrix norm.
Suppose $A$ is invertible, $Ax = b$ with $b \neq 0$, and $A(x + \Delta x) = b + \Delta b$. Subtracting gives \begin{align} & A \Delta x = \Delta b \\ \implies & \Delta x = A^{-1} \Delta b \\ \implies & \| \Delta x \| \leq \| A^{-1} \| \, \| \Delta b \| \\ \implies & \frac{\| \Delta x \|}{\|x\|} \leq \frac{\| A^{-1} \| \, \| \Delta b \|}{\|x\|}\\ \implies & \frac{\| \Delta x \|}{\|x\|} \leq \frac{\| A^{-1} \| \, \| \Delta b \|}{\|x\|} \frac{\|b\|}{\|b\|} \leq \frac{\| A^{-1} \| \, \| \Delta b \|}{\|x\|} \frac{\|A\|\|x\|}{\|b\|} \\ \implies & \frac{\| \Delta x \|}{\|x\|} \leq \|A\| \|A^{-1}\| \frac{\|\Delta b\|}{\|b\|}, \end{align} where the second-to-last step uses $\|b\| = \|Ax\| \leq \|A\| \, \|x\|$.
If the condition number $c = \|A\| \|A^{-1}\|$ is large, then the relative error in $x$ may be large even if the relative error in $b$ is small.
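You can see this amplification numerically. Here is a quick NumPy sketch (the $6\times 6$ Hilbert matrix is just a standard ill-conditioned example I'm choosing for illustration, and the perturbation size is arbitrary): a tiny relative perturbation of $b$ produces a much larger relative error in $x$, but always within the bound $c \, \|\Delta b\|/\|b\|$.

```python
import numpy as np

# Hilbert matrix H[i, j] = 1 / (i + j + 1): a classic ill-conditioned matrix.
n = 6
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1)

x = np.ones(n)
b = A @ x

# Tiny relative perturbation of the right-hand side b.
rng = np.random.default_rng(0)
db = 1e-10 * np.linalg.norm(b) * rng.standard_normal(n)
x_pert = np.linalg.solve(A, b + db)

rel_err_x = np.linalg.norm(x_pert - x) / np.linalg.norm(x)
rel_err_b = np.linalg.norm(db) / np.linalg.norm(b)
cond = np.linalg.cond(A)  # 2-norm condition number by default

print(f"cond(A)            = {cond:.2e}")
print(f"relative error in b = {rel_err_b:.2e}")
print(f"relative error in x = {rel_err_x:.2e}")

# The derived bound: rel_err_x <= cond * rel_err_b.
assert rel_err_x <= cond * rel_err_b
```

The relative error in $x$ comes out orders of magnitude larger than the relative error in $b$, while still respecting the bound.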
If $\| \cdot \|$ is the 2-norm, then $\|A\| = \sigma_{\max}$ (the maximum singular value of $A$) and $\|A^{-1}\| = \frac{1}{\sigma_{\min}}$ so \begin{align} c &= \|A \| \|A^{-1}\| \\ &= \frac{\sigma_{\max}}{\sigma_{\min}}. \end{align}