I'm considering a situation where vectors are nearly linearly dependent.
Suppose that $A$ is the $2 \times 2$ matrix
$A= \left[ \begin{array}{cc} 1 & a \\ a & 1 \end{array} \right] $
If $a=1$, $A$ is of course singular and its columns are linearly dependent. But for $a=0$ and $a=0.999$, $A$ is non-singular in both cases. Still, $a=0.999$ is clearly closer to being singular (i.e., to having linearly dependent columns) than $a=0$: here $\det A = 1 - a^2$, which equals $1$ for $a=0$ but only about $0.002$ for $a=0.999$. So in this $2 \times 2$ case, the determinant can be used to measure the closeness to singularity.
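A quick numerical check of this (a minimal sketch using NumPy; the loop over the two values of $a$ is just for illustration):

```python
import numpy as np

# det(A) = 1 - a^2 for A = [[1, a], [a, 1]], so the determinant
# shrinks toward zero as a approaches 1.
for a in (0.0, 0.999):
    A = np.array([[1.0, a], [a, 1.0]])
    print(f"a = {a}: det(A) = {np.linalg.det(A):.6f}")
```

For $a=0$ this prints a determinant of $1$, and for $a=0.999$ a value of roughly $0.002$.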
Now consider $3 \times 2$ matrices $B_1$ and $B_2$
$B_1 = \left[ \begin{array}{cc} 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{array} \right] $
$B_2 = \left[ \begin{array}{cc} 1 & 0.999 \\ 1 & 1 \\ 1 & 1.001 \end{array} \right] $
Mathematically, the columns of both $B_1$ and $B_2$ are linearly independent. But $B_2$ is closer to being linearly dependent than $B_1$.
My question is: are there any ways to measure the closeness to linear dependence for such non-square matrices? In other words, how can I demonstrate that $B_2$ is closer to being linearly dependent than $B_1$?
Definitely. For square matrices, just compute the determinant: if the result is close to zero, then the columns of your matrix are close to being linearly dependent (though, strictly speaking, they are not exactly linearly dependent). For a non-square matrix $B$, the columns are nearly dependent if the determinant of the Gram matrix $B^T B$ is nearly zero.
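Applying this Gram-determinant test to the two matrices from the question (a short NumPy sketch):

```python
import numpy as np

# det(B^T B) is near zero when the columns of B are nearly dependent.
B1 = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
B2 = np.array([[1.0, 0.999], [1.0, 1.0], [1.0, 1.001]])

for name, B in (("B1", B1), ("B2", B2)):
    gram = B.T @ B               # 2x2 Gram matrix of the columns
    print(f"det({name}^T {name}) = {np.linalg.det(gram):.8f}")
```

Here $\det(B_1^T B_1) = 6$ while $\det(B_2^T B_2)$ is about $6 \times 10^{-6}$, confirming that $B_2$'s columns are far closer to linear dependence.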