I am studying the Gram–Schmidt (GS) procedure and was told that GS can be used to detect linear dependence in the input set. More specifically, GS will produce a basis that is smaller than the original spanning set. My question is: how do you know when to stop? If the $\mathbf{u_i}$ produced on the $i$th iteration is $\mathbf{0}$, does this imply that $\mathbf{v_i}$ is redundant with (i.e., linearly dependent on) the previous $\mathbf{v_j}$'s? Or should we check, at each iteration, that $\mathbf{u_i}$ is indeed orthogonal to the previous $\mathbf{u_j}$'s? Or is this not necessary?
If checking whether $\mathbf{u_i}=\mathbf{0}$ is sufficient, should we stop GS right then and there and extract the basis? Or do we continue on as normal with the remaining vectors?
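To make the question concrete, here is a minimal sketch of the procedure as I understand it. The tolerance `tol` and the behavior of skipping a vector (rather than stopping) are my assumptions; that is exactly the part I am unsure about:

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-10):
    """Modified Gram-Schmidt that skips vectors whose residual is (near) zero."""
    basis = []
    for v in vectors:
        u = np.array(v, dtype=float)
        for b in basis:
            u -= np.dot(u, b) * b   # subtract projection onto each basis vector
        norm = np.linalg.norm(u)
        if norm > tol:              # residual nonzero: v is independent
            basis.append(u / norm)
        # else: u_i ≈ 0, so v lies in the span of the previous vectors; skip it
    return basis

# Example in R^3: the third vector is the sum of the first two,
# so the resulting basis has only two elements.
vs = [[1, 0, 0], [0, 1, 0], [1, 1, 0]]
print(len(gram_schmidt(vs)))  # 2
```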
Thanks!