Let's say that I'm taking measurements from a real-world system, and I have acquired a set of vectors:
Vector 1: $[1.1, 2.4, 2.0]$
Vector 2: $[0.3, 1.2, 1.8]$
Vector 3: $[1.5, 3.6, 3.9]$
In this case, Vector 3 is almost a linear combination of Vector 1 and Vector 2: adding Vector 1 and Vector 2 gives $[1.4, 3.6, 3.8]$, which is pretty close to Vector 3. Given a set of three vectors, is there a way to determine whether one of them is approximately a linear combination of the others? And is there a way to generalize this to $N$ vectors?
To define "approximately", we could say that as long as the total difference between the estimated vector (the vector generated as a linear combination of the other vectors) and the true vector is less than some error tolerance, the true vector is approximately a linear combination of the others. This is one suggestion, but I am open to other interpretations of "approximately".
You can compute the projection of each vector onto the subspace spanned by the other two vectors, then use this to compute the distance of each vector to the subspace. If the distance is small then you have an approximate linear combination.
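A short NumPy sketch of this check (NumPy rather than Octave; the least-squares residual norm is exactly the distance from a vector to the span of the others):

```python
import numpy as np

# The three measured vectors from the question, as rows.
vectors = np.array([
    [1.1, 2.4, 2.0],
    [0.3, 1.2, 1.8],
    [1.5, 3.6, 3.9],
])

# For each vector, the distance to the subspace spanned by the others is
# the norm of the least-squares residual when we try to write it as a
# linear combination of them.
for i in range(len(vectors)):
    others = np.delete(vectors, i, axis=0).T   # columns span the subspace
    coeffs, *_ = np.linalg.lstsq(others, vectors[i], rcond=None)
    distance = np.linalg.norm(others @ coeffs - vectors[i])
    print(f"vector {i + 1}: distance {distance:.3f}, "
          f"coefficients {np.round(coeffs, 2)}")
```

For the third vector the coefficients come out close to $[1, 1]$ and the distance is small relative to the vector's length, matching the observation in the question.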
In general, if you have a ton of vectors and want to study the extent to which they are approximately linearly dependent, you can stack them all into a big matrix and compute its singular value decomposition (SVD). The significance of this is that truncating the SVD at the first $k$ singular values gives the closest rank-$k$ approximation (in either the operator or Frobenius norm) to your matrix. So if your matrix has a bunch of small singular values, it is well approximated by this low-rank truncation, which indicates many approximate linear dependencies, given by the singular vectors associated with the smallest singular values.
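As a sketch of this on synthetic data (hypothetical, since no larger dataset is given here: 20 noisy vectors built from only 3 underlying directions), a sharp drop in the singular value spectrum flags the approximate dependencies:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 20 vectors in R^10 that are noisy linear
# combinations of just 3 underlying directions, so the stacked
# matrix is nearly rank 3.
basis = rng.standard_normal((3, 10))
weights = rng.standard_normal((20, 3))
A = weights @ basis + 0.01 * rng.standard_normal((20, 10))

# Singular values, largest first; everything past the third
# should be tiny, indicating many approximate dependencies.
s = np.linalg.svd(A, compute_uv=False)
print(np.round(s, 3))
```

The cutoff between "large" and "small" singular values plays the role of the error tolerance in the question's definition of "approximately".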
For example, a quick computation in Octave says that the matrix whose rows are those three vectors has singular values approximately $6.8, 0.63, 0.059$. The last of these is small relative to the first, so your matrix is quite close to being rank $2$ (and, since $0.63$ is also fairly small, reasonably close to rank $1$). The left singular vector associated to the smallest singular value is
$$u_3 \approx \left[ \begin{array}{c} -0.57 \\ -0.59 \\ 0.57 \end{array} \right]$$
which encodes the approximate linear dependency $-0.57\,v_1 - 0.59\,v_2 + 0.57\,v_3 \approx 0$, revealing, as you say, that the sum of the first two vectors is approximately the third.
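The Octave computation can be reproduced in NumPy (a sketch; note that `numpy.linalg.svd` returns the left singular vectors as columns of `U`, and the overall sign of each singular vector is arbitrary):

```python
import numpy as np

# Rows are the three measured vectors.
A = np.array([
    [1.1, 2.4, 2.0],
    [0.3, 1.2, 1.8],
    [1.5, 3.6, 3.9],
])

U, s, Vt = np.linalg.svd(A)
print(np.round(s, 3))    # singular values, largest first

u3 = U[:, 2]             # left singular vector for the smallest one
print(np.round(u3, 2))

# u3 gives the coefficients of an approximate dependency among the rows:
# u3[0]*row1 + u3[1]*row2 + u3[2]*row3 is nearly the zero vector, and
# its norm equals the smallest singular value.
print(np.round(u3 @ A, 3))
```

Up to that overall sign, the first two entries of `u3` are opposite in sign to the third, which is exactly the statement that vector 1 plus vector 2 approximately cancels vector 3.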