I know that if I want to prove a set is linearly independent, then I should prove that the equation $c_1v_1 + c_2v_2 + \dots + c_kv_k = 0$ implies $c_1 = c_2 = \dots = c_k = 0$.
However, if I am asked to determine whether a set $S$ is linearly independent or dependent, should I do the same but show that the coefficients need not all be $0$ (i.e. that there are infinitely many solutions), or should I show that one vector is a linear combination of the others?
Both ways are theoretically correct, but in practice you usually have a basis, so you can write every vector $v_i$ as a column of numbers $(b_{1i},\dots,b_{ni})^\top$. The condition for linear independence then reduces to a single homogeneous system of linear equations $B\,c=0$, where $B$ is the matrix whose columns are those coordinate vectors.
Then solve that system by your favourite method (Gaussian elimination is a good candidate) and you obtain a definite answer: either the only solution is $c=0$ (so the set is linearly independent), or there are infinitely many solutions (so the set is linearly dependent).
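For instance, here is a minimal sketch of that check (assuming SymPy is available, with three hypothetical vectors in $\mathbb{R}^3$ chosen for illustration): build $B$ column by column and test whether $Bc=0$ admits a nontrivial solution.

```python
# Sketch: stack the vectors as columns of B and check whether B c = 0
# has only the trivial solution (assumes SymPy; example vectors are made up).
from sympy import Matrix

v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
v3 = Matrix([1, 1, 3])   # v3 = v1 + v2, so this set is dependent

B = Matrix.hstack(v1, v2, v3)

# B c = 0 has a nonzero solution exactly when rank(B) < number of vectors.
if B.rank() == B.shape[1]:
    print("linearly independent")
else:
    print("linearly dependent; nontrivial solutions:", B.nullspace())
```

The rank comparison is just the matrix form of the statement above: full column rank means $c=0$ is the only solution, while a rank deficit means the null space contains nonzero coefficient vectors exhibiting the dependence.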