I'm struggling to solve this question without using the determinant (I'm not allowed to use it). I've tried setting up a matrix with the vectors as columns, putting it into reduced row echelon form, and solving from there, but no matter what I do I end up with a value of $a$ that, when substituted back into the vectors, leaves them linearly independent.
I've also tried considering other properties of linearly dependent vectors, such as the fact that two vectors are linearly dependent if and only if they are scalar multiples of each other, or that a set is linearly dependent if one of the vectors is a linear combination of the others, but I am still unable to produce a valid value of $a$. Any help would be appreciated; thank you in advance.
My attempt: the matrix in RREF $$ \begin{pmatrix} 1 & 0 & a-1 \\ 0 & 1 & 1 \\ 0 & 0 & -3a-1 \\ \end{pmatrix} $$ I believe $-3a-1=0$ should yield the correct solution, but I seem to be wrong and I'm not quite sure why.
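One way to sanity-check a reduction like this is to redo the elimination with exact arithmetic for candidate values of $a$ and watch the bottom-right entry, which vanishes exactly when the columns are dependent. This is only a sketch: the problem statement above doesn't list the vectors, so the matrix $\begin{pmatrix}1&1&a\\0&1&1\\2&a&-1\end{pmatrix}$ used below is an assumption reconstructed from the entries referenced in the accepted answer.

```python
from fractions import Fraction as F

def corner_after_elimination(a):
    """Row-reduce the ASSUMED matrix [[1,1,a],[0,1,1],[2,a,-1]]
    (reconstructed, not given in the question) and return the
    bottom-right entry; it is zero iff the columns are dependent."""
    M = [[F(1), F(1), a],
         [F(0), F(1), F(1)],
         [F(2), a,    F(-1)]]
    # R3 <- R3 - 2*R1  (clear the first column)
    M[2] = [m3 - 2 * m1 for m3, m1 in zip(M[2], M[0])]
    # R3 <- R3 - (a-2)*R2  (clear the second column)
    M[2] = [m3 - (a - 2) * m2 for m3, m2 in zip(M[2], M[1])]
    return M[2][2]

# Candidate from -3a-1=0 versus the answer's a=1/3:
print(corner_after_elimination(F(-1, 3)))  # 2  (nonzero: independent)
print(corner_after_elimination(F(1, 3)))   # 0  (zero: dependent)
```

Under that assumed matrix, the corner entry works out to $1-3a$ rather than $-3a-1$, which would explain why substituting the root of $-3a-1=0$ back in still gives independent vectors.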
Suppose there were a relation $xv_1+yv_2=v_3$ (if one vector is a linear combination of the others, the set is linearly dependent). From the second entries of the vectors, you know $y$ must be $1$. The first entries then give $x+1=a$, and the third entries give $2x+a=-1 \implies 2x+1=-a$. Adding the two equations, $3x+2=0 \implies x=-\frac{2}{3}$. Plugging this back in, $-\frac{2}{3}+1=a \implies a=\frac{1}{3}$.
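The arithmetic above can be verified exactly. Note the vectors $v_1=(1,0,2)$, $v_2=(1,1,a)$, $v_3=(a,1,-1)$ below are an assumption reconstructed from the entry-by-entry relations in this answer (the question doesn't state them), so treat this as a sketch rather than the original problem:

```python
from fractions import Fraction

# Assumed vectors, reconstructed from the relations x+1=a and 2x+a=-1:
a = Fraction(1, 3)
x, y = Fraction(-2, 3), Fraction(1)

v1 = [Fraction(1), Fraction(0), Fraction(2)]
v2 = [Fraction(1), Fraction(1), a]
v3 = [a, Fraction(1), Fraction(-1)]

# x*v1 + y*v2 should reproduce v3 exactly at a = 1/3
combo = [x * c1 + y * c2 for c1, c2 in zip(v1, v2)]
print(combo == v3)  # True: the three vectors are linearly dependent
```

Exact rationals (rather than floats) avoid any doubt about round-off when checking that the combination lands precisely on $v_3$.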