For example, suppose we want to solve for the eigenvectors of:
$$A = \begin{bmatrix} 0 & 1 \\ 2 & -1 \end{bmatrix} $$
We quickly find the eigenvalues are $1$ and $-2$, i.e. $\sigma(A) = \{1, -2\}$.
Then, taking $\lambda_1 = -2$, the equation $Av_1 = \lambda_1 v_1$ reads
$$ \begin{bmatrix} 0 & 1 \\ 2 & -1 \end{bmatrix} \begin{bmatrix}v_{11} \\ v_{12} \end{bmatrix} = \begin{bmatrix} -2v_{11} \\ -2v_{12} \end{bmatrix}$$
We get two equations:
$v_{12} = -2 v_{11}$
$2v_{11} - v_{12} = -2 v_{12}$
Solving either equation yields the same eigenvector (up to a scalar multiple), e.g. $v_1 = \begin{bmatrix} 1 & -2 \end{bmatrix}^T$.
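As a quick numerical sanity check of this result (using NumPy; the matrix and vector are the ones from the example above):

```python
import numpy as np

# Matrix and candidate eigenvector from the example above
A = np.array([[0.0, 1.0],
              [2.0, -1.0]])
v1 = np.array([1.0, -2.0])

# If v1 is an eigenvector for lambda = -2, then A @ v1 must equal -2 * v1
print(A @ v1)   # [-2.  4.]
print(-2 * v1)  # [-2.  4.]
```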
What accounts for this property? Why do we only need to check one equation in this case?
Are there conditions or criteria for when we should check all the equations instead of just a single one?
Your question traces back to how one finds an eigenvector of a square matrix ${\bf{A}}$. Suppose we are looking for nonzero vectors ${\bf{x}}$, called eigenvectors of ${\bf{A}}$, such that
$${\bf{Ax}} = \lambda {\bf{x}}\tag{1}$$
Rewrite the equation in the form
$${\bf{Ax}} = \lambda {\bf{Ix}}\tag{2}$$
where ${\bf{I}}$ is the identity matrix. Now, collecting terms on the left-hand side, we get
$$\left( {{\bf{A}} - \lambda {\bf{I}}} \right){\bf{x}} = {\bf{0}}\tag{3}$$
As you can see, this is a system of linear algebraic equations. What is the requirement for this system to have nonzero solutions? The determinant of the coefficient matrix must vanish, which means
$$\det ({\bf{A}} - \lambda {\bf{I}}) = 0\tag{4}$$
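Equation $(4)$ can be checked directly in code. A small sketch with NumPy, assuming it is available (`np.poly` returns the coefficients of the characteristic polynomial of a square matrix, and `np.roots` finds its roots):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [2.0, -1.0]])

# Coefficients of det(A - lambda*I) = lambda**2 + lambda - 2
coeffs = np.poly(A)
print(coeffs)  # [ 1.  1. -2.]

# The eigenvalues are the roots of the characteristic polynomial
eigenvalues = np.roots(coeffs)
print(np.sort(eigenvalues))  # [-2.  1.]
```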
This is the main equation from which the eigenvalues are found: the eigenvalues are exactly the values of $\lambda$ that make the determinant of ${{\bf{A}} - \lambda {\bf{I}}}$ vanish. When the determinant of this matrix is zero, the equations in $(3)$ are linearly dependent. In your example, the two equations are linearly dependent, as you can easily verify that they are really the same
$$\left\{ \begin{array}{l} {v_{12}} = - 2{v_{11}}\\ 2{v_{11}} - {v_{12}} = - 2{v_{12}} \end{array} \right.\,\,\,\,\,\,\,\, \to \,\,\,\,\,\,\,\left\{ \begin{array}{l} 2{v_{11}} + {v_{12}} = 0\\ 2{v_{11}} - {v_{12}} + 2{v_{12}} = 0 \end{array} \right.\,\,\,\, \to \,\,\,\,\,\left\{ \begin{array}{l} 2{v_{11}} + {v_{12}} = 0\\ 2{v_{11}} + {v_{12}} = 0 \end{array} \right.\tag{5}$$
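The dependence in $(5)$ is equivalent to saying that ${\bf{A}} - \lambda {\bf{I}}$ loses rank at each eigenvalue. A quick check of this, again assuming NumPy:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [2.0, -1.0]])

for lam in (1.0, -2.0):
    M = A - lam * np.eye(2)
    # det(M) = 0 at an eigenvalue, so the two rows of M are
    # linearly dependent and M has rank 1 rather than 2
    print(lam, np.linalg.matrix_rank(M))  # rank is 1 in both cases
```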
In conclusion, I might say that you will never need to use all of the equations in $(3)$. If ${\bf{A}}$ is an $n \times n$ matrix, then at most $n-1$ of the $n$ equations are independent (exactly $n-m$ when the eigenvalue has geometric multiplicity $m$), so you have to use $n-1$ or fewer equations.
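In practice, one rarely solves the system by hand; a library routine such as NumPy's `np.linalg.eig` returns all eigenpairs at once. A sketch (note that the eigenvector columns are normalized to unit length, so they differ from the hand computation by a scalar factor):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [2.0, -1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(np.sort(eigvals))  # [-2.  1.]

# Each column of eigvecs is an eigenvector for the matching eigenvalue,
# i.e. A @ v = lambda * v holds for every column v
for i in range(2):
    print(np.allclose(A @ eigvecs[:, i], eigvals[i] * eigvecs[:, i]))  # True
```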