The Rouché–Capelli theorem is a well-known result that classifies a system of linear equations with $m$ equations and $n$ unknowns. A solution (one or many) exists exactly when the rank of the coefficient matrix equals the rank of the augmented matrix; the relation between $m$ and $n$ then determines how many solutions there are.
Now recall that if $\operatorname{rk}(A) = \operatorname{rk}(A^*) = \min(m,n)$, the system has a unique solution when $m > n$ or $m = n$, and infinitely many solutions when $m < n$ (finitely many, namely $q^{\,n-m}$, if $K = \mathbb{F}_q$ is a finite field).
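As a quick numerical sanity check of this classification, one can compare the two ranks directly. The matrices below are made up for illustration ($m = 3$ equations, $n = 2$ unknowns), and `numpy.linalg.matrix_rank` does the rank computation:

```python
import numpy as np

# Invented example: m = 3 equations, n = 2 unknowns.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [4.0, 6.0]])     # third row = row 1 + row 2, so rk(A) = 2
b = np.array([1.0, 2.0, 3.0])  # consistent, since b_3 = b_1 + b_2

A_aug = np.column_stack([A, b])          # augmented matrix A* = [A | b]
rk_A = np.linalg.matrix_rank(A)
rk_aug = np.linalg.matrix_rank(A_aug)

n = A.shape[1]
if rk_A != rk_aug:
    verdict = "no solution"
elif rk_A == n:
    verdict = "unique solution"
else:
    verdict = "infinitely many solutions"
print(verdict)  # unique solution
```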
Proving the rank equality can be done using the column space and the row space, respectively. For the remainder of this section, let $A \in K^{m \times n}$ be the coefficient matrix, $x \in K^n$ the vector of unknowns, and $b \in K^m$ the right-hand-side vector, so that the linear system reads:
$$Ax = b$$
We first show, via the column space, that $\operatorname{rk}(A)=\operatorname{rk}(A^*)$ yields a solution. The rank equality states precisely that $b$ lies in the column space of $A$, i.e. that $b$ is a linear combination of the columns of $A$ with some coefficient vector $\alpha \in K^n$: $$b = \sum_{i=1}^{n} \alpha_i c_i$$ where $c_i$ is the $i$-th column of $A$. Such a coordinate vector $\alpha$ is exactly a solution vector, so $x = \alpha$. Conversely, if $b$ does not lie in the column space, adjoining it as an extra column increases the rank, so $\operatorname{rk}(A^*)=\operatorname{rk}(A)+1$ and the system has no solution.
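A small numerical illustration of the column-space view, with an invented invertible matrix: the solution $x$ returned by the solver is exactly the coefficient vector $\alpha$ that rebuilds $b$ from the columns of $A$.

```python
import numpy as np

# Invented square, invertible example.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)  # solve Ax = b

# Rebuild b as sum_i x_i * c_i using the columns c_i of A:
b_rebuilt = sum(x[i] * A[:, i] for i in range(A.shape[1]))
print(np.allclose(b_rebuilt, b))  # True
```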
Proving that $\operatorname{rk}(A)=\operatorname{rk}(A^*)$ yields a solution via the row space works differently. Suppose some row $R_k$ of $A$ is a linear combination of the other rows, say $R_k = \sum_{i \neq k} v_i R_i$ with coefficients $v_i \in K$. Appending $b$ to form the augmented matrix preserves this dependence, and hence the rank, only if the right-hand side satisfies the same relation: $b_k = \sum_{i \neq k} v_i b_i$. If it does, equation $k$ is redundant and can be discarded; if not, the augmented row is independent and $\operatorname{rk}(A^*) > \operatorname{rk}(A)$, so the system is inconsistent.
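The row-space consistency condition can be sketched concretely. The numbers below are made up: suppose row 3 of $A$ satisfies $R_3 = 2R_1 + 1R_2$, so consistency forces $b_3 = 2b_1 + 1b_2$.

```python
# Coefficients of the assumed dependence R3 = v1*R1 + v2*R2.
v1, v2 = 2.0, 1.0

def consistent(b, tol=1e-9):
    """The dependence on the rows of A forces b3 = v1*b1 + v2*b2."""
    return abs(b[2] - (v1 * b[0] + v2 * b[1])) < tol

print(consistent([1.0, 4.0, 6.0]))  # True:  6 = 2*1 + 1*4
print(consistent([1.0, 4.0, 7.0]))  # False: 7 != 6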
A brief computational note: checking the rank equality costs essentially the same as solving the system, since both reduce to Gaussian elimination. Checking the equality first and then solving the system therefore does the elimination work twice. In practice one runs a single elimination pass on the augmented matrix and detects inconsistency along the way. Please correct me if this claim is wrong.
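A minimal sketch of that single-pass approach over the reals; `solve_or_report` is a made-up helper, not a library routine. One reduction of $[A \mid b]$ both reveals $\operatorname{rk}(A^*)=\operatorname{rk}(A)+1$ (a zero row of $A$ paired with a nonzero entry of $b$) and produces the solution when one exists:

```python
import numpy as np

def solve_or_report(A, b, tol=1e-12):
    """One Gaussian-elimination pass over the augmented matrix [A | b]."""
    M = np.column_stack([A.astype(float), b.astype(float)])
    m, n = A.shape
    row = 0
    for col in range(n):
        # Partial pivoting: pick the largest entry in this column.
        pivot = row + np.argmax(np.abs(M[row:, col]))
        if abs(M[pivot, col]) < tol:
            continue                      # no pivot in this column
        M[[row, pivot]] = M[[pivot, row]]
        M[row] /= M[row, col]
        for r in range(m):
            if r != row:
                M[r] -= M[r, col] * M[row]
        row += 1
        if row == m:
            break
    # A zero row of A with a nonzero RHS entry means rk(A*) = rk(A) + 1.
    for r in range(row, m):
        if abs(M[r, -1]) > tol:
            return None                   # inconsistent system
    return M[:n, -1] if row == n else "infinitely many"

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
print(solve_or_report(A, np.array([2.0, 3.0])))  # None (inconsistent)
print(solve_or_report(A, np.array([2.0, 2.0])))  # infinitely many
```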