The Rouché–Capelli theorem (also known as the Kronecker–Capelli, Rouché–Fontené, Rouché–Frobenius, or Frobenius theorem) states that for the non-homogeneous system $Ax = b$ in $n$ unknowns,
$(i)$ $Ax = b$ has a unique solution if and only if $rank[A] = rank[A|b] = n$
$(ii)$ $Ax = b$ is inconsistent (i.e., no solution exists) if and only if $rank[A] < rank[A|b]$
$(iii)$ $Ax = b$ has infinitely many solutions if and only if $rank[A] = rank[A|b] < n$
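As a sanity check, the three cases can be verified numerically. This is only an illustrative sketch (the function name `classify` and the example matrices are mine), using sympy's exact-arithmetic `rank`:

```python
# Hypothetical check of the three Rouché–Capelli cases via ranks.
from sympy import Matrix

def classify(A, b):
    """Return 'unique', 'inconsistent', or 'infinite' for Ax = b."""
    n = A.shape[1]                  # number of unknowns
    rA = A.rank()
    rAb = A.row_join(b).rank()      # rank of the augmented matrix [A|b]
    if rA < rAb:
        return 'inconsistent'
    return 'unique' if rA == n else 'infinite'

# (i) invertible system: unique solution
print(classify(Matrix([[1, 0], [0, 1]]), Matrix([1, 2])))   # unique
# (ii) contradictory equations: no solution
print(classify(Matrix([[1, 1], [1, 1]]), Matrix([1, 2])))   # inconsistent
# (iii) dependent rows, consistent: infinitely many solutions
print(classify(Matrix([[1, 1], [2, 2]]), Matrix([1, 2])))   # infinite
```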
How do I derive these conditions?
My Understanding
$(i)$ $Ax=b$ has a unique solution
Assuming $A$ is square: $A^{-1}$ exists $\implies \boxed{x=A^{-1}b} \implies |A|\neq 0 \implies rank(A)=n$
If a solution exists, then $\vec{A}_1x_1+\vec{A}_2x_2+\dots+\vec{A}_nx_n=b\implies b$ is a linear combination of the column vectors
$\implies rank[A|b]=rank[A]=n$
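The chain above can be checked on a concrete invertible matrix. A minimal sketch (the example values are mine, not from the theorem):

```python
# Case (i): for an invertible square A, x = A^{-1} b is the unique
# solution, and rank(A) = rank([A|b]) = n.
from sympy import Matrix

A = Matrix([[2, 1], [1, 3]])        # det = 5, so A is invertible
b = Matrix([3, 5])
x = A.inv() * b                     # the boxed formula x = A^{-1} b

assert A * x == b                   # it really solves the system
n = A.shape[1]
assert A.rank() == A.row_join(b).rank() == n
```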
$(ii)$ $Ax=b$ is inconsistent (i.e., no solution exists)
Multiplying $Ax=b$ on the left by $adj\,A$ gives $\boxed{|A|x=(adj\,A)b}$; if $|A|=0$ while $(adj\,A)b\neq 0$, this equation is impossible, so no solution exists.
$|A|=0\implies rank[A]<n$
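The adjugate argument can be illustrated on a singular example (the matrices below are mine, chosen only to exhibit the situation):

```python
# If |A| = 0 but (adj A) b != 0, then |A| x = (adj A) b has no solution.
from sympy import Matrix, zeros

A = Matrix([[1, 1], [1, 1]])            # singular: det(A) = 0
b = Matrix([1, 2])

assert A.det() == 0
assert A.adjugate() * b != zeros(2, 1)  # (adj A) b is nonzero
# Consistently, the augmented rank exceeds rank(A): the system is inconsistent.
assert A.rank() < A.row_join(b).rank()
```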
Is there a way to prove the last two conditions of the Rouché–Capelli theorem?
[I'll assume the field we are working over is $\mathbb{R}$.]
2) The system is inconsistent iff there are no $x_1,\dots,x_n$ such that $$A_1x_1+\dots+A_nx_n=b.$$
This happens iff adding $b$ as a column to the matrix $A$ increases the rank by 1 ($b$ is linearly independent from the columns of $A$), i.e. iff $\text{rank}(A)<\text{rank}(A|b)$.
3) Clearly there are solutions iff there are $x_1,\dots,x_n$ such that $$A_1x_1+\dots+A_nx_n=b.$$
This happens iff adding $b$ as a column to the matrix $A$ does not increase the rank ($b$ is linearly dependent on the columns of $A$), so $\text{rank}(A)=\text{rank}(A|b)$.

Now we have to determine when the solutions are infinite. The simplest proof I know relies on affine spaces, but I don't know whether you are familiar with them, so I'll stick to linear systems. If you apply Gaussian elimination to $A$ you obtain a matrix in row echelon form with $p$ pivots. The number of pivots of an echelon matrix equals its rank (and row reduction doesn't change the rank of $A$), so $\text{rank}(A)=p$. The pivot columns correspond to unknowns and the non-pivot columns to free parameters, so the set of solutions depends on $n-p=n-\text{rank}(A)$ parameters. The solutions are infinite if and only if there is at least one parameter, i.e. iff $n-\text{rank}(A)>0$, i.e. $\text{rank}(A)<n$.
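The pivot/parameter count can be made concrete with sympy's `rref` (row reduction); the example matrix is mine:

```python
# Number of pivots after row reduction equals rank(A);
# the remaining n - rank(A) columns give free parameters.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])              # second row is twice the first
_, pivots = A.rref()                 # indices of the pivot columns

p = len(pivots)
assert p == A.rank()                 # #pivots == rank(A)
n = A.shape[1]
print(n - p, "free parameter(s)")    # n - rank(A) parameters
```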
Your proof of the first point is wrong if the matrix is allowed to be rectangular. In that case a correct proof would be to notice that a consistent system has a unique solution iff there are no free parameters, i.e. iff $n-\text{rank}(A)=0$, which gives $n=\text{rank}(A)$.
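A small sketch of the rectangular case (example system chosen by me): $A$ is $3\times 2$, so $A^{-1}$ does not exist, yet the solution is unique because $\text{rank}(A)=n$.

```python
# Rectangular A with full column rank: unique solution without A^{-1}.
from sympy import Matrix, symbols, linsolve

A = Matrix([[1, 0], [0, 1], [1, 1]])
b = Matrix([1, 2, 3])                # b = col1 + 2*col2, so consistent

n = A.shape[1]
assert A.rank() == A.row_join(b).rank() == n   # rank condition holds
x, y = symbols('x y')
print(linsolve((A, b), [x, y]))      # a single solution: {(1, 2)}
```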
Here is a more advanced (and elegant) proof of the last proposition that relies on affine spaces. If a linear system is consistent, then its solutions form an affine subspace of $\mathbb{R}^n$ whose direction is the solution set of the associated homogeneous system. An affine subspace has more than one element iff its dimension is non-zero, which happens iff the dimension of its direction is non-zero. The dimension of the direction is by definition $\dim(\ker(A))$. By the rank-nullity theorem:
$$\dim(\ker(A))=n-\text{rank}(A)$$ We want this dimension to be non-zero, so: $$n-\text{rank}(A)>0$$ $$\text{rank}(A)<n$$
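The rank–nullity step can be checked numerically; a minimal sketch with an example matrix of my own:

```python
# dim(ker A) = n - rank(A); a nonzero kernel means the consistent
# system has infinitely many solutions.
from sympy import Matrix

A = Matrix([[1, 1], [2, 2]])         # rank 1, n = 2 unknowns
n = A.shape[1]
kernel = A.nullspace()               # basis of ker(A)

assert len(kernel) == n - A.rank()   # rank–nullity theorem
assert len(kernel) > 0               # direction has dimension > 0
```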