If, through simple row operations such as adding one row to another, I can produce a row of entirely zeros, does this mean that the matrix is singular?
For example:
$$\begin{bmatrix} 1 & 2 & 3 & 4 & 5 \\ 0 & 1 & 2 & 3 & 4 \\ 2 & 3 & 4 & 4 & 0 \\ 5 & 4 & 3 & 2 & 1 \\ 1 & 5 & 4 & 2 & 3\end{bmatrix}$$
can be turned into the following by adding the first row to the fourth row:
$$\begin{bmatrix} 1 & 2 & 3 & 4 & 5 \\ 0 & 1 & 2 & 3 & 4 \\ 2 & 3 & 4 & 4 & 0 \\ 6 & 6 & 6 & 6 & 6 \\ 1 & 5 & 4 & 2 & 3\end{bmatrix}$$
which can then be turned into the following by subtracting the second row from the first:
$$\begin{bmatrix} 1 & 1 & 1 & 1 & 1 \\ 0 & 1 & 2 & 3 & 4 \\ 2 & 3 & 4 & 4 & 0 \\ 6 & 6 & 6 & 6 & 6 \\ 1 & 5 & 4 & 2 & 3\end{bmatrix}$$
Now, by subtracting the first row from the fourth six times, we produce a row of entirely $0$'s:
$$\begin{bmatrix} 1 & 1 & 1 & 1 & 1 \\ 0 & 1 & 2 & 3 & 4 \\ 2 & 3 & 4 & 4 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 1 & 5 & 4 & 2 & 3\end{bmatrix}$$
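The three row operations above can be checked with a short script (pure Python; the helper name `add_row_multiple` is just for illustration):

```python
# The original 5x5 matrix from the question.
M = [[1, 2, 3, 4, 5],
     [0, 1, 2, 3, 4],
     [2, 3, 4, 4, 0],
     [5, 4, 3, 2, 1],
     [1, 5, 4, 2, 3]]

def add_row_multiple(mat, src, dst, k):
    """Add k times row `src` to row `dst` -- a determinant-preserving operation."""
    mat[dst] = [a + k * b for a, b in zip(mat[dst], mat[src])]

add_row_multiple(M, 0, 3, 1)    # R4 += R1    -> (6, 6, 6, 6, 6)
add_row_multiple(M, 1, 0, -1)   # R1 -= R2    -> (1, 1, 1, 1, 1)
add_row_multiple(M, 0, 3, -6)   # R4 -= 6*R1  -> (0, 0, 0, 0, 0)

print(M[3])  # [0, 0, 0, 0, 0]
```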
Since we have managed to produce a row of $0$'s, does this mean that the rank of this matrix is guaranteed to be less than 5, and therefore, by the Invertible Matrix Theorem, that it is not invertible (i.e., singular)?
Is this logic sound, or is there some other way to determine that a large matrix is non-invertible without doing intensive calculations?
You're making this too complicated if your question is just about invertibility. Just compute the determinant. Note that row operations of the form "add a multiple of one row to another" leave the determinant unchanged, and if a row or a column of a matrix consists entirely of $0$'s, then the determinant is $0$; so by the equivalence theorem of linear algebra, the matrix is singular (non-invertible). Of course, this will only work by hand if the matrix is relatively small; otherwise, you'll need a computer algebra system.
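Here is a minimal sketch of that determinant check (pure Python with exact rational arithmetic; the `det` helper is illustrative, not a library function):

```python
from fractions import Fraction

def det(mat):
    """Determinant via Gaussian elimination with exact rational arithmetic."""
    a = [[Fraction(x) for x in row] for row in mat]
    n = len(a)
    sign = 1
    for col in range(n):
        # Find a pivot in this column; if none exists, the matrix is singular.
        pivot = next((r for r in range(col, n) if a[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            sign = -sign  # a row swap flips the sign of the determinant
        for r in range(col + 1, n):
            factor = a[r][col] / a[col][col]
            a[r] = [x - factor * y for x, y in zip(a[r], a[col])]
    # The determinant is the product of the pivots, times the swap sign.
    result = Fraction(sign)
    for i in range(n):
        result *= a[i][i]
    return result

M = [[1, 2, 3, 4, 5],
     [0, 1, 2, 3, 4],
     [2, 3, 4, 4, 0],
     [5, 4, 3, 2, 1],
     [1, 5, 4, 2, 3]]
print(det(M))  # 0
```

The determinant comes out to $0$, confirming the row-reduction argument: in fact, the fourth row of this matrix equals $5R_1 - 6R_2$.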
Interestingly, the same result is obtained if your algorithm produces two identical rows in a square matrix. Indeed, this case reduces to the previous one: just add the negative of one of the two rows to the other. You might think this is just a computational quirk of square matrices, but it has far-reaching consequences. To give just one, it is where the Pauli exclusion principle of quantum mechanics comes from. The antisymmetric wavefunction of a system of fermions (such as electrons in an atom) can be written as a Slater determinant of single-particle states. If two fermions occupy the same quantum state, then two rows (or columns) of the resulting matrix are identical, so the Slater determinant vanishes. Since the squared magnitude of the wavefunction gives the probability density, the probability of finding two fermions in the same state is $0$: it is impossible.
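The identical-rows observation is easy to verify directly on a small example (pure Python; `det3` is just an illustrative cofactor expansion for the $3 \times 3$ case):

```python
# Determinant of a 3x3 matrix by cofactor expansion along the first row.
def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

A = [[1, 2, 3],
     [4, 5, 6],
     [1, 2, 3]]  # rows 1 and 3 are identical

print(det3(A))  # 0
```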
To find out more about this relationship, see any introductory book on quantum mechanics or quantum chemistry, such as Ira Levine's very good *Quantum Chemistry*.