Can every square matrix $A$ be decomposed as a product of a projection $P$ (where $P$ is not trivially the identity matrix) and another matrix $N$? Either written $A=NP$ or $A=PN$, where $P$ and $A$ have the same range. And what conditions must be imposed on $N$ for this to be true? Does this hold more generally for non-square matrices $R$? I am asking because I have been thinking about the following: can I view every singular matrix as containing a projection, so that it loses some of the dimension of its domain, and its determinant therefore becomes $0$?
Edit: I just wanted to thank you guys for making me aware of how to improve my questions. I am a third-semester physics student in Vienna and had some questions which I didn't know who to ask at my university, so I posted them here yesterday. Now I know I have to be more specific when writing my questions. I hope that by asking more questions I will get better at formulating them, and thank you for helping me to overcome my ignorance on this topic :)
As amWhy pointed out, your question is a little ill-posed, but I think that morally the correct answer to...
... is yes; for instance because of the singular value decomposition.
I thought that this followed more directly from a well-known decomposition, but I wasn't able to find it in a few minutes on Google, so I'll write a sketch here. For simplicity I will only address the square case, for which we can use the polar decomposition instead. The ideas are similar for the rectangular case, though. If you haven't already seen it, I would encourage you to look at this animation (shamelessly stolen from Wikipedia) before continuing.
My description will be very algebra-heavy but it is really just formalizing this geometry. You have to imagine that some of the "scaling" in the $\Sigma$ step is scaling by zero, i.e. just killing that "direction". The $\Sigma$ matrix here is very similar to the $\Lambda$ below, so this geometry is suggesting that that is where we should try to pop the projection out, and we just have to do some fiddling to get there.
The usual description of the (left) polar decomposition is $A=HU$, where $H$ is positive-semidefinite Hermitian and $U$ is unitary. If these terms are unfamiliar, you can think of:

- a positive-semidefinite Hermitian matrix as one that only stretches or squashes space along some set of mutually perpendicular axes, by nonnegative amounts; and
- a unitary matrix as one that only rotates and/or reflects, without changing any lengths.
I won't justify why every square matrix has a polar decomposition, but intuitively, it's for the same reason that the complex polar form $z=re^{i\theta}$ exists. Here $H$ is a matrix that acts kind of like a positive real number $r$; and it differs from $A$ by a unitary matrix, in much the same way that $r$ differs from $z$ by the rotation $e^{i\theta}$.
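To make this concrete, here is a small numerical sketch (my own example, not from the answer): the left polar factors can be read off from the SVD $A = W\Sigma V^H$ by taking $H = W\Sigma W^H$ and $U = WV^H$, since then $HU = W\Sigma V^H = A$.

```python
import numpy as np

# Build the left polar decomposition A = H U from the SVD A = W @ diag(s) @ Vh,
# using H = W diag(s) W^H and U = W Vh. The matrix A is an arbitrary
# singular example chosen for illustration.
A = np.array([[2.0, 1.0],
              [0.0, 0.0]])

W, s, Vh = np.linalg.svd(A)
H = W @ np.diag(s) @ W.conj().T   # positive-semidefinite Hermitian
U = W @ Vh                        # unitary

assert np.allclose(H @ U, A)                    # A = H U
assert np.allclose(U @ U.conj().T, np.eye(2))   # U is unitary
assert np.allclose(H, H.conj().T)               # H is Hermitian
```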
Since $H$ is Hermitian, the spectral theorem says it is diagonalizable: $H = S\Lambda S^{-1}$, with $\Lambda$ diagonal (and its entries real and nonnegative, though we won't need that here). Now, you can't stop me from writing $$S\Lambda S^{-1} = (S\Lambda_{*\to 1} S^{-1})(S\Lambda_{0\to 1} S^{-1})$$ where $\Lambda_{*\to 1}$ is the diagonal matrix that is $\Lambda$ but with all nonzero entries replaced by $1$s, and $\Lambda_{0\to 1}$ the diagonal matrix that is $\Lambda$ but with all zeros on the diagonal replaced by $1$s. The equality holds because the inner $S$'s cancel, and then it's just a quick check of the definitions that indeed $\Lambda_{*\to 1}\Lambda_{0\to 1}=\Lambda$. (Note that the nonzero elements are necessarily on the diagonal, but perhaps not all of the elements of the diagonal are nonzero; so $\Lambda_{*\to 1}$ looks like a "partial identity matrix".)
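The "quick check" that $\Lambda_{*\to 1}\Lambda_{0\to 1}=\Lambda$ can also be sanity-checked numerically; here is a minimal sketch with a diagonal matrix of my own choosing:

```python
import numpy as np

# Sketch of the diagonal split: Lambda = Lambda_{*->1} @ Lambda_{0->1}.
Lam = np.diag([3.0, 0.0, 0.5, 0.0])

# Nonzero diagonal entries replaced by 1 (a "partial identity matrix"):
star_to_1 = np.diag([1.0 if d != 0 else 0.0 for d in Lam.diagonal()])
# Zero diagonal entries replaced by 1:
zero_to_1 = np.diag([d if d != 0 else 1.0 for d in Lam.diagonal()])

assert np.allclose(star_to_1 @ zero_to_1, Lam)
```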
This little sleight of hand finds us our projection: it's easy to verify that $\Lambda_{*\to 1}$ is a projection, and thus so is $S\Lambda_{*\to 1} S^{-1}$. (The "Hermitian" adjective means that this projection will be particularly nice, e.g. orthogonal when all the matrix entries are in $\Bbb{R}$, but again we're ignoring this.)
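The "easy to verify" part is that $\Lambda_{*\to 1}^2 = \Lambda_{*\to 1}$, so conjugating by any invertible $S$ preserves idempotence. A quick numerical check (with an arbitrary $S$ of my own choosing):

```python
import numpy as np

# Check that S @ Lambda_{*->1} @ S^{-1} is idempotent for an invertible S.
rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4)) + 4 * np.eye(4)  # nudged toward invertibility
star_to_1 = np.diag([1.0, 0.0, 1.0, 0.0])        # a "partial identity matrix"

P = S @ star_to_1 @ np.linalg.inv(S)

# P @ P = S D S^{-1} S D S^{-1} = S D^2 S^{-1} = S D S^{-1} = P, since D^2 = D.
assert np.allclose(P @ P, P)
```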
As a victory lap, let's write $P$ as shorthand for $S\Lambda_{*\to 1} S^{-1}$, so that we get:
$$ A = P(S\Lambda_{0\to 1} S^{-1})U $$
But notice that $S, U$, and $\Lambda_{0\to 1}$ are all invertible matrices, and so in fact $A$ is the product of a projection matrix and an invertible matrix.
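Putting the whole argument together, here is an end-to-end numerical sketch of the factorization for a particular singular matrix (the example matrix and all variable names are my own choices, not from the answer):

```python
import numpy as np

# End-to-end sketch: factor a singular A as (projection) @ (invertible),
# following the polar + spectral route described above.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1, so det(A) = 0

# Left polar decomposition A = H U, built from the SVD.
W, s, Vh = np.linalg.svd(A)
H = W @ np.diag(s) @ W.conj().T
U = W @ Vh

# H is Hermitian, so it is diagonalizable: H = S Lam S^{-1}.
lam, S = np.linalg.eigh(H)

tol = 1e-12
star_to_1 = np.diag(np.where(np.abs(lam) > tol, 1.0, 0.0))  # nonzeros -> 1
zero_to_1 = np.diag(np.where(np.abs(lam) > tol, lam, 1.0))  # zeros -> 1

S_inv = np.linalg.inv(S)
P = S @ star_to_1 @ S_inv         # the projection
M = S @ zero_to_1 @ S_inv @ U     # the invertible factor

assert np.allclose(P @ P, P)      # P is a projection
assert abs(np.linalg.det(M)) > tol  # M is invertible
assert np.allclose(P @ M, A)      # A = P M
```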
Footnotes:
Jamie Radcliffe pointed out some other properties that you might want from your non-projection factor. I won't try to address all the variants, but I will note that there is also a right polar decomposition $A=UH$, and a very slight reorganization of the above argument thus can put the projection on the right side instead.
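For what it's worth, the right-sided reorganization can be sketched the same way: with the right polar decomposition $A = UH$ and $H = S\Lambda S^{-1}$, writing $\Lambda = \Lambda_{0\to 1}\Lambda_{*\to 1}$ (diagonal matrices commute) puts the projection on the right. A numerical check, again with my own example matrix:

```python
import numpy as np

# Mirror-image sketch: with the right polar decomposition A = U H,
# the same diagonal split puts the projection on the right: A = N @ P.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

W, s, Vh = np.linalg.svd(A)
U = W @ Vh
H = Vh.conj().T @ np.diag(s) @ Vh   # right polar factor: A = U H

lam, S = np.linalg.eigh(H)
tol = 1e-12
star_to_1 = np.diag(np.where(np.abs(lam) > tol, 1.0, 0.0))
zero_to_1 = np.diag(np.where(np.abs(lam) > tol, lam, 1.0))
S_inv = np.linalg.inv(S)

P = S @ star_to_1 @ S_inv       # projection, now on the right
N = U @ S @ zero_to_1 @ S_inv   # invertible factor, on the left

assert np.allclose(N @ P, A)    # A = N P
assert np.allclose(P @ P, P)
```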
My gut is telling me that there is a more elementary argument if all you want is "projection times invertible", where the invertible matrix is not quite Hermitian on account of all those eigenvalues. Maybe it's the orthogonality of the projection that is the bonus you get from the machinery.