Does the singular value decomposition ever fail to exist? The statement of the associated theorem, here from Wikipedia, is surprisingly general. Are there conditions on the matrix $M$ that would make the decomposition fail to exist? The matrix $M$ in question is a random matrix that is point-wise convergent.
Thanks so much!
Every real matrix has an SVD with real entries, and every complex matrix has an SVD with complex entries. The matrix may be rectangular, singular, or even zero; there are no further conditions.
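To illustrate (a small NumPy sketch, not part of the original answer): even a highly degenerate matrix, here a rectangular rank-1 matrix, has a perfectly good SVD, and the factors reconstruct it to machine precision.

```python
import numpy as np

# The SVD exists for any matrix: square or rectangular, full-rank or not.
# M is a 4x3 rank-1 matrix, a deliberately degenerate example.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(4), rng.standard_normal(3))

U, s, Vt = np.linalg.svd(M)  # full SVD: U is 4x4, Vt is 3x3
Sigma = np.zeros_like(M)
Sigma[: len(s), : len(s)] = np.diag(s)

# The reconstruction M = U @ Sigma @ Vt holds to machine precision,
# even though M is singular (rank 1).
assert np.allclose(M, U @ Sigma @ Vt)
print(int(np.sum(s > 1e-12)))  # numerical rank: prints 1
```

What can vary is uniqueness, not existence: the singular values are unique, but $U$ and $V$ are not when singular values repeat.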