How do I distinguish a zero matrix from a matrix of small norm?


Let $A$ be an $n \times n$ positive semidefinite Hermitian matrix of low rank, say $\operatorname{rank}(A) < \frac{1}{10} n$. Let $M$ be an $n \times m$ matrix with orthonormal columns. I wish to compute a basis for the null space of $M^\dagger A$. The usual approach is to compute the SVD of $M^\dagger A$ and treat singular values below some cutoff as zero, but:
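For concreteness, here is the usual SVD-based procedure as I understand it, sketched in NumPy (the function name and default cutoff convention are my own choices; in my setting one would call it with $X = M^\dagger A$):

```python
import numpy as np

def null_space_basis(X, rcond=None):
    """Orthonormal basis for the (right) null space of X.

    Singular values below rcond * s_max are treated as zero;
    by default rcond = max(X.shape) * machine epsilon.
    """
    U, s, Vh = np.linalg.svd(X)
    if rcond is None:
        rcond = max(X.shape) * np.finfo(X.dtype).eps
    cutoff = rcond * (s[0] if s.size else 0.0)
    rank = int(np.sum(s > cutoff))
    return Vh[rank:].conj().T  # columns span the computed null space
```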

Problem. What is the appropriate cutoff? How do I distinguish $M^\dagger A$ from the zero matrix?

In my situation the matrix $M$ is not given directly; it is the result of another computation, so its entries are inexact to begin with. Since $A$ has low rank, it is quite likely that $M^\dagger A = 0$ in exact arithmetic, yet the numerical computation yields a nonzero matrix. Its computed singular values vary over many orders of magnitude, so the usual cutoff, $n \epsilon$ times the largest singular value, makes $M^\dagger A$ appear to have nonzero rank. Scaling the cutoff by the largest eigenvalue of $A$ instead of the largest singular value of $M^\dagger A$ does not make much of a difference.
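A hypothetical reconstruction of the failure mode (real arithmetic for brevity; here $M$ is built to span the orthogonal complement of $\operatorname{range}(A)$, so $M^\dagger A = 0$ in exact arithmetic, and the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 200, 10
B = rng.standard_normal((n, r))
A = B @ B.T                      # PSD Hermitian, rank r < n / 10

# Orthonormal M whose columns span the orthogonal complement of
# range(A), so that M.T @ A = 0 in exact arithmetic.
Q, _ = np.linalg.qr(np.hstack([B, rng.standard_normal((n, n - r))]))
M = Q[:, r:]

s = np.linalg.svd(M.T @ A, compute_uv=False)

# s[0] is pure rounding noise, many orders of magnitude below ||A||_2,
# yet by construction it exceeds the relative cutoff n * eps * s[0],
# so the computed numerical rank comes out nonzero.
cutoff = n * np.finfo(float).eps * s[0]
numerical_rank = int(np.sum(s > cutoff))
```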

On the other hand, taking the computed singular values at face value leads to an inconsistency: since $A$ is positive semidefinite, the column space of $M$ together with the null space of $M^\dagger A$ should span the whole vector space, but if I set the cutoff too low, the computed null space of $M^\dagger A$ comes out too small. Yet even this doesn't quite add up: the large singular values of $M^\dagger A$ should correspond, if anything, to vectors already in the column space of $M$, so omitting them from the null space shouldn't actually matter for the purpose of obtaining a decomposition of the whole vector space.