Time complexity of finding nullspace of a matrix

The problem is finding the nullspace of a singular $n \times n$ matrix $A$ (or, equivalently, computing the eigenvectors corresponding to the eigenvalue $0$).

What algorithm has the lowest time complexity for computing the nullspace?

For example, the SVD does this in $O(n^3)$.
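For reference, here is a minimal sketch of the SVD approach in NumPy (the function name and tolerance are my own choices): the null space is spanned by the right singular vectors whose singular values are numerically zero.

```python
import numpy as np

def nullspace_svd(A, tol=1e-12):
    """Null space of A via the SVD: right singular vectors whose
    singular values fall below tol relative to the largest one."""
    U, s, Vt = np.linalg.svd(A)
    # Rows of Vt paired with near-zero singular values span the null space.
    rank = int(np.sum(s > tol * s[0]))
    return Vt[rank:].T  # shape (n, n - rank)

# Example: a rank-deficient 3x3 matrix (row 3 = row 1 + row 2)
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [5., 7., 9.]])
N = nullspace_svd(A)  # A @ N is numerically zero
```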

1

There are 1 best solutions below

4
On BEST ANSWER

You're definitely going to need a rank-revealing factorization, like RRQR or RRLU. A straight QR or LU will not accomplish this; you need a column pivoting strategy. The literature on this topic is diverse, and I'll be honest: I don't know which algorithm is considered best at this point---perhaps because there is no consensus!
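To make the column-pivoting point concrete, here is a hedged sketch using SciPy's column-pivoted QR (the function name and tolerance are mine; SciPy's `qr` with `pivoting=True` computes $AP = QR$): the diagonal of $R$ reveals the numerical rank, and the null space follows from a triangular solve.

```python
import numpy as np
from scipy.linalg import qr

def nullspace_qrcp(A, tol=1e-12):
    """Null space estimate from a column-pivoted QR factorization A P = Q R.
    The decaying diagonal of R reveals the numerical rank."""
    n = A.shape[1]
    Q, R, piv = qr(A, pivoting=True)
    diag = np.abs(np.diag(R))
    rank = int(np.sum(diag > tol * diag[0]))
    if rank == n:
        return np.zeros((n, 0))
    # Partition R = [R11 R12; 0 ~0]; null vectors of R are
    # [-R11^{-1} R12; I], then undo the column permutation.
    R11, R12 = R[:rank, :rank], R[:rank, rank:]
    Z = np.vstack([np.linalg.solve(R11, -R12), np.eye(n - rank)])
    N = np.zeros((n, n - rank))
    N[piv] = Z  # y[piv[j]] = z[j] maps null vectors of R back to A
    return N

# Same rank-2 example: row 3 = row 1 + row 2
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [5., 7., 9.]])
N = nullspace_qrcp(A)
```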

For a complexity measure, I'll offer one example from "Rank Revealing QR Factorizations" by Tony F. Chan, Yale University, 1985 (PDF here). The algorithm he presents achieves a complexity of $2n^3/3+4n^2r$ flops, where $r$ is the rank. In contrast, he quotes a complexity of $6n^3$ for the SVD.
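To see how Chan's bound compares with the quoted SVD figure, one can simply plug in numbers (these are the leading-order flop counts quoted above, not measurements):

```python
def rrqr_flops(n, r):
    # Chan (1985): roughly 2n^3/3 + 4 n^2 r flops for the RRQR
    return 2 * n**3 / 3 + 4 * n**2 * r

def svd_flops(n):
    # Chan's quoted figure for a full SVD
    return 6 * n**3

# Even in the worst case r = n, RRQR costs (2/3 + 4) n^3 = 14n^3/3,
# still below the 6n^3 quoted for the SVD.
n = 1000
ratios = {r: rrqr_flops(n, r) / svd_flops(n) for r in (1, n // 2, n)}
```

So by these counts the RRQR is cheaper than the SVD for every rank, and markedly so when $r \ll n$.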

EDIT: I was thinking about this more, and I believe the TRP variant of LUSOL is likely to be one of the more efficient readily available algorithms out there. Note, however, that it is designed for sparse matrices.

For dense matrices, I suspect this paper, which presents a rank-revealing LU factorization, comes close to providing the best choice. Here is a paper that builds upon the work of Pan but is freely available.
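As a concrete (if naive) illustration of the RRLU idea, here is Gaussian elimination with complete pivoting in plain NumPy. Complete pivoting is the classical way to make LU rank-revealing in practice; it costs $O(n^3)$ and is *not* the faster scheme of the cited papers. The function name and tolerance are my own.

```python
import numpy as np

def lu_complete_pivoting_rank(A, tol=1e-12):
    """Numerical rank via Gaussian elimination with complete pivoting.
    Elimination stops when the next pivot is negligible relative to
    the first pivot."""
    U = A.astype(float).copy()
    n = U.shape[0]
    rank = 0
    first_pivot = None
    for k in range(n):
        # Complete pivoting: pick the largest entry of the trailing block.
        sub = np.abs(U[k:, k:])
        i, j = np.unravel_index(np.argmax(sub), sub.shape)
        pivot = sub[i, j]
        if first_pivot is None:
            first_pivot = pivot
        if pivot <= tol * first_pivot:
            break  # remaining block is numerically zero
        U[[k, k + i]] = U[[k + i, k]]        # row swap
        U[:, [k, k + j]] = U[:, [k + j, k]]  # column swap
        # Eliminate below the pivot.
        U[k + 1:, k:] -= np.outer(U[k + 1:, k] / U[k, k], U[k, k:])
        rank += 1
    return rank

# Rank-2 example: row 3 = row 1 + row 2
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [5., 7., 9.]])
r = lu_complete_pivoting_rank(A)
```

Once the rank $r$ is known, the null space has dimension $n - r$ and can be recovered from the triangular factors as in the QR case.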

Finally, note that the theoretical complexity bound of LU factorization matches that of matrix multiplication. Hence there exist, in theory, methods for LU factorization that achieve, say, $O(n^{2.5})$ complexity or better. Whether anyone has actually used one in an RRLU setting I do not know (but I doubt it).