Computationally more efficient technique than Singular Value Decomposition.


I am working on a mathematical project where I have to decompose a given matrix into two or more matrices. At present I am using the Singular Value Decomposition (SVD). I have learned that the SVD is computationally expensive (for an $m \times n$ matrix its complexity is $\mathcal{O}(\min(m,n)\, m n)$). Can anyone suggest a technique that is more efficient than the SVD?

I have to apply the decomposition algorithm in signal processing: I am given a signal in matrix form, and I have to decompose it into two or more non-negative matrices.

Edit. In the singular value decomposition, one of the factor matrices contains most of the information about the signal. I am therefore looking for a non-negative matrix factorization technique in which one of the factor matrices likewise contains most of the information about the given matrix.
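As a concrete starting point for the edit above, here is a minimal sketch of non-negative matrix factorization via the classic Lee-Seung multiplicative updates, in plain NumPy (the function name and parameters are illustrative, not a library API):

```python
# Sketch: NMF by multiplicative updates (Lee & Seung). Both updates
# preserve non-negativity because they only multiply by non-negative ratios.
import numpy as np

def nmf(X, k, iters=200, eps=1e-9, seed=0):
    """Approximately factor a non-negative X (m x n) as W @ H,
    with W (m x k) >= 0 and H (k x n) >= 0."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k))
    H = rng.random((k, n))
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)  # update H, stays non-negative
        W *= (X @ H.T) / (W @ H @ H.T + eps)  # update W, stays non-negative
    return W, H

rng = np.random.default_rng(1)
X = rng.random((6, 4)) @ rng.random((4, 5))  # non-negative matrix of rank 4
W, H = nmf(X, k=4)
print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))  # relative error (should be small)
```

Here `k` plays the same role as the truncation rank in an SVD: the smaller factor `W` concentrates most of the information about `X`. For production use a tuned implementation (e.g. `sklearn.decomposition.NMF`) would be preferable.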


There are 2 answers below.


The singular value decomposition cannot be computed in a finite number of exact arithmetic steps for most matrices. The reason is that computing the SVD of a matrix $A$ reduces to an eigenvalue problem for a related matrix (for instance $A^T A$). Since the eigenvalues of a matrix are the roots of its characteristic polynomial, computing the SVD amounts, in principle, to finding the roots of a polynomial.

While this may seem simple, the Abel-Ruffini theorem tells us that there is no general closed-form formula for the roots of a polynomial of degree five or higher. Therefore, for all but the smallest cases, the eigenvalue problem can only be solved by iterative methods, and hence the same is true for finding the SVD of a general matrix.
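To make "iterative methods" concrete, here is the simplest such method, the power iteration, which approximates the dominant eigenpair (practical SVD codes use far more sophisticated iterations, e.g. Golub-Kahan bidiagonalization followed by QR iteration; this is purely illustrative):

```python
# Sketch: power iteration for the dominant eigenpair of a square matrix.
import numpy as np

def power_iteration(A, iters=1000):
    """Approximate the largest-magnitude eigenvalue of A and its eigenvector."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)   # renormalize to prevent overflow
    lam = v @ A @ v              # Rayleigh quotient (A here is symmetric)
    return lam, v

A = np.array([[2.0, 1.0], [1.0, 3.0]])
lam, v = power_iteration(A)
print(lam)  # converges to (5 + sqrt(5)) / 2, the larger eigenvalue of A
```

Each step only improves the estimate; no finite number of steps gives the exact answer, which is exactly the situation the Abel-Ruffini theorem forces on us.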

However, many other matrix decompositions exist. If your matrix $A$ is square, you can use the QR factorization to write $A = QR$, where $Q$ is an orthogonal matrix and $R$ is upper triangular. If $A$ is invertible, this decomposition is unique once we require the diagonal entries of $R$ to be positive.

Let's assume $A$ is invertible, i.e. its columns are linearly independent. Using the Gram-Schmidt process, we can then construct an orthonormal basis from the columns of $A$; these vectors form the columns of $Q$. The matrix $R$ records the coefficients of this process, so that multiplying $Q$ on the right by $R$ recovers the original columns of $A$. Stable and fast algorithms for this computation exist (in practice, Householder reflections are preferred over Gram-Schmidt for numerical stability).
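The Gram-Schmidt construction just described can be sketched in a few lines (illustrative classical Gram-Schmidt; as noted, a Householder-based routine is the stable choice in practice):

```python
# Sketch: QR factorization by classical Gram-Schmidt.
import numpy as np

def gram_schmidt_qr(A):
    """QR factorization of A, assuming its columns are linearly independent."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]  # coefficient along an earlier basis vector
            v -= R[i, j] * Q[:, i]       # remove that component
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]            # normalize the remainder
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))             # True: Q times R recovers A
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: columns of Q are orthonormal
```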

The QR factorization in fact generalizes to non-square matrices; see the Wikipedia article for details. The point is that a QR decomposition should, in general, be faster than the SVD: a Householder QR of an $m \times n$ matrix ($m \geq n$) takes a fixed $\mathcal{O}(m n^2)$ operations, with no iterative eigenvalue computation involved.
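For non-square matrices, the standard library routine can be used directly (NumPy's Householder-based `numpy.linalg.qr`, which returns the "reduced" factors by default):

```python
# QR of a tall (non-square) matrix via NumPy's Householder-based routine.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))  # tall m x n matrix, m > n
Q, R = np.linalg.qr(A)             # reduced factors: Q is 100 x 5, R is 5 x 5
print(Q.shape, R.shape)
print(np.allclose(Q @ R, A))       # True: the factorization reconstructs A
```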


The most promising direction you could look at is probabilistic (randomized) matrix factorization; your matrices almost surely have some nice property (Gaussian entries, sparsity, low rank because of correlation between signals, etc.), especially if they come from signal analysis.

A reference for these algorithms is the recent work by J. Tropp and his collaborators; see his talks page, especially the slides "Finding structure in randomness" (ICML 2014). The main algorithms are quite simple to implement, just a couple of lines of Matlab/Octave/Python/..: essentially, you add a first layer of random column sampling in order to reduce the complexity (open-source implementations of these algorithms exist, e.g. of the Randomized Range Finder).
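The Randomized Range Finder mentioned above really is only a couple of lines; here is a hedged NumPy sketch (function and variable names are my own, following the Halko-Martinsson-Tropp presentation):

```python
# Sketch: Randomized Range Finder. Multiply A by a random Gaussian test
# matrix to sample its column space, then orthonormalize the samples.
import numpy as np

def randomized_range_finder(A, k, p=10, seed=0):
    """Return an orthonormal Q of m x (k+p) columns whose range
    approximates the range of A (k = target rank, p = oversampling)."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + p))  # random test matrix
    Y = A @ Omega                                     # sampled column space
    Q, _ = np.linalg.qr(Y)                            # orthonormal basis for Y
    return Q

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 15)) @ rng.standard_normal((15, 100))  # rank 15
Q = randomized_range_finder(A, k=15)
print(np.linalg.norm(A - Q @ (Q.T @ A)) / np.linalg.norm(A))  # ~ 0 for a rank-15 A
```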

Under some hypotheses about your matrices, the complexity $\mathcal{O}(\min(m,n)\, m n)$ can be reduced to $\mathcal{O}((k+p)\, m n)$, where $k$ is the numerical rank of your matrix and $p$ is a small oversampling parameter (usually $p \leq 20$ works perfectly well). So for small-rank matrices, i.e. $k \ll \min(m,n)$, the savings are substantial.
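The claimed $\mathcal{O}((k+p)\, m n)$ cost can be sketched end to end: sample the range with $k+p$ random columns, project onto it, and take an exact SVD of the small projected matrix (an illustrative sketch, not a library routine):

```python
# Sketch: randomized truncated SVD. The dominant cost is the two
# products A @ Omega and Q.T @ A, each O((k+p) m n).
import numpy as np

def randomized_svd(A, k, p=10, seed=0):
    """Rank-k SVD approximation of A with oversampling parameter p."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], k + p))
    Q, _ = np.linalg.qr(A @ Omega)       # orthonormal basis for the sampled range
    B = Q.T @ A                          # small (k+p) x n projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)  # cheap exact SVD of B
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]

rng = np.random.default_rng(2)
A = rng.standard_normal((300, 20)) @ rng.standard_normal((20, 150))  # rank 20
U, s, Vt = randomized_svd(A, k=20)
print(np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A))  # ~ 0
```

When $k \ll \min(m,n)$, this replaces the full iterative SVD of $A$ by an SVD of a $(k+p) \times n$ matrix, which is where the complexity reduction comes from.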