Factorize positive definite symmetric matrix


Suppose we are given a positive definite symmetric matrix of size $(N,N)$. For some reason I have to factorize this matrix: I am already aware of the Cholesky factorization, but I am wondering whether there is a procedure for factorizing the matrix based on eigenvectors and eigenvalues.

I am not a mathematician, so my background on the topic is rather limited. I need the factorization within a MATLAB code, and the Cholesky factorization is quite expensive in time, so I would very much appreciate a more practical (but still effective) way to compute the lower triangular matrix.

Any support will be more than welcome.

Best answer:

Edit: Another decomposition that produces an upper triangular matrix (and therefore a lower triangular one if you apply it to the adjoint first) is the QR decomposition, which is based on Gram-Schmidt. Of course, there is also the general $LU$ decomposition, but Cholesky is twice as efficient. I don't know how the efficiency of QR compares with that of Cholesky.
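For concreteness, both factorizations are single library calls in practice. Here is a small NumPy sketch (the MATLAB analogues are `chol` and `qr`); the test matrix construction is just an assumption to get a positive definite example:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)  # symmetric positive definite by construction

# Cholesky: lower triangular L with A = L L^t
L = np.linalg.cholesky(A)
assert np.allclose(L @ L.T, A)

# QR of a general matrix: orthogonal Q, upper triangular R with M = Q R
Q, R = np.linalg.qr(M)
assert np.allclose(Q @ R, M)
assert np.allclose(Q.T @ Q, np.eye(5))
```

Note that the QR call factors a general matrix; turning it into a symmetric factorization of $A$ takes an extra step, as discussed in the answers below.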

Now here is my previous answer, which did not answer the question...

Since you said symmetric, I assume your matrix is real.

Every positive definite symmetric (real, square) matrix is diagonalizable in an orthonormal basis.

Thus there exists $P$ invertible (and actually, $P$ can be taken to be an orthogonal matrix, which means $P^tP=PP^t=I_n$) such that $$ A=PDP^{-1}=PDP^t $$ where $D$ is diagonal and the diagonal coefficients of $D$ are the eigenvalues of $A$, which are all positive.

Note that $P$ is simply a matrix whose columns constitute an orthonormal basis made of eigenvectors.

The same is true for positive definite complex matrices (they are not necessarily symmetric; rather, they are self-adjoint, i.e. Hermitian). Just replace orthogonal by unitary (i.e. $P^*P=PP^*=I_n$) and $P^t$ by $P^*$.
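A minimal NumPy sketch of this spectral decomposition (`numpy.linalg.eigh` plays the role of MATLAB's `eig` for symmetric input; the test matrix is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)  # symmetric positive definite test matrix

# eigh returns the eigenvalues w and an orthogonal P whose columns are eigenvectors
w, P = np.linalg.eigh(A)
assert np.all(w > 0)                          # positive definiteness
assert np.allclose(P.T @ P, np.eye(4))        # P is orthogonal
assert np.allclose(P @ np.diag(w) @ P.T, A)   # A = P D P^t

# one factorization this yields: B = P D^{1/2} P^t is a square root of A
B = P @ np.diag(np.sqrt(w)) @ P.T
assert np.allclose(B @ B, A)
```

Note that $B$ here is symmetric, not triangular, so this is not a drop-in replacement for the Cholesky factor.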

Answer:

Edit: I did not see the edit above; my answer provides a comparison between Cholesky and QR.

Once you have $A=(PD^{1/2}P^t)(PD^{1/2}P^t)^t$, you can perform a $QR$ decomposition (using Householder, for instance) on $B=(PD^{1/2}P^t)^t=QR$. Then, since $A=B^tB$, you get:

$$A = R^tQ^tQR = R^tR,$$ and thus your factorization.
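A hedged NumPy sketch of this route (the test matrix is an assumption; note that NumPy's `qr` may return an $R$ with some negative diagonal entries, which does not affect $A=R^tR$):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)  # symmetric positive definite test matrix

# square root B = P D^{1/2} P^t of A, built from the eigendecomposition,
# so that A = B^t B (B is symmetric here)
w, P = np.linalg.eigh(A)
B = P @ np.diag(np.sqrt(w)) @ P.T

# QR of B: A = B^t B = R^t Q^t Q R = R^t R
Q, R = np.linalg.qr(B)
assert np.allclose(R.T @ R, A)
```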

However, if memory serves, Householder is more expensive than Cholesky, and there is no known way to compute $QR$ in fewer than $4n^3/3$ operations (whether with Givens, Householder, or the Cholesky-based trick below), whereas Cholesky costs $n^3/3$. So even though this technically works, I'm afraid it won't be of much help.

PS: using Gram-Schmidt to compute $QR$ is simpler, but awful in terms of complexity and numerical stability: $2n^3$ operations. The fact that forming $B^tB$ and applying Cholesky is one of the fastest ways to compute $QR$ (though not great numerically) is a sign that $QR$ probably isn't the answer.

Answer:

I wanted to comment on the LU remarks above, but I'm pretty new around here and I didn't know how.

LU is less efficient than Cholesky by a factor of 2 ($2n^3/3$ vs. $n^3/3$ operations).

As for the square-root problem with Cholesky: it is possible to compute a diagonal $D$ and an upper triangular $U$ such that $R=\sqrt{D}\,U$ and $A=R^tR$, using a modified, square-root-free version of Cholesky in which all square roots are postponed to a final diagonal scaling:

Input: a symmetric positive definite matrix $A\in \mathbb R^{n\times n}$.
Output: an upper triangular matrix $R$ with positive diagonal coefficients such that $A=R^t\cdot R$.

    U := A
    For k := 1 To n:
        For i := 1 To k-1:
            For j := 1 To i-1:
                u[i,k] := u[i,k] - mu[j,i] * u[j,k]
            mu[i,k] := u[i,k] / u[i,i]
        u[k,k] := u[k,k] - sum_{i=1}^{k-1} mu[i,k] * u[i,k]
        d[k] := 1 / u[k,k]
    Return R := sqrt(diag(d)) * triu(U)
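The pseudocode above can be transcribed to NumPy more or less line by line. This is a sketch for checking the recurrence, not an optimized implementation (the Python-level triple loop is slow; the test matrix is an assumption):

```python
import numpy as np

def sqrt_free_cholesky(A):
    """LDL^t-style elimination that postpones all square roots
    to a final diagonal scaling, as in the pseudocode above."""
    n = A.shape[0]
    U = A.astype(float).copy()
    mu = np.zeros((n, n))
    d = np.zeros(n)
    for k in range(n):
        for i in range(k):
            for j in range(i):
                U[i, k] -= mu[j, i] * U[j, k]
            mu[i, k] = U[i, k] / U[i, i]
        U[k, k] -= sum(mu[i, k] * U[i, k] for i in range(k))
        d[k] = 1.0 / U[k, k]
    # R = sqrt(diag(d)) * triu(U): row i of triu(U) is scaled by 1/sqrt(u_ii),
    # which turns the unnormalized entries into the Cholesky factor
    return np.diag(np.sqrt(d)) @ np.triu(U)

rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)  # symmetric positive definite test matrix
R = sqrt_free_cholesky(A)
assert np.allclose(R.T @ R, A)                # A = R^t R
assert np.allclose(R, np.triu(R))             # R is upper triangular
assert np.all(np.diag(R) > 0)                 # positive diagonal
```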