Relation between Cholesky and SVD


When we have a symmetric positive-definite matrix $A$, we can factor it as $A = LL^*$ and obtain $L$ using the Cholesky decomposition of $A$ ($L^*$ is the conjugate transpose of $L$).

Can anyone tell me how we can get this same $L$ using the SVD or an eigendecomposition?

Thank you.


There are 5 answers below.


Alternatively, you can use the LU decomposition.

Anyhow, you don't normally calculate the Cholesky decomposition from the eigendecomposition or SVD; you use a variant of Gaussian elimination. See something like Matrix Computations (Golub and Van Loan).
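To illustrate the point, here is a minimal NumPy sketch that computes the Cholesky factor directly, with no SVD or eigendecomposition involved (the matrix $A$ is a hypothetical random positive-definite example, not from the question):

```python
import numpy as np

# Hypothetical example: build a random symmetric positive-definite matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)

# np.linalg.cholesky uses a (blocked) elimination-style algorithm internally.
L = np.linalg.cholesky(A)          # lower triangular, A = L @ L.T
print(np.allclose(L @ L.T, A))     # True
```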


Provided $A$ is symmetric positive definite, its SVD coincides with its eigendecomposition and gives $$A = \sum_i \lambda_i v_i v_i^T,$$ where the $v_i$ are orthonormal eigenvectors. This is because $A$ is symmetric.

Take $x_i = \sqrt{\lambda_i}\,v_i$ (well defined since $\lambda_i > 0$ because $A$ is positive definite), and let $X = [x_1 \;\cdots\; x_n]$, i.e. each column of $X$ is one of the $x_i$. Then $$A = \sum_i x_i x_i^T = X X^T.$$

(To prove that $\sum_i x_i x_i^T = X X^T$, use block multiplication, treating each $x_i$ as a block.)
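A short NumPy sketch of this construction, on a hypothetical random positive-definite matrix (not one from the question):

```python
import numpy as np

# Hypothetical random symmetric positive-definite example.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)

lam, V = np.linalg.eigh(A)        # A = sum_i lam_i v_i v_i^T
X = V * np.sqrt(lam)              # column i is x_i = sqrt(lam_i) * v_i
print(np.allclose(X @ X.T, A))    # True
```

Note that this $X$ satisfies $A = XX^T$ but is dense in general, not the triangular factor that Cholesky produces.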

In practice, it's probably faster to use Gaussian Elimination.


There is an interesting relationship between the eigendecomposition of a symmetric positive-definite matrix and its Cholesky factor. Say $A = L L'$ with $L$ the Cholesky factor, and $A = E D E'$ the eigendecomposition. Then $L = E D^{\frac{1}{2}} F$ for some orthogonal matrix $F$; that is, the Cholesky factor is a rotated form of the matrix of eigenvectors scaled by the diagonal matrix of square-root eigenvalues. So you can get $L$ from $E D^{\frac{1}{2}}$ through a series of orthogonal rotations aimed at making the elements above the diagonal zero.
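Those rotations can be collapsed into a single QR decomposition: writing $M = E D^{\frac{1}{2}}$ and factoring $M^T = QR$ gives $M = R^T Q^T$, so choosing $F = Q$ yields $L = MF = R^T$, which is lower triangular. A NumPy sketch, using a hypothetical random positive-definite matrix:

```python
import numpy as np

# Hypothetical random symmetric positive-definite example.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)

w, E = np.linalg.eigh(A)          # A = E D E'
M = E * np.sqrt(w)                # M = E D^{1/2}, so M @ M.T == A

Q, R = np.linalg.qr(M.T)          # M.T = Q R  =>  M = R.T Q.T
L = R.T * np.sign(np.diag(R))     # F = Q; flip column signs so diag(L) > 0

print(np.allclose(L, np.linalg.cholesky(A)))   # True
```

The sign flip is needed because QR only determines the factor up to the signs of its diagonal, while the Cholesky convention fixes a positive diagonal.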


I am not sure why anyone would want to obtain a Cholesky decomposition from an SVD or an eigendecomposition, but anyway, let's say $A$ is positive definite:

  • As $A$ is positive definite, if $A=U\Sigma V^\ast$ is an SVD of $A$, we must have $U=V$ (exercise). Perform a QR decomposition of $\sqrt{\Sigma}U^\ast$, i.e. write $\sqrt{\Sigma}U^\ast=QR$ for some unitary matrix $Q$ and some upper triangular matrix $R$. Then $A=R^\ast R$ is a Cholesky decomposition of $A$.
  • If $A=PDP^{-1}$ is an eigendecomposition of $A$, perform a QR decomposition or Gram-Schmidt orthogonalization on each group of columns of $P$ that correspond to the same eigenvalue. This yields a set of orthonormal eigenvectors of $A$, i.e. a unitary matrix $U$ such that $A=UDU^\ast$, and we can apply the previous method to obtain a Cholesky decomposition $A=R^\ast R$.
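A NumPy sketch of the first bullet, on a hypothetical random positive-definite matrix (the sign flip fixes the QR factor to the positive-diagonal Cholesky convention):

```python
import numpy as np

# Hypothetical random symmetric positive-definite example.
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)

U, sig, Vt = np.linalg.svd(A)                      # for SPD A, U == V
Q, R = np.linalg.qr(np.sqrt(sig)[:, None] * U.T)   # QR of sqrt(Sigma) U*
R = np.sign(np.diag(R))[:, None] * R               # make diag(R) > 0

print(np.allclose(R.T @ R, A))                     # True: A = R* R
print(np.allclose(R.T, np.linalg.cholesky(A)))     # True
```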

If you have the SVD of a positive semi-definite matrix, you can easily rewrite it as $L L^*$. However, this isn't the $L$ that the Cholesky decomposition would have computed.

$$\begin{align}
A &= U\Sigma V^* && \textrm{SVD def.}\\
U &= V && \textrm{since $A$ is symmetric} \\
A &= \left(U \sqrt{\Sigma}\right) \left(U \sqrt{\Sigma}\right)^* && \textrm{note that $\sqrt{\Sigma}$ is easily computed as it's diagonal} \\
A &= L L^* && \textrm{with}\\
L &= U \sqrt{\Sigma}
\end{align}$$

Though this isn't the same $L$ as Cholesky computes (since it's not triangular), it does satisfy $A = L L^*$, which may be enough to be useful to some.
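In NumPy this reads as follows, using a hypothetical random positive-definite matrix for illustration:

```python
import numpy as np

# Hypothetical random symmetric positive-definite example.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)

U, sig, Vt = np.linalg.svd(A)
L = U * np.sqrt(sig)              # L = U sqrt(Sigma), dense in general

print(np.allclose(L @ L.T, A))    # True: a valid A = L L^* factorization
```

Unlike `np.linalg.cholesky(A)`, this `L` is generally dense rather than lower triangular.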