Find diagonal of inverse matrix


I have computed the Cholesky factorization of a positive semidefinite matrix $\Theta$. However, I wish to know the diagonal elements $\Theta^{-1}_{ii}$ of the inverse of $\Theta$. Is it possible to do this using the Cholesky factor that I have computed? Or will finding the eigenvalues alone (without the orthonormal matrices of an SVD) help? Are there any other suggestions or alternative decompositions that would aid in finding the diagonal of the inverse?

I've seen that random projections do wonders for inverting matrices. Could something like this be applied here?

There are 4 answers below.

---

If you have the Cholesky decomposition, you can easily compute the whole matrix inverse. Since

$$\Theta = R^* R$$

where $R$ is upper triangular, you can find $\Theta^{-1}$ by solving

$$R^* R X = I$$

where $I$ is the identity. The latter system can be solved by forward and backward substitution.

If you only want the diagonal entries of $X$, you could save perhaps half the computation by stopping the backward substitution process (for each column) when you get to the diagonal entry.
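This approach can be sketched in NumPy/SciPy; the small test matrix and variable names below are illustrative only:

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Theta = A @ A.T + 5 * np.eye(5)        # illustrative positive definite matrix

# Cholesky factorization Theta = R^* R with R upper triangular
R = cholesky(Theta, lower=False)

# Solve R^* R X = I: forward substitution, then backward substitution
Y = solve_triangular(R.T, np.eye(5), lower=True)    # R^* Y = I
X = solve_triangular(R, Y, lower=False)             # R X = Y

diag_inv = np.diag(X)                  # diagonal of Theta^{-1}
```

Stopping each backward solve at the diagonal entry, as suggested above, would require writing the substitution loop by hand rather than calling `solve_triangular`.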

---

I think we cannot get below complexity $\sim n^3/3$. Indeed, let $\Theta=LL^*$ where $L$ is lower triangular. Clearly $\Theta^{-1}={L^*}^{-1}L^{-1}$; let $L^{-1}=[u_{i,j}]$. Then

$$(\text{F})\qquad {\Theta^{-1}}_{i,i}=\sum_{j\geq i}|u_{j,i}|^2.$$

So we must know the $|u_{j,i}|$, and (in my opinion) in practice the $u_{j,i}$ themselves, that is, $L^{-1}$. The complexity of computing $L^{-1}$ is $\sim n^3/3$ (cf. "A Fast Triangular Matrix Inversion" by R. Mahfoudhi). The second step, computing the ${\Theta^{-1}}_{i,i}$ via (F), is then only $O(n^2)$.
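Formula (F) says the diagonal of $\Theta^{-1}$ consists of the squared column norms of $L^{-1}$, which is easy to verify numerically (NumPy sketch, illustrative matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
Theta = A @ A.T + 6 * np.eye(6)   # illustrative positive definite matrix

L = np.linalg.cholesky(Theta)     # Theta = L L^*, L lower triangular
U = np.linalg.inv(L)              # U = L^{-1}, also lower triangular

# (F): (Theta^{-1})_{ii} = sum_{j >= i} |u_{j,i}|^2,
# i.e. the squared norms of the columns of L^{-1}
diag_inv = np.sum(np.abs(U) ** 2, axis=0)
```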

---

Tang and Saad have a method that uses random vectors (not necessarily projections):

A Probing Method for Computing the Diagonal of the Matrix Inverse
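Their probing construction is more refined than this, but the basic idea of estimating a diagonal from random vectors can be sketched with a plain Rademacher-vector estimator in the spirit of Bekas, Kokiopoulou and Saad (the function name and parameters below are my own, not from the paper):

```python
import numpy as np

def estimate_diag_inverse(Theta, n_samples=2000, seed=None):
    """Estimate diag(Theta^{-1}) from random sign vectors.

    Uses the estimator diag(M) ~ (1/K) * sum_k v_k * (M v_k)
    with Rademacher (+-1) probes v_k, where M v_k is obtained by
    solving Theta x = v_k (in practice via a prefactored Cholesky).
    """
    rng = np.random.default_rng(seed)
    n = Theta.shape[0]
    acc = np.zeros(n)
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe
        x = np.linalg.solve(Theta, v)         # x = Theta^{-1} v
        acc += v * x                          # elementwise product
    return acc / n_samples
```

The estimator is unbiased because $E[vv^T]=I$; its variance grows with the off-diagonal mass of $\Theta^{-1}$, which is what the paper's probing vectors are designed to exploit.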

---

I stumbled onto this question while trying to answer a similar one:

I want a diagonal matrix that best approximates the inverse of a matrix ${\bf B} \succ 0$.

I'll post my answer to that question in case it helps others (and maybe the OP). Here, "best" means nearest in the least-squares (Frobenius) sense:

$$\textbf{d}^*(\textbf{B}) = \operatorname{argmin}_{\textbf{d}} \tfrac 1 2 \| \textbf{B} \operatorname{diag} (\textbf{d}) - \textbf{I} \|_F^2$$

This is separable in the $d_i$ and differentiable. Setting the gradient to zero yields the closed-form (and very cheap) solution

$$[\textbf{d}^*]_i = \frac {b_{ii} } { \| \textbf{b}_i \|^2}$$

where $\textbf{b}_i$ is the $i$-th column of $\textbf{B}$.

(Note that with complex entries you would need to conjugate.)
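The closed form is easy to check against a direct numerical minimization; a small sketch (NumPy/SciPy, illustrative matrix):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
B = A @ A.T + 4 * np.eye(4)          # illustrative positive definite B

# Closed form: d_i = b_ii / ||b_i||^2, with b_i the i-th column of B
d_closed = np.diag(B) / np.sum(B ** 2, axis=0)

# Direct minimization of (1/2) ||B diag(d) - I||_F^2
# (B * d broadcasts d over columns, i.e. equals B @ np.diag(d))
objective = lambda d: 0.5 * np.linalg.norm(B * d - np.eye(4), "fro") ** 2
d_numeric = minimize(objective, np.ones(4)).x
```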

I wouldn't be surprised if this has been known for 100 years, but I couldn't easily find it.