Fast way to invert B'CB + D


Let $\mathbf {A = B'CB}$, where $\mathbf A$ is of dimension $n \times n$, $\mathbf C$ is $m \times m$, positive definite and symmetric, $\mathbf B$ is of dimension $m \times n$, and $n \gg m$. Inverting $\mathbf A$ directly is computationally expensive. How can I take advantage of the fact that $\mathbf {A = B'CB}$ to compute $\mathbf A^{-1}$ more quickly?

Furthermore, I want to know whether there is a good way to invert $\mathbf V$, where $\mathbf {V = B'CB + D}$ and $\mathbf D$ is a diagonal matrix. The economic interpretation of this formula is: $\mathbf C$ and $\mathbf V$ are variance-covariance matrices, $\mathbf B$ is the matrix of $\beta$ loadings, and $\mathbf D$ holds the residual variances. As above, the size of $\mathbf V$ is much larger than the size of $\mathbf C$.


Updated on Jul-21-2014:
Thanks to @copper.hat I realized that $\mathbf {A = B'CB}$ is not invertible unless $\mathbf B$ has full column rank — and since $\mathbf B$ is $m \times n$ with $n \gg m$, its rank is at most $m < n$, so $\mathbf A$ is always singular here. I'll edit my question as below:

I want to know whether there is a good way to invert $\mathbf V$, where

  • $\mathbf {V = B'CB + D}$,
  • $\mathbf V$ is of dimension $n \times n$,
  • $\mathbf C$ is of dimension $m \times m$, positive definite and symmetric,
  • $\mathbf B$ is of dimension $m \times n$,
  • $\mathbf D$ is a diagonal matrix whose diagonal entries are positive, and
  • $n \gg m$.

Inverting $\mathbf V$ directly is computationally expensive. How can I take advantage of the fact that $\mathbf {V = B'CB + D}$ to compute $\mathbf V^{-1}$ more quickly?

The economic interpretation of this formula is: $\mathbf C$ and $\mathbf V$ are variance-covariance matrices. $\mathbf V$ is the covariance matrix of all the individual random variables, and $\mathbf C$ is the covariance matrix of some common factors. $\mathbf B$ is the matrix of $\beta$ loadings, and $\mathbf D$ holds the residual variances.


More updates:

Notice that when $m = 1$, we have $$ \mathbf {V^{-1} = D^{-1} - \left(\frac{1}{1+\kappa }\right) \hat B' C \hat B} $$ where $$ \kappa = \sum_{i=1}^N \frac{\sigma^2 \beta_i^2}{\theta_i^2 } $$ $$ \mathbf {B'} = \left(\beta_1, \beta_2, \ldots , \beta_N \right) $$ $$ \mathbf {\hat B'} = \left(\frac{\beta_1}{\theta_1^2 },\frac{\beta_2}{\theta_2^2 },\ldots,\frac{\beta_N}{\theta_N^2 }\right) $$

$$ \mathbf C = \left( \begin{matrix} \sigma^2 \end{matrix} \right) $$ $$ \mathbf D = \operatorname{diag} \left(\theta_1^2, \theta_2^2, \ldots , \theta_N^2 \right) $$
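The $m = 1$ identity above is exactly the Sherman-Morrison formula, and it can be checked numerically. A minimal sketch, assuming NumPy, with `beta`, `sigma2`, and `theta2` chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6
beta = rng.normal(size=N)           # entries of B' (the loadings)
sigma2 = 1.3                        # C = (sigma^2), a 1x1 matrix
theta2 = rng.uniform(0.5, 2.0, N)   # diagonal of D (residual variances)

# V = sigma^2 * beta beta' + D
V = sigma2 * np.outer(beta, beta) + np.diag(theta2)

kappa = np.sum(sigma2 * beta**2 / theta2)   # kappa from the formula
B_hat = beta / theta2                       # entries beta_i / theta_i^2

# V^{-1} = D^{-1} - (1/(1+kappa)) * hat{B}' C hat{B}
V_inv_formula = np.diag(1.0 / theta2) - (sigma2 / (1.0 + kappa)) * np.outer(B_hat, B_hat)

print(np.allclose(V_inv_formula, np.linalg.inv(V)))  # True
```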

Is there a way to extend the formula above to cases where $m>1$ ?


On BEST ANSWER

I solved a problem similar to the original one recently. Unfortunately I cannot give you a full solution, because I work for a commercial company and don't have the right to publish my results. However, I can give you a few hints. 1. Try to build a recursion $V_n^{-1}=f(V_{n-1}^{-1})$, where $V_n$ is the $n \times n$ matrix built from $V_{n-1}$ by adding one row and one column. 2. Use the Woodbury identity to build the recursion, namely

EDITED

$$ \left(D+B^TCB\right)^{-1} = D^{-1} - D^{-1}B^T \left(C^{-1}+BD^{-1}B^T \right)^{-1}B D^{-1}. $$ Playing with 1 and 2, I built an efficient way to do the inversion. More precisely, I can calculate $x^TV^{-1}x$ and $\det V$, which are the quantities needed for multivariate likelihoods. Since your model looks to me like a multi-factor risk model, maybe you don't need $V^{-1}$ itself but only these quantities.

Actually, even without any recursion, the Woodbury identity already gives you a way to invert your matrix much faster, since $D$ is diagonal and $C$ has much smaller dimension. The inverse simply takes the similar form $$ V^{-1}=D^{-1}-\tilde B^T \tilde C \tilde B $$
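Reading the Woodbury identity off term by term, one choice for the tilde quantities (my labeling, not stated in the answer) is $\tilde B = BD^{-1}$ and $\tilde C = \left(C^{-1}+BD^{-1}B^T\right)^{-1}$, so that only $m \times m$ matrices ever get inverted. A minimal sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 40
B = rng.normal(size=(m, n))
C = np.eye(m)                       # any SPD m x m factor covariance works
d = rng.uniform(0.5, 2.0, n)        # diagonal of D

tilde_B = B / d                     # B D^{-1}, still m x n
S = np.linalg.inv(C) + tilde_B @ B.T
tilde_C = np.linalg.inv(S)          # only m x m inverses are needed

# V^{-1} = D^{-1} - tilde_B' tilde_C tilde_B
V_inv = np.diag(1.0 / d) - tilde_B.T @ tilde_C @ tilde_B

# Verify against the definition V = B' C B + D
V = B.T @ C @ B + np.diag(d)
print(np.allclose(V_inv @ V, np.eye(n)))  # True
```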