Are there fast methods to factor a covariance matrix?


I have a covariance matrix $\hat C$ (square, symmetric, positive semidefinite). I would like to approximate it by the following expression:

$ \hat C \approx \operatorname{diag}(\vec \gamma^2) + \hat \beta \cdot \hat \beta^T $

In the above expression $\operatorname{diag}(\vec \gamma^2)$ is the diagonal matrix whose diagonal entries are the elementwise squares of the column vector $\vec \gamma$, and $\hat \beta$ is an $n \times k$ matrix with $k < n$.


In other words, every row $i$ (and column $i$) is represented by a $k$-dimensional vector $\vec \beta_i$, the $i$-th row of $\hat \beta$. Every off-diagonal element of the matrix $\hat C$ is approximated by the scalar product of the two corresponding vectors:

$ C_{ij} = (\vec \beta_i, \vec \beta_j), \qquad i \neq j $

The diagonal elements of the covariance matrix are given by the following expression:

$ C_{ii} = (\vec \beta_i, \vec \beta_i) + \gamma_i^2 $
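As a concrete check of this parametrization, one can build such a matrix from an arbitrary $\hat \beta$ and $\vec \gamma$ and verify both element-wise formulas (a small sketch; the variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 5, 2
beta = rng.normal(size=(n, k))        # n-by-k matrix; row i is beta_i
gamma = rng.uniform(1.0, 2.0, size=n)  # per-row diagonal contribution

# C = diag(gamma^2) + beta @ beta.T
C = np.diag(gamma**2) + beta @ beta.T

# Off-diagonal: C_ij = <beta_i, beta_j>
assert np.isclose(C[0, 1], beta[0] @ beta[1])
# Diagonal: C_ii = <beta_i, beta_i> + gamma_i^2
assert np.isclose(C[2, 2], beta[2] @ beta[2] + gamma[2]**2)
```

Any matrix of this form is symmetric and, for nonzero $\vec \gamma$, positive definite.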


So, my question is: is there a fast analytical method to calculate $\vec \gamma$ and $\hat \beta$? Or, alternatively, an iterative method to calculate them?

I should probably add that I want to minimize the sum of squared deviations (the Frobenius norm of the difference) between the original covariance matrix and the matrix that approximates it.
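For what it's worth, this diagonal-plus-low-rank decomposition is exactly the factor-analysis model, and one simple iterative scheme is the "iterated principal factors" idea: alternately eigendecompose $\hat C - \operatorname{diag}(\vec \gamma^2)$, keep the top $k$ eigenpairs to form $\hat \beta$, then refresh the diagonal from the residual. A sketch under those assumptions (the function name and iteration count are my own choices, not a standard API):

```python
import numpy as np

def fit_diag_plus_lowrank(C, k, n_iter=200):
    """Approximate C ~ diag(g**2) + B @ B.T by iterated principal factors."""
    n = C.shape[0]
    g2 = np.zeros(n)                       # squared gamma entries, start at 0
    for _ in range(n_iter):
        # Eigendecompose the reduced matrix C - diag(g2);
        # eigh returns eigenvalues in ascending order.
        w, V = np.linalg.eigh(C - np.diag(g2))
        lam = np.clip(w[-k:], 0.0, None)   # top-k eigenvalues, floored at 0
        B = V[:, -k:] * np.sqrt(lam)       # n-by-k loading matrix
        # Refresh the diagonal so C_ii is matched as well as possible
        g2 = np.clip(np.diag(C) - np.sum(B**2, axis=1), 0.0, None)
    return np.sqrt(g2), B

# Usage: recover a matrix that has exactly this structure
rng = np.random.default_rng(0)
n, k = 8, 2
B_true = rng.normal(size=(n, k))
g_true = rng.uniform(0.5, 1.5, size=n)
C = np.diag(g_true**2) + B_true @ B_true.T

g, B = fit_diag_plus_lowrank(C, k)
C_hat = np.diag(g**2) + B @ B.T
rel_err = np.linalg.norm(C - C_hat) / np.linalg.norm(C)
```

Each pass costs one $n \times n$ eigendecomposition, so it is cheap for moderate $n$; note that $\hat \beta$ is only determined up to a $k \times k$ rotation, since $\hat \beta R (\hat \beta R)^T = \hat \beta \hat \beta^T$ for any orthogonal $R$.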