How does "per-coordinate normalization" affect covariance matrix norm?


Suppose we normalize our data so that every per-coordinate variance equals one. Can we bound the effect this has on the spectral radius of the covariance matrix?

More precisely, suppose our data points are stacked as rows in an $n\times d$ data matrix $X$, with $X^TX$ having full rank, trace $d$, and spectral radius $\sigma$. Compute the normalized spectral radius $\sigma_n$ as follows:

$$ \begin{array}{lll} L&=&\text{Diag}(X^TX)\\ \sigma_n&=&\lambda_{\max}(L^{-1/2}X^TXL^{-1/2}) \end{array} $$

Can we put bounds on $\sigma/\sigma_n$?

In numerical simulations, the normalized spectral radius stays close to the original one. For instance, taking random $500 \times 500$ covariance matrices with eigenvalue decay $O(1/k)$, the normalization decreases the spectral radius by 15-25%.
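A minimal sketch of one way to reproduce this kind of simulation (the particular random-orthogonal construction and the seed are my choices, not taken from the original experiment):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 500

# Random covariance with eigenvalue decay O(1/k):
# C = Q diag(1, 1/2, ..., 1/d) Q^T for a random orthogonal Q,
# rescaled so that trace(C) = d, matching the setup above.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
C = Q @ np.diag(1.0 / np.arange(1, d + 1)) @ Q.T
C *= d / np.trace(C)

# Per-coordinate normalization: L = Diag(C), C_n = L^{-1/2} C L^{-1/2}.
inv_sqrt = 1.0 / np.sqrt(np.diag(C))
C_n = C * np.outer(inv_sqrt, inv_sqrt)

# Both matrices are PSD, so the spectral radius is the largest eigenvalue.
sigma = np.linalg.eigvalsh(C)[-1]
sigma_n = np.linalg.eigvalsh(C_n)[-1]
print("sigma =", sigma, "sigma_n =", sigma_n, "ratio =", sigma / sigma_n)
```

Note that after normalization $C_n$ is a correlation matrix (unit diagonal, trace exactly $d$), so $\sigma_n \le d$ automatically.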




It is hard to bound this without extra assumptions. In general, diagonal preconditioning does not necessarily improve the conditioning. Consider, for example, $H = \begin{bmatrix}0.17 & -0.49 & -0.19 & -0.36 \\ -0.49 & 2.34 & 0.71 & 1.79 \\ -0.19 & 0.71 & 0.32 & 0.53 \\ -0.36 & 1.79 & 0.53 & 1.44 \end{bmatrix}$.
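The counterexample is easy to check numerically; a short sketch (the variable names are mine) applies the same diagonal normalization from the question to $H$ and compares condition numbers:

```python
import numpy as np

H = np.array([[ 0.17, -0.49, -0.19, -0.36],
              [-0.49,  2.34,  0.71,  1.79],
              [-0.19,  0.71,  0.32,  0.53],
              [-0.36,  1.79,  0.53,  1.44]])

# Diagonal normalization as in the question: H_n = L^{-1/2} H L^{-1/2},
# where L = Diag(H).
inv_sqrt = 1.0 / np.sqrt(np.diag(H))
H_n = H * np.outer(inv_sqrt, inv_sqrt)

# Compare conditioning before and after normalization.
print("cond(H)   =", np.linalg.cond(H))
print("cond(H_n) =", np.linalg.cond(H_n))
```

$H$ is positive definite (all pivots in a Cholesky-style elimination are positive), so both condition numbers are finite and can be compared directly.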