Let $\Sigma$ be a covariance matrix of some distribution. Then $\Sigma^{-1}$ is the precision matrix.
Question: Does $\|\Sigma\|$ or $\|\Sigma^{-1}\|$ have any meaning (for any norm, though I ask in particular about the spectral norm)?
Let $x$ be a random vector with covariance matrix $\Sigma$, and let $e$ be a vector with $\|e\|_2 = 1$. The projection of $x$ onto $e$ is $(e^T x)e$, so $e^T x$ can be regarded as the coordinate of $x$ along the direction $e$, and $\operatorname{Var}(e^T x) = e^T \Sigma e$. Since $\Sigma$ is symmetric, the Rayleigh quotient $e^T \Sigma e$ is maximized over unit vectors at $e^*$, the eigenvector corresponding to the largest eigenvalue $\lambda^*$ of $\Sigma$; and since $\Sigma$ is positive semi-definite, its largest eigenvalue equals its largest singular value, so $\lambda^* = \|\Sigma\|_2$. Then $\operatorname{Var}(e^{*T}x) = e^{*T} \Sigma e^* = \lambda^* e^{*T} e^* = \lambda^* = \|\Sigma\|_2$.
That is, $e^*$ is the direction along which the projection of $x$ attains the maximum variance, and $\|\Sigma\|_2$ is that maximum variance. Principal component analysis (PCA) is built on this observation.
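As a quick numerical check of this identity, the sketch below (assuming NumPy; $\Sigma$ here is an arbitrary synthetic positive semi-definite matrix, not from the original post) verifies that the spectral norm of $\Sigma$ equals both its largest eigenvalue and the variance along the top eigenvector:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a synthetic symmetric PSD matrix to stand in for a covariance matrix
A = rng.standard_normal((3, 3))
Sigma = A @ A.T

# For a symmetric PSD matrix, the eigenvalues (ascending from eigh) are the
# singular values, so the largest one is the spectral norm.
eigvals, eigvecs = np.linalg.eigh(Sigma)
lam_max = eigvals[-1]
e_star = eigvecs[:, -1]                    # unit eigenvector for lam_max

spec_norm = np.linalg.norm(Sigma, 2)       # spectral norm ||Sigma||_2
var_along_e_star = e_star @ Sigma @ e_star  # Var(e*^T x) when Cov(x) = Sigma

print(np.isclose(spec_norm, lam_max))       # True
print(np.isclose(var_along_e_star, lam_max))  # True

# Any other unit direction yields no larger variance
e = rng.standard_normal(3)
e /= np.linalg.norm(e)
print(e @ Sigma @ e <= lam_max + 1e-12)    # True
```

This is the same computation PCA performs: `eigh` on the (sample) covariance matrix, with the top eigenvector giving the first principal direction.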