Relation between the weights of x vectors and corresponding eigenvalue of the covariance matrix of vectors?


Let ${{x}_{i}},\:i=1,\ldots ,N$ be vectors of dimension $K$. The covariance matrix of the weighted samples is defined as

$M=\frac{1}{N}\sum\limits_{i=1}^{N}{{{\alpha }_{i}}{{x}_{i}}x_{i}^{H}}$

where the ${{\alpha }_{i}}$ are positive constants. Since $M$ is Hermitian, it admits the eigenvalue decomposition $M=U\Lambda {{U}^{H}}$, where $U$ is the orthonormal eigenvector matrix of $M$, and $\Lambda$ is a diagonal matrix whose diagonal entries are the eigenvalues of $M$ arranged in ascending order, i.e. ${{\lambda }_{1}}\le {{\lambda }_{2}}\le \cdots \le {{\lambda }_{K}}$.

Is there any mathematical relationship between the weights ${{\alpha }_{i}}$ and the eigenvalues ${{\lambda }_{i}}$ of $M$?
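One concrete relation that always holds is the trace identity: the sum of the eigenvalues equals $\operatorname{tr}(M)=\frac{1}{N}\sum_{i=1}^{N}\alpha_i \|x_i\|^2$, so scaling any $\alpha_i$ shifts the eigenvalue sum linearly. The sketch below checks this numerically with NumPy; the sizes `N`, `K` and the random data are illustrative assumptions, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 6, 3  # hypothetical sizes, chosen only for illustration
alpha = rng.uniform(0.5, 2.0, N)  # positive weights alpha_i
X = rng.normal(size=(N, K)) + 1j * rng.normal(size=(N, K))

# Weighted sample covariance: M = (1/N) * sum_i alpha_i * x_i x_i^H
M = sum(a * np.outer(x, x.conj()) for a, x in zip(alpha, X)) / N

# M is Hermitian positive semidefinite, so its eigenvalues are
# real and nonnegative; eigvalsh returns them in ascending order.
eigvals = np.linalg.eigvalsh(M)

# Trace identity linking the weights to the eigenvalues:
# sum_k lambda_k = tr(M) = (1/N) * sum_i alpha_i * ||x_i||^2
lhs = eigvals.sum()
rhs = sum(a * np.linalg.norm(x) ** 2 for a, x in zip(alpha, X)) / N
print(np.isclose(lhs, rhs))  # True
```

Beyond the trace, only inequalities are available in general (e.g. each $\lambda_k$ is bounded by $\frac{1}{N}\sum_i \alpha_i \|x_i\|^2$), because the eigenvalues depend jointly on the weights and on the geometry of the $x_i$, not on the weights alone.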