Prove or disprove: Maximum trace of correlation matrix transform when random variables are reordered


Consider an ordering $i=(1,2,3,\dots,n)$, $n \in \mathbb{N}$, of linearly independent random variables $X_1,X_2,X_3,\dots,X_n$ with resulting positive definite correlation matrix, written in terms of its column vectors as $C_i = \left(c_i^{(1)},c_i^{(2)},c_i^{(3)},\dots,c_i^{(n)}\right)$. Further, let $s_i=\left(\mathbb{1}^{T} c_i^{(1)},\,\mathbb{1}^{T} c_i^{(2)},\,\mathbb{1}^{T} c_i^{(3)},\dots,\mathbb{1}^{T} c_i^{(n)}\right)$ denote the associated sequence of column sums, i.e. the sum of the entries of each column vector. The $n!$ possible orderings $i$ of the random variables $X_1,X_2,X_3,\dots,X_n$, each of which permutes the rows and columns of the correlation matrix simultaneously, generate a set $\Omega = \{C_i \mid i=1,2,3,\dots,n!\}$ of positive definite correlation matrices.
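To make the setup concrete, here is a minimal numerical sketch. The $3\times 3$ correlation matrix below is a hypothetical example (not part of the question); each ordering corresponds to a simultaneous row and column permutation of $C$, and $s_i$ collects the column sums of the permuted matrix:

```python
import numpy as np
from itertools import permutations

# Hypothetical 3x3 positive definite correlation matrix,
# chosen only to illustrate the definitions.
C = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])

def column_sums(C, perm):
    """Sequence s_i: the sum 1^T c^(k) over each column of the
    correlation matrix after reordering the variables by `perm`."""
    P = C[np.ix_(perm, perm)]  # simultaneous row/column permutation
    return P.sum(axis=0)

for perm in permutations(range(C.shape[0])):
    print(perm, column_sums(C, perm))
```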

Each $C_i\in \Omega$ can be represented by a factorisation $C_i= \sqrt{F_i}^{\,T}\sqrt{F_i}$, where the square root is taken element-wise (Hadamard root) and $F_i$ is an upper triangular matrix with non-negative entries at most $1$, positive diagonal entries, and every column sum equal to $1$. (Indeed, if $R_i$ is the Cholesky factor of $C_i$, so that $C_i = R_i^{T} R_i$ with $R_i$ upper triangular, then $F_i$ is the entrywise square of $R_i$; the unit diagonal of $C_i$ forces each column of $F_i$ to sum to $1$, and $\sqrt{F_i}$ recovers $R_i$ whenever $R_i$ has non-negative entries.)
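A short sketch of this construction via the Cholesky factor, again using the hypothetical matrix from above (the sign caveat on the Hadamard-root identity is noted in the comments):

```python
import numpy as np

def F_factor(C):
    """Upper triangular F with column sums 1, obtained as the entrywise
    square of the Cholesky factor R of C (C = R^T R, R upper triangular)."""
    R = np.linalg.cholesky(C).T  # numpy returns the lower factor; transpose it
    return R * R

C = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])
F = F_factor(C)

print(np.allclose(F.sum(axis=0), 1.0))   # every column of F sums to 1
# The Hadamard-root identity sqrt(F)^T sqrt(F) = C holds when R is
# entrywise non-negative, as it is for this particular example:
print(np.allclose(np.sqrt(F).T @ np.sqrt(F), C))
```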

Can anyone prove or disprove the following proposition?

The $\operatorname{trace}\left(F_i\right)$ achieves its maximum on the set $\Omega$ exactly for an ordering $i$ of the random variables whose sequence of column sums $s_i$ is monotonically increasing.
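One way to probe the proposition numerically is a brute-force sweep over all orderings. The sketch below reuses the same hypothetical $3\times 3$ matrix as above and flags the orderings attaining the maximal $\operatorname{trace}(F_i)$; it is an illustration for small $n$, not a proof:

```python
import numpy as np
from itertools import permutations

def trace_F(P):
    """trace(F) for a correlation matrix P: the sum of the squared
    diagonal entries of its Cholesky factor."""
    R = np.linalg.cholesky(P).T
    return float(np.sum(np.diag(R) ** 2))

C = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])

rows = []
for perm in permutations(range(C.shape[0])):
    P = C[np.ix_(perm, perm)]       # correlation matrix for this ordering
    s = P.sum(axis=0)               # column-sum sequence s_i
    rows.append((perm, trace_F(P), bool(np.all(np.diff(s) >= 0))))

best = max(t for _, t, _ in rows)
for perm, t, increasing in rows:
    marker = "<-- max" if np.isclose(t, best) else ""
    print(perm, f"trace(F) = {t:.6f}", f"s increasing: {increasing}", marker)
```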