Do convex combinations of projection matrices majorize the probability vector, i.e. $\sum_k p_k P_k\succeq \boldsymbol p$?


Consider a convex combination of Hermitian projection matrices with positive coefficients: $$C\equiv \sum_k p_k P_k,$$ where $p_k>0$, $\sum_k p_k=1$, and $P_k=P_k^\dagger=P_k^2$.

If the $P_k$ are mutually orthogonal, i.e. $\newcommand{\tr}{{\operatorname{tr}}}\tr(P_j P_k)=\delta_{jk}$, then the (nonzero) eigenvalues of $C$ equal the coefficients $p_k$.
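This orthogonal case is easy to check numerically. The following is a small sketch (not part of the original post; it assumes NumPy, and the dimension and seed are arbitrary choices) confirming that the eigenvalues of $C$ built from an orthonormal basis are exactly the coefficients $p_k$:

```python
# Sketch: for mutually orthogonal rank-1 projections, the eigenvalues
# of C = sum_k p_k P_k are exactly the coefficients p_k.
import numpy as np

rng = np.random.default_rng(0)
n = 4  # arbitrary dimension for the illustration

# Random orthonormal basis via QR; its columns give mutually orthogonal
# rank-1 projectors P_k = |q_k><q_k|.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
projs = [np.outer(Q[:, k], Q[:, k].conj()) for k in range(n)]

p = rng.random(n)
p /= p.sum()                      # positive coefficients summing to 1
C = sum(pk * Pk for pk, Pk in zip(p, projs))

eigs = np.linalg.eigvalsh(C)      # real eigenvalues of the Hermitian matrix C
assert np.allclose(np.sort(eigs), np.sort(p))
```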

What can we say when the $P_k$ are not mutually orthogonal? Do the eigenvalues have to be different from $(p_k)_k$ in this case? Can we say something about how "ordered" they are, that is, whether $\sigma(C)\succeq\boldsymbol p$, where $\sigma(C)$ is the vector of eigenvalues of $C$, $\boldsymbol p$ is the vector of coefficients $p_k$, and $\preceq$ denotes the majorization preorder?

The notation and orthogonality conventions used above assume rank-$1$ projections, i.e. $\tr(P_k)=1$. However, if this assumption turns out not to be relevant, feel free to lift it. For projections of rank $r$, orthogonality would instead read $\tr(P_i P_j)=r \delta_{ij}$, and some degeneracies in the spectra would likely have to be dealt with.


As a concrete example, consider the $2\times2$ projections $P_0,P_+$ defined as $$\newcommand{\e}[1]{{\mathbf{e}_{#1}}} P_0\equiv \e0\e0^*=\begin{pmatrix}1 & 0 \\ 0 & 0\end{pmatrix}, \\ P_+\equiv \e+(\e+)^* \equiv \frac12(\e0+\e1)(\e0+\e1)^*=\frac12\begin{pmatrix}1 & 1 \\ 1 & 1\end{pmatrix}.$$ Then, for all $p\in[0,1]$, the eigenvalues of $C_p\equiv p P_0 + (1-p) P_+$ are $\lambda_\pm = \frac12\left(1\pm \sqrt{p^2+(1-p)^2}\right)$. Since $\sqrt{p^2+(1-p)^2}\ge|p-(1-p)|$, we get $\lambda_+\ge\max(p,1-p)$, which for two-component vectors with equal sums is precisely majorization: $\boldsymbol\lambda\equiv\sigma(C_p)\succeq\boldsymbol p\equiv (p,1-p)$. Plotting $\lambda_\pm$ as functions of $p$ makes this visually evident: the vector of eigenvalues of any convex combination of $P_0$ and $P_+$ majorizes the vector of coefficients of the combination.
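The closed-form eigenvalues and the majorization claim for this $2\times2$ example can also be checked on a grid of $p$ values. A minimal sketch (assuming NumPy; not part of the original post):

```python
# Sketch: verify the closed-form eigenvalues of C_p = p P0 + (1-p) P+
# and the majorization sigma(C_p) >= (p, 1-p) over a grid of p values.
import numpy as np

P0 = np.array([[1.0, 0.0], [0.0, 0.0]])
Pplus = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])

for p in np.linspace(0.0, 1.0, 101):
    C = p * P0 + (1 - p) * Pplus
    lam = np.linalg.eigvalsh(C)                    # eigenvalues, ascending
    s = np.sqrt(p**2 + (1 - p)**2)
    assert np.allclose(lam, [(1 - s) / 2, (1 + s) / 2])
    # For length-2 vectors with equal sums, majorization reduces to
    # lambda_max >= max(p, 1-p).
    assert lam[-1] >= max(p, 1 - p) - 1e-12
```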

Generating random convex combinations of projectors onto random vectors, I also always find the result to hold numerically: $\sigma(C)\succeq\boldsymbol p$.
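A sketch of such a random experiment (assuming NumPy; the dimensions, number of projectors, and seed are arbitrary choices), with zero-padding in the majorization check since the eigenvalue vector and the coefficient vector can have different lengths:

```python
# Sketch: random rank-1 projectors with random convex weights; check that
# the eigenvalue vector of C majorizes p (dominating partial sums of the
# decreasingly sorted entries, with equal totals).
import numpy as np

def majorizes(a, b, tol=1e-10):
    """True if a majorizes b; shorter vector is padded with zeros."""
    k = max(len(a), len(b))
    a = np.sort(np.pad(a, (0, k - len(a))))[::-1]
    b = np.sort(np.pad(b, (0, k - len(b))))[::-1]
    return (abs(a.sum() - b.sum()) < tol
            and np.all(np.cumsum(a) >= np.cumsum(b) - tol))

rng = np.random.default_rng(1)
for _ in range(200):
    n, m = 5, 7                        # dimension, number of projectors
    vecs = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
    vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
    projs = [np.outer(v, v.conj()) for v in vecs]   # random rank-1 projectors
    p = rng.random(m)
    p /= p.sum()
    C = sum(pk * Pk for pk, Pk in zip(p, projs))
    sigma = np.linalg.eigvalsh(C)
    assert majorizes(sigma, p)
```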

Does this result hold in the general case? If so, how do we prove it?

Accepted answer:

Without loss of generality, assume that $\{p_i\}$ is decreasing. For any $r$ between $1$ and $n$: since each $P_i$ is a rank-one (Hermitian) projection, $S_r=\sum_{i=1}^rp_iP_i$ is a positive semidefinite matrix of rank $\le r$, so at most $r$ of its eigenvalues are nonzero and the sum of its largest $r$ eigenvalues equals its trace. Moreover, $S_n-S_r=\sum_{i=r+1}^np_iP_i\succeq0$, so $\lambda_k^\downarrow(S_n)\ge\lambda_k^\downarrow(S_r)$ for every $k$ by Weyl's monotonicity principle. Therefore $$ \sum_{k=1}^r\lambda_k^\downarrow(S_n) \ge\sum_{k=1}^r\lambda_k^\downarrow(S_r) =\operatorname{tr}(S_r) =\sum_{k=1}^rp_k\tag{1} $$ and equality holds when $r=n$. Thus $\lambda^\downarrow$ majorises $p$ from above. From $(1)$ we also obtain $$ \sum_{k=1}^{n-r}\lambda_k^\uparrow(S_n) =\operatorname{tr}(S_n)-\sum_{k=1}^r\lambda_k^\downarrow(S_n) \le \operatorname{tr}(S_n)-\sum_{k=1}^rp_k =\sum_{k=r+1}^n p_k. $$ Therefore $p$ majorises $\lambda^\uparrow$ from below.
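The two facts the proof rests on, $\operatorname{rank}(S_r)\le r$ and $\sum_{k\le r}\lambda_k^\downarrow(S_n)\ge\operatorname{tr}(S_r)=\sum_{k\le r}p_k$, can be spot-checked numerically. A sketch (assuming NumPy; dimension and seed are arbitrary):

```python
# Sketch: verify rank(S_r) <= r and the partial-sum inequality (1)
# for a random instance with decreasingly sorted weights.
import numpy as np

rng = np.random.default_rng(2)
n = 6
vecs = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
projs = [np.outer(v, v.conj()) for v in vecs]   # rank-1 Hermitian projectors
p = rng.random(n)
p /= p.sum()
p = np.sort(p)[::-1]                            # decreasing weights (WLOG)

C = sum(pk * Pk for pk, Pk in zip(p, projs))    # C = S_n
eigs_Sn = np.sort(np.linalg.eigvalsh(C))[::-1]  # eigenvalues, decreasing

for r in range(1, n + 1):
    Sr = sum(p[i] * projs[i] for i in range(r))
    assert np.linalg.matrix_rank(Sr) <= r
    # tr(S_r) = sum of the first r weights, since each tr(P_i) = 1.
    assert abs(np.trace(Sr).real - p[:r].sum()) < 1e-10
    # Inequality (1): top-r eigenvalue sum of S_n dominates tr(S_r).
    assert eigs_Sn[:r].sum() >= np.trace(Sr).real - 1e-10
```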