Consider two $N \times N$ matrices constructed as follows:
$$ A = \sum_{k=1}^K a_k x_k x_k',\quad B = \sum_{k=1}^K b_k x_k x_k' $$
where $x_k\in\mathbb{R}^{N}$ are real vectors and $a_k,b_k > 0$ are positive real numbers ($k=1,\dots,K$).
Let $0 \le \alpha_1 \le \dots \le \alpha_N$ and $0 \le \beta_1 \le \dots \le \beta_N$ be the sorted eigenvalues of $A$ and $B$, respectively.
I intuitively expect the following claim to be true:
Claim. If $a_i \le b_i$ for all $i=1,\dots,K$, then $\alpha_i \le \beta_i$ for all $i=1,\dots,N$.
I have run some numerical experiments with random matrices, and the claim held in every trial, but I do not see how to prove it.
This is not a textbook exercise, so I cannot even be sure it is true.
Can we prove the claim, or find a counterexample?
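For reference, the numerical experiment mentioned above can be sketched as follows (the sizes, ranges, and random seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 5, 8

for trial in range(100):
    X = rng.standard_normal((K, N))        # rows are the vectors x_k
    a = rng.uniform(0.1, 1.0, size=K)      # a_k > 0
    b = a + rng.uniform(0.0, 1.0, size=K)  # b_k >= a_k > 0
    A = sum(a[k] * np.outer(X[k], X[k]) for k in range(K))
    B = sum(b[k] * np.outer(X[k], X[k]) for k in range(K))
    alpha = np.linalg.eigvalsh(A)          # eigenvalues in ascending order
    beta = np.linalg.eigvalsh(B)
    assert np.all(alpha <= beta + 1e-10), "claim violated"
print("claim held in all trials")
```

`np.linalg.eigvalsh` returns eigenvalues in ascending order, matching the sorting convention $\alpha_1 \le \dots \le \alpha_N$ above.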
You can show it with the min-max (Courant–Fischer) principle: $$\alpha_j = \min_{\dim S = j}\,\max_{x \in S,\, x \ne 0}\frac{x^T A x}{x^T x},$$ with the analogous characterization for the $\beta_j$. The key observation is that $$B - A = \sum_{k=1}^K (b_k - a_k)\, x_k x_k'$$ is positive semidefinite, since $b_k - a_k \ge 0$; hence $x^T A x \le x^T B x$ for every $x$.

Now let $U$ be a $j$-dimensional subspace on which the minimum defining $\beta_j$ is attained, and let $u \in U$ be a unit vector which maximizes $\frac{x^T A x}{x^T x}$ over $U$. Then $$\alpha_j \le \max_{x \in U}\frac{x^T A x}{x^T x} = u^T A u \le u^T B u \le \max_{x \in U}\frac{x^T B x}{x^T x} = \beta_j,$$ where the middle step uses $u^T A u - u^T B u \le 0$, and the result follows.
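The middle step of the proof rests entirely on $B - A$ being positive semidefinite. A minimal numerical sanity check of that lemma (sizes and random draws are arbitrary, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 4, 6
X = rng.standard_normal((K, N))        # rows are the vectors x_k
a = rng.uniform(0.1, 1.0, size=K)      # 0 < a_k
b = a + rng.uniform(0.0, 1.0, size=K)  # a_k <= b_k

A = sum(a[k] * np.outer(X[k], X[k]) for k in range(K))
B = sum(b[k] * np.outer(X[k], X[k]) for k in range(K))

# B - A = sum_k (b_k - a_k) x_k x_k' is a sum of positive-semidefinite
# rank-one terms, so its smallest eigenvalue should be >= 0
# (up to floating-point error).
min_eig = np.linalg.eigvalsh(B - A).min()
print(min_eig >= -1e-10)  # prints True
```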