Consider a finite-dimensional complex matrix (Hermitian, for simplicity) whose spectrum has a minimum $M_0$ and a maximum $M_1$.
By the variational theorem (this is what it is called in physics; I'm not sure what its name is in mathematics, but I believe it's related to the min-max theorem), any vector whose expectation value (Rayleigh–Ritz quotient) is $M_0$ ($M_1$) is also an eigenvector with corresponding eigenvalue $M_0$ ($M_1$). This should also hold when $M_0$ ($M_1$) is degenerate, i.e. when its corresponding eigenspace has dimension greater than one. More explicitly, the expectation value of a vector $v$ with respect to a matrix $A$ is:
$$ \frac{(v,Av)}{(v,v)} $$
where $(\cdot,\cdot)$ is the inner product and $Av$ is the action of $A$ on $v$, viewed either as an operator acting on an abstract vector space or as the row-by-column multiplication of the matrix $A$ with the column vector $v$.
The variational theorem then states:
$$ \frac{(v,Av)}{(v,v)} = M_0 \iff Av = M_0v$$ $$\frac{(v,Av)}{(v,v)} = M_1 \iff Av = M_1v $$
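As a quick numerical sanity check (a NumPy sketch; the particular Hermitian matrix below is an arbitrary example of my own, not anything canonical), one can verify both directions for a small matrix:

```python
import numpy as np

# An arbitrary example of a Hermitian matrix.
A = np.array([[2.0, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0]])

def rayleigh(A, v):
    """Expectation value (Rayleigh quotient) of v with respect to A."""
    return (np.vdot(v, A @ v) / np.vdot(v, v)).real

# eigh returns the eigenvalues of a Hermitian matrix in ascending order.
eigvals, eigvecs = np.linalg.eigh(A)
M0, M1 = eigvals[0], eigvals[-1]

v0 = eigvecs[:, 0]   # eigenvector for the minimum eigenvalue
v1 = eigvecs[:, -1]  # eigenvector for the maximum eigenvalue

# The extremal expectation values are attained exactly at these eigenvectors:
assert np.isclose(rayleigh(A, v0), M0)
assert np.isclose(rayleigh(A, v1), M1)
assert np.allclose(A @ v0, M0 * v0)
assert np.allclose(A @ v1, M1 * v1)
```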
Is it then true that a state whose expectation value is $M_0$ ($M_1$) can only be written as a linear combination of eigenvectors with eigenvalue $M_0$ ($M_1$)?
To be precise, given a finite set of vectors $w_k$ whose linear combination is $v$ (with expectation value $M_0$, thus an eigenvector by the variational theorem), then all $w_k$ are necessarily eigenvectors with eigenvalue $M_0$: $$ \frac{(v,Av)}{(v,v)} = M_0 \iff v = \sum_k c_k w_k \ \text{ with } \ Aw_k = M_0 w_k $$
I would assume this to be true, since any contribution with eigenvalue larger than $M_0$ (smaller than $M_1$) would result in an increase (decrease) of the expectation value above $M_0$ (below $M_1$), but I don't know how to prove it.
Am I missing something? Can anyone give me a counterexample? Have I made an additional assumption that I'm forgetting, one that is absent from the "physics" statement of the variational theorem?
So first let's clear up the precise statement of the variational theorem; the Hermitian assumption is necessary, not just "for simplicity," and without it the theorem is false. If $A$ is an arbitrary matrix, then the real part of the Rayleigh quotient $R_A(x) = \frac{\langle Ax, x \rangle}{\langle x, x \rangle}$ depends only on the Hermitian part of $A$; we can write $A$ uniquely as the sum of a Hermitian and a skew-Hermitian matrix, namely
$$A = \frac{A + A^{\dagger}}{2} + \frac{A - A^{\dagger}}{2}$$
and since $\langle Sx, x \rangle$ is purely imaginary for skew-Hermitian $S$, we have $\operatorname{Re} \langle Ax, x \rangle = \langle \frac{A + A^{\dagger}}{2} x, x \rangle$. So in this case the minimum and maximum of (the real part of) $R_A$ are controlled by the eigenvalues of $\frac{A + A^{\dagger}}{2}$, not of $A$. (I misspoke in the comments about singular values; those control the minimum and maximum of a different function, namely $\frac{\langle Ax, Ax \rangle}{\langle x, x \rangle}$.)
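A small numerical illustration of the decomposition (a NumPy sketch; the random non-Hermitian matrix and vector are arbitrary choices for demonstration): the Hermitian part contributes the real part of $\langle Ax, x \rangle$, while the skew-Hermitian part contributes a purely imaginary term.

```python
import numpy as np

rng = np.random.default_rng(0)
# An arbitrary (non-Hermitian) complex matrix and test vector.
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
x = rng.normal(size=3) + 1j * rng.normal(size=3)

H = (A + A.conj().T) / 2   # Hermitian part
S = (A - A.conj().T) / 2   # skew-Hermitian part

qA = np.vdot(x, A @ x)     # x^dagger A x
qH = np.vdot(x, H @ x)
qS = np.vdot(x, S @ x)

assert np.isclose(qH.imag, 0.0)      # Hermitian part: real quadratic form
assert np.isclose(qS.real, 0.0)      # skew-Hermitian part: purely imaginary
assert np.isclose(qA.real, qH.real)  # Re <Ax, x> sees only the Hermitian part
```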
Anyway, the answer to your question is no, the $w_k$ can be arbitrary and in particular don't have to be eigenvectors of $A$ at all. This is not in contradiction with the fact that their Rayleigh quotients $R_A(w_k)$ must be between $M_0$ and $M_1$ because, as you say, if you expand out $\langle Av, v \rangle$ in terms of the $w_k$ there are cross terms $\langle Aw_i, w_j \rangle$ contributing to the sum. The cross terms do disappear if the $w_k$ are eigenvectors of $A$, which is the question I thought you were originally asking; in that case the $w_k$ would have to be eigenvectors with eigenvalue $M_0$ resp. $M_1$.