Suppose I have a measure $\alpha$ defined by $$ \alpha(x) = \sum_{i : \lambda_i < x} w_i $$ for nodes $\lambda_i$ and positive weights $w_i$, $i=1, \ldots, n$ (if we need $\alpha(x)$ to be differentiable that's okay too).
Suppose that $\mu$ is a Gaussian quadrature rule for $\alpha$ supported on $k$ points $\theta_i$ with weights $d_i$: $$ \mu(x) = \sum_{i : \theta_i < x} d_i $$
That is, for any polynomial $p$ of degree at most $2k-1$, $\int p \, d\alpha = \int p \, d\mu$. Equivalently, the $\theta_i$ are the eigenvalues of the Jacobi matrix associated with the first $k$ orthogonal polynomials of $\alpha$, and the $d_i$ are the squares of the first components of the corresponding eigenvectors (scaled by the total mass of $\alpha$).
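To make the setup concrete, here is a small sketch (all names are mine) that builds the $k$-point rule for a random discrete measure by running Lanczos on $A = \operatorname{diag}(\lambda)$ with starting vector $\sqrt{w}/\|\sqrt{w}\|$, and checks the moment-matching property:

```python
import numpy as np

def gauss_rule(lam, w, k):
    """k-point Gauss rule for alpha = sum_i w_i * delta_{lam_i}.

    Lanczos on A = diag(lam) with start vector sqrt(w)/||sqrt(w)|| yields
    the k x k Jacobi matrix T of alpha; its eigenvalues are the nodes
    theta_i, and d_i = (sum_j w_j) * (first eigenvector component)^2.
    """
    n = len(lam)
    Q = np.zeros((n, k))
    a = np.zeros(k)        # diagonal of the Jacobi matrix
    b = np.zeros(k - 1)    # off-diagonal of the Jacobi matrix
    Q[:, 0] = np.sqrt(w) / np.sqrt(w.sum())
    for j in range(k):
        v = lam * Q[:, j]                          # apply A = diag(lam)
        a[j] = Q[:, j] @ v
        v -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ v)   # full reorthogonalization
        if j < k - 1:
            b[j] = np.linalg.norm(v)
            Q[:, j + 1] = v / b[j]
    T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
    theta, U = np.linalg.eigh(T)
    return theta, w.sum() * U[0, :] ** 2

rng = np.random.default_rng(0)
lam = np.sort(rng.uniform(-1.0, 1.0, 20))
w = rng.uniform(0.1, 1.0, 20)
theta, d = gauss_rule(lam, w, 5)

# Exactness: moments of degree <= 2k-1 agree between alpha and mu.
for p in range(2 * 5):
    assert abs((w * lam**p).sum() - (d * theta**p).sum()) < 1e-8
```

(Here the Lanczos recurrence with full reorthogonalization is just one way to compute the Jacobi matrix; the Stieltjes procedure on the moments gives the same $T$.)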
Is it true that $\mu(\theta_i^-) \leq \alpha(\theta_i) \leq \mu(\theta_i^+)$? Here $\mu(\theta_i^-)$ and $\mu(\theta_i^+)$ denote the one-sided limits of $\mu$ from the left and the right at $\theta_i$.
I imagine this is somehow related to an interlacing property for tridiagonal matrices, but the question is not just about the nodes.
For reference, here is an image which demonstrates the behavior I'm asking about. $\alpha(x)$ is the black curve and $\mu(x)$ is the blue curve, and the question is whether the curves intersect at every vertical jump of the blue curve.
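The picture can also be checked numerically. The sketch below (with a hypothetical helper `gauss_rule`, again computed via Lanczos on $\operatorname{diag}(\lambda)$) tests the conjectured bracketing $\mu(\theta_i^-) \leq \alpha(\theta_i) \leq \mu(\theta_i^+)$ on a random discrete measure:

```python
import numpy as np

def gauss_rule(lam, w, k):
    # k-point Gauss rule for alpha = sum_i w_i * delta_{lam_i}, via
    # Lanczos on diag(lam) with start vector sqrt(w)/||sqrt(w)||.
    Q = np.zeros((len(lam), k))
    a = np.zeros(k)
    b = np.zeros(k - 1)
    Q[:, 0] = np.sqrt(w) / np.sqrt(w.sum())
    for j in range(k):
        v = lam * Q[:, j]
        a[j] = Q[:, j] @ v
        v -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ v)   # full reorthogonalization
        if j < k - 1:
            b[j] = np.linalg.norm(v)
            Q[:, j + 1] = v / b[j]
    theta, U = np.linalg.eigh(np.diag(a) + np.diag(b, 1) + np.diag(b, -1))
    return theta, w.sum() * U[0, :] ** 2

rng = np.random.default_rng(1)
lam = np.sort(rng.uniform(-1.0, 1.0, 30))
w = rng.uniform(0.1, 1.0, 30)
theta, d = gauss_rule(lam, w, 7)          # theta comes back sorted

# CDF of mu just left and just right of each node theta_i:
mu_minus = np.concatenate(([0.0], np.cumsum(d)[:-1]))   # mu(theta_i^-)
mu_plus = np.cumsum(d)                                  # mu(theta_i^+)
# CDF of alpha at each node:
alpha_at = np.array([w[lam < t].sum() for t in theta])

assert np.all(mu_minus <= alpha_at + 1e-10)
assert np.all(alpha_at <= mu_plus + 1e-10)
```

Of course a passing check on random instances is only evidence, not a proof.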
EDIT: I have found a theorem in Fischer's book "Polynomial Based Iteration Methods for Symmetric Linear Systems", attributed to Karlin and Shapley but apparently previously given by Stieltjes:
This implies the observation. But I'm still interested in any answers which offer additional insight or a different proof.

