Bounding the smallest eigenvalue of a matrix from below


Let $\lambda_i \in (-1,1)$ for $i = 1,\dots,n$ and $\omega \in (-\pi,\pi]$, and consider the matrix $X = (X_{uv})$ with entries $$ X_{uv} = \frac{1}{1-\lambda_u e^{-i\omega} - \lambda_v e^{i\omega} + \lambda_u \lambda_v}, $$ where $i = \sqrt{-1}$. Note that $X$ is Hermitian, hence its eigenvalues are all real.
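For concreteness, here is a small NumPy sketch that builds $X$ for sample parameters and checks the Hermitian claim numerically (the helper name `build_X` and the particular $\lambda_u$, $\omega$ values are my own choices, not from the question):

```python
import numpy as np

def build_X(lam, omega):
    """X_{uv} = 1 / (1 - lam_u e^{-i w} - lam_v e^{i w} + lam_u lam_v)."""
    lam = np.asarray(lam, dtype=float)
    lu = lam[:, None]   # lambda_u down the rows
    lv = lam[None, :]   # lambda_v across the columns
    return 1.0 / (1.0 - lu * np.exp(-1j * omega) - lv * np.exp(1j * omega) + lu * lv)

rng = np.random.default_rng(0)
lam = rng.uniform(-0.9, 0.9, size=6)   # lambdas bounded away from +-1
omega = 0.7
X = build_X(lam, omega)
print(np.allclose(X, X.conj().T))      # Hermitian check; expect True
```

Hermiticity follows because conjugating the denominator of $X_{uv}$ swaps $e^{i\omega}$ and $e^{-i\omega}$, which is the same as swapping $u$ and $v$.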

Can we give a lower bound on the smallest eigenvalue of $X$ showing that it stays bounded away from zero as $n \to \infty$, provided the $\{\lambda_u\}$ stay bounded away from $\pm 1$, say $\lambda_u \in (-1+\epsilon,1-\epsilon)$ for some $\epsilon > 0$, except perhaps for a set of $\omega$ of measure zero?

EDIT: It seems that minimizing the Rayleigh quotient $z^* X z$ over unit-$\ell_2$-norm vectors $z$ is a possible approach, since by the Courant–Fischer theorem this minimum equals the smallest eigenvalue.
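A numerical sketch of that approach: by Courant–Fischer, $\min_{\|z\|_2 = 1} z^* X z$ is exactly the smallest eigenvalue, so any unit vector's Rayleigh quotient upper-bounds it. The helper `build_X` and the sample parameters below are my own illustrative choices:

```python
import numpy as np

def build_X(lam, omega):
    """X_{uv} = 1 / (1 - lam_u e^{-i w} - lam_v e^{i w} + lam_u lam_v)."""
    lam = np.asarray(lam, dtype=float)
    lu, lv = lam[:, None], lam[None, :]
    return 1.0 / (1.0 - lu * np.exp(-1j * omega) - lv * np.exp(1j * omega) + lu * lv)

rng = np.random.default_rng(1)
n = 8
lam = rng.uniform(-0.9, 0.9, size=n)
omega = 1.3
X = build_X(lam, omega)

# Smallest eigenvalue = min over unit z of z^* X z (Courant-Fischer).
lam_min = np.linalg.eigvalsh(X)[0]

# Any random unit vector's Rayleigh quotient upper-bounds lam_min.
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z /= np.linalg.norm(z)
rq = np.real(z.conj() @ X @ z)
print(lam_min <= rq + 1e-12)   # expect True
```

Experimenting with such samples (varying $n$, the $\lambda_u$, and $\omega$) may suggest whether the smallest eigenvalue can in fact stay bounded away from zero.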