I am new to the field of eigenvalue optimization. Say I have a symmetric matrix $A(x)\in\mathbb{R}^{n\times n}$ that depends on $x\in\mathbb{R}^n$. I want to find points $x$ at which exactly one eigenvalue of $A(x)$ is negative.
The background is that I'd like to find first-order saddle points of a function $f(x)$, and $A(x)$ is the corresponding Hessian matrix. I know about methods from computational chemistry such as minimum mode following, but I am wondering whether I could instead optimize the eigenvalues with respect to $x$.
I have read about the problem of minimizing the maximum eigenvalue. My idea so far is to somehow minimize \begin{align} \frac{\lambda_1(A(x))}{\lambda_2(A(x))} \end{align} with respect to $x$, where $\lambda_1$ and $\lambda_2$ are the lowest and second-lowest eigenvalues of $A$, respectively. At a minimizer one would expect $\lambda_1<0$ and $\lambda_2>0$, i.e. exactly one negative eigenvalue. But I expect the denominator crossing zero to cause serious problems.
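To make the idea concrete, here is a minimal toy sketch (my own construction, not from any reference) of minimizing $\lambda_1/\lambda_2$ for the Hessian of $f(x,y)=(x^2-1)^2+y^2$, which has an index-1 saddle at the origin. In this particular example $\lambda_2$ stays positive everywhere, so the problematic zero crossing of the denominator never actually occurs:

```python
import numpy as np
from scipy.optimize import minimize

def hessian(p):
    # Analytic Hessian of f(x, y) = (x^2 - 1)^2 + y^2
    x, y = p
    return np.array([[12.0 * x**2 - 4.0, 0.0],
                     [0.0,               2.0]])

def ratio(p):
    # lambda_1 / lambda_2: lowest over second-lowest eigenvalue
    # (eigvalsh returns eigenvalues in ascending order)
    w = np.linalg.eigvalsh(hessian(p))
    return w[0] / w[1]

# Derivative-free method, since sorted eigenvalues are only piecewise smooth
res = minimize(ratio, x0=[0.4, 0.3], method="Nelder-Mead")
eigs = np.linalg.eigvalsh(hessian(res.x))
print(res.x, eigs)
```

The minimizer satisfies $x\approx 0$ with exactly one negative Hessian eigenvalue, as desired; note that the objective is independent of $y$ here, so the saddle is only located in the $x$ coordinate, and in general a nonsmooth eigenvalue ratio would need more care than a plain simplex method.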
So my question: do such optimization approaches exist in the literature, or do you have ideas how to approach my problem?