I have to find the smallest $\sigma$ that satisfies the following expression:
$\lambda_{m}(A + \sigma I) > \delta, \quad \delta > 0$
where $A$ and $I$ (identity) are matrices and $\lambda_{m}(\cdot)$ is an operator that returns the smallest eigenvalue of its argument.
So the question is: what is the smallest $\sigma$ such that $\lambda_{m}(A + \sigma I) > \delta$?
I have no idea where to begin.
I'm sorry for the lack of rigor in posting the question.
The problem I was trying to state comes from mathematical optimization, more specifically the multidimensional modified Newton algorithm.
In this algorithm, one must fix a $\delta > 0$ and find the smallest $\sigma \geq 0$ such that $$\lambda_{m}(\nabla^{2}f(x) + \sigma I) > \delta$$ where $\lambda_{m}(\cdot)$ is an operator that returns the smallest eigenvalue of its argument, $x$ is a vector and $f$ is the function being minimized.
This shift makes the (possibly indefinite) Hessian positive definite, which guarantees that the resulting Newton step is a viable descent direction toward the function's minimum.
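Before spotting the closed form, one brute-force way to search for $\sigma$ is simply to increase it until the eigenvalue condition holds. A minimal NumPy sketch (the matrix and step size here are made-up examples, not part of the original problem):

```python
import numpy as np

def find_sigma(hess, delta, step=0.5, max_iter=1000):
    """Naively increase sigma until lambda_min(hess + sigma*I) > delta."""
    n = hess.shape[0]
    sigma = 0.0
    for _ in range(max_iter):
        # eigvalsh returns eigenvalues of a symmetric matrix in ascending order
        if np.linalg.eigvalsh(hess + sigma * np.eye(n))[0] > delta:
            return sigma
        sigma += step
    raise RuntimeError("no suitable sigma found")

# Hypothetical indefinite Hessian (eigenvalues -1 - 2*sqrt(2) and -1 + 2*sqrt(2))
H = np.array([[1.0, 2.0],
              [2.0, -3.0]])
sigma = find_sigma(H, delta=0.1)
```

This grid search only finds $\sigma$ up to the step size, which is what makes the exact answer below the question worth having.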
Turns out the answer was much simpler than I expected. I had forgotten a basic property of matrices: adding $kI$ shifts every eigenvalue by $k$, i.e. $\operatorname{eig}(A + kI) = \operatorname{eig}(A) + k$. Applying it here, $\lambda_{m}(\nabla^{2}f(x) + \sigma I) = \lambda_{m}(\nabla^{2}f(x)) + \sigma > \delta$, so the answer to my question is simply $$\sigma > \delta - \lambda_{m}(\nabla^{2}f(x))$$