I've seen this stopping criterion for iterative optimization algorithms such as Newton–Raphson, gradient descent, etc. However, I do not remember its name or where I saw it. It seems to be quite a good criterion, and less restrictive than using the norm of the gradient.
My questions are the following:
- Do you know the name of the criterion below?
- Is it considered a good stopping criterion?
Here is the criterion:
Let $f: \mathbb{R}^N \rightarrow \mathbb{R}$ be a differentiable function and let $\varepsilon > 0$ be a small tolerance. We define the following quantity at the iterate $x_k \in \mathbb{R}^N$:
$$ sc_k = \max_{i=1,\dots, N} \frac{\left| x_{k,i}\cdot(\nabla f(x_k))_i \right|}{f(x_k)} $$
We stop the optimization algorithm if $sc_k < \varepsilon$.
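For concreteness, here is a minimal sketch of the criterion inside a plain gradient descent loop. The objective, step size, and starting point are illustrative choices, not part of the criterion itself; the test function is kept strictly positive so the denominator $f(x_k)$ is well defined.

```python
import numpy as np

def stopping_value(x, grad_x, f_x):
    """sc_k = max_i |x_i * (grad f)_i| / f(x), the criterion above."""
    return np.max(np.abs(x * grad_x)) / f_x

# Illustrative objective: f(x) = sum((x - 1)^2) + 1, minimized at x = (1,...,1).
# The "+ 1" keeps f positive so dividing by f(x_k) is safe in this sketch.
f = lambda x: np.sum((x - 1.0) ** 2) + 1.0
grad = lambda x: 2.0 * (x - 1.0)

x = np.full(3, 3.0)   # arbitrary starting point
eps, lr = 1e-8, 0.1   # tolerance and step size (illustrative values)
for k in range(10_000):
    g = grad(x)
    if stopping_value(x, g, f(x)) < eps:  # stop when sc_k < epsilon
        break
    x -= lr * g
```

One quirk visible in the sketch: any coordinate with $x_{k,i} = 0$ contributes $0$ to the max regardless of its gradient component, so the criterion can fire early near the origin.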
Thank you!