For a convex function $f: S \to \mathbb{R}$, where $S \subset \mathbb{R}^{n}$, if $X^{*}$ is a local minimum (and hence a global minimum), we can verify that $X^{*}$ is indeed a minimum by checking the restriction of $f$ to every line through it. However, in optimization the function is often not convex, or is no longer convex after a change of variables, so this approach does not always work.
So I wonder: in general, for a $C^{2}$ function $f: S \to \mathbb{R}$, $S \subset \mathbb{R}^{n}$, does there exist a necessary and sufficient condition, or at least a weak sufficient condition, guaranteeing that if $X^{*}$ is a local minimum along every line restriction, it is actually a local minimum?
Of course, checking the gradient and Hessian works in the interior of $S$, but this is not possible on the boundary. Also, what about the case where we drop the $C^{2}$ assumption?
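For context on why line restrictions alone are not enough even in the smooth case: a classical counterexample (often attributed to Peano) is $f(x,y) = (y - x^{2})(y - 2x^{2})$. The origin is a strict local minimum along every line through it, yet $f < 0$ along the parabola $y = \tfrac{3}{2}x^{2}$, so the origin is not a local minimum. A small numerical sketch checking both facts:

```python
import numpy as np

# Peano's classical example: f(x, y) = (y - x^2)(y - 2x^2).
def f(x, y):
    return (y - x**2) * (y - 2 * x**2)

t = np.linspace(-1e-3, 1e-3, 1001)

# Along any line y = m*x through the origin,
# f(t, m*t) = t^2 (m - t)(m - 2t) >= 0 for t near 0,
# so the origin is a local min along every such line.
for m in [0.0, 0.5, 1.0, -2.0]:
    assert np.all(f(t, m * t) >= 0)

# The vertical line x = 0 gives f(0, y) = y^2 >= 0.
assert np.all(f(0.0, t) >= 0)

# But along the curve y = 1.5 x^2, f(t, 1.5 t^2) = -0.25 t^4 < 0
# for t != 0, so the origin is NOT a local minimum of f.
assert np.all(f(t, 1.5 * t**2)[t != 0] < 0)
```

This shows the gap between "local min along every line" and "local min", which is what any sufficient condition would have to rule out.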
Edit: I have revised this question from its original version to make it more focused.