We all know the second derivative test in its original form: if $f'(x)=0$, then the point is a local maximum when $f''(x)<0$ and a local minimum when $f''(x)>0$. We also know the generalization for the one-variable case when this test is inconclusive: (I) https://en.wikipedia.org/wiki/Derivative_test#Higher-order_derivative_test
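To make test (I) concrete, here is a minimal sketch for polynomials in one variable, using plain coefficient lists (the function names and tolerances are illustrative, not standard): find the smallest $n \ge 2$ with $f^{(n)}(x_0) \neq 0$; an even $n$ decides min/max by the sign, an odd $n$ means an inflection point.

```python
def derivative(coeffs):
    """Differentiate a polynomial given as [a0, a1, a2, ...] (a_k * x**k)."""
    return [k * c for k, c in enumerate(coeffs)][1:]

def evaluate(coeffs, x):
    return sum(c * x**k for k, c in enumerate(coeffs))

def classify_critical_point(coeffs, x0, max_order=10):
    """Return 'min', 'max', 'inflection', or 'inconclusive' at a critical point x0.

    Higher-order derivative test: locate the smallest n >= 2 with
    f^(n)(x0) != 0; if n is even, the sign of f^(n)(x0) decides min/max,
    and if n is odd the point is an inflection point.
    """
    d = derivative(coeffs)                    # f'
    assert abs(evaluate(d, x0)) < 1e-12, "x0 must be a critical point"
    for n in range(2, max_order + 1):
        d = derivative(d)                     # f^(n)
        if not d:
            break
        val = evaluate(d, x0)
        if abs(val) > 1e-12:
            if n % 2 == 1:
                return "inflection"
            return "min" if val > 0 else "max"
    return "inconclusive"

# f(x) = x**4: f''(0) = 0, but the 4th derivative is 24 > 0, so a minimum.
print(classify_critical_point([0, 0, 0, 0, 1], 0.0))   # min
# f(x) = x**3: first nonzero derivative at 0 has odd order 3.
print(classify_critical_point([0, 0, 0, 1], 0.0))      # inflection
```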
We also know the multivariable generalization: (II) https://en.wikipedia.org/wiki/Second_partial_derivative_test#Functions_of_many_variables
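For reference, a minimal numerical sketch of test (II) in the two-variable case, classifying a critical point by the sign of $D = f_{xx}f_{yy} - f_{xy}^2$ (the step size and tolerance here are illustrative choices, not part of the test itself):

```python
def hessian_2d(f, x, y, h=1e-4):
    """Central-difference estimate of the 2x2 Hessian of f at (x, y)."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return fxx, fxy, fyy

def classify_2d(f, x, y, tol=1e-3):
    """'min', 'max', 'saddle', or 'inconclusive' via D = fxx*fyy - fxy**2."""
    fxx, fxy, fyy = hessian_2d(f, x, y)
    D = fxx * fyy - fxy**2
    if D > tol:
        return "min" if fxx > 0 else "max"
    if D < -tol:
        return "saddle"
    return "inconclusive"

print(classify_2d(lambda x, y: x**2 + y**2, 0, 0))   # min
print(classify_2d(lambda x, y: x**2 - y**2, 0, 0))   # saddle
print(classify_2d(lambda x, y: x**4 + y**4, 0, 0))   # inconclusive: D = 0
```

The last call is exactly the situation the question is about: the Hessian vanishes, so test (II) says nothing, even though the origin is plainly a minimum of $x^4+y^4$.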
The question is whether there is a common generalization of the two: in the multivariable case, if none of the conditions in (II) are met, can we keep taking higher derivatives until we find one that is not zero, and then use conditions analogous to those in (II) to decide whether the point is a maximum or a minimum?
So prove or disprove: one can take the $n$th derivative and, checking conditions analogous to those in (II), determine whether a critical point is a maximum or a minimum; or refute this with a counterexample.
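To illustrate the subtlety the question is probing (this is a numerical illustration, not a proof either way): $x^4+y^4$ and $x^4-y^4$ both have identically vanishing Hessians at the origin, so test (II) is inconclusive for both, yet one has a minimum there and the other a saddle. Any higher-order generalization of (II) would have to distinguish them.

```python
import math

def sample_on_circle(f, r=1e-2, steps=360):
    """Min and max of f along a small circle around the origin (f(0,0) = 0 here)."""
    vals = [f(r * math.cos(2 * math.pi * k / steps),
              r * math.sin(2 * math.pi * k / steps)) for k in range(steps)]
    return min(vals), max(vals)

lo, hi = sample_on_circle(lambda x, y: x**4 + y**4)
print(lo > 0)        # True: positive in every direction, so a local minimum at 0

lo, hi = sample_on_circle(lambda x, y: x**4 - y**4)
print(lo < 0 < hi)   # True: both signs occur, so a saddle at 0
```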
There cannot be such an algorithm in general (as long as $P \neq NP$). Suppose there were; call it $A$. The number of conditions to check is constant, and matrix multiplication is polynomial, so $A$ runs in polynomial time. Computing the $n$th derivative is also polynomial. Now use $A$ to decide whether the point is a minimum; the whole procedure is in P. But according to https://arxiv.org/abs/1012.0729 and other papers, deciding whether a point is a local minimum is NP-hard, so one of the following must hold: either $P = NP$, or no such algorithm $A$ exists.