I need to find the minimum/maximum of a nonlinear function, but the constraints make the optimization problem nonconvex and therefore harder to solve. I don't have a good global optimization algorithm available, so instead I use a gradient-based local optimization algorithm and check the solution by running it from several different starting values. If all of these starting values lead to the same solution, am I actually performing global optimization?
Is this kind of approach (a local algorithm, checked against multiple starting values) reasonable, and could I claim that I have (likely) found the global optimum?
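To make the setup concrete, here is a minimal sketch of the multi-start procedure I mean, using SciPy's SLSQP on a standard nonconvex test function (Himmelblau's function). The objective, the disc constraint, and the number of starts are just illustrative stand-ins for my actual problem:

```python
import numpy as np
from scipy.optimize import minimize

# Nonconvex objective: Himmelblau's function (it has four global minima, all with f = 0)
def f(x):
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

# Illustrative inequality constraint g(x) >= 0: stay inside a disc of radius 6
cons = [{"type": "ineq", "fun": lambda x: 36.0 - x[0]**2 - x[1]**2}]

rng = np.random.default_rng(0)
results = []
for _ in range(20):
    x0 = rng.uniform(-5.0, 5.0, size=2)      # random starting point
    res = minimize(f, x0, method="SLSQP", constraints=cons)
    if res.success:
        results.append((res.fun, res.x))

best_f, best_x = min(results, key=lambda r: r[0])

# Cluster the near-best solutions: if several distinct points achieve the
# best value (as here), the multi-start reveals multiple optima rather
# than confirming a unique one.
distinct = {tuple(np.round(x, 3)) for fv, x in results if abs(fv - best_f) < 1e-6}
print(best_f, distinct)
```

Note that for this particular test function the starts do *not* all converge to one point, which is exactly the kind of outcome I'm asking about: agreement across starts is evidence for a global optimum, but disagreement (or agreement among starts that all sit in one basin) shows the limits of the check.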