I have a question in mind. Consider the following non-convex optimization problem: $$\begin{aligned}\underset{\boldsymbol{x}}{\min}\quad & f_{0}\left(\boldsymbol{x}\right) \\ \text{s.t.}\quad & f_{i}\left(\boldsymbol{x}\right)\leq a_{i},\quad i=1,\ldots,M,\end{aligned}$$ where $f_{0}\left(\boldsymbol{x}\right)$ is a convex function and the constraint functions $f_{i}\left(\boldsymbol{x}\right)$, $i=1,\ldots,M$, are non-convex but differentiable. Suppose I solve this problem via sequential convex programming, where at each iteration the non-convex constraints are approximated around the current iterate $\boldsymbol{x}^{(k)}$ by convex upper-bound functions $\hat{f}_i\left(\boldsymbol{x}\right)$ satisfying $$\begin{aligned}\hat{f}_i\left(\boldsymbol{x}\right)&\geq f_{i}\left(\boldsymbol{x}\right), &&\forall \boldsymbol{x},\ \forall i,\\ \hat{f}_i\left(\boldsymbol{x}^{(k)}\right)&=f_{i}\left(\boldsymbol{x}^{(k)}\right), &&\forall i,\\ \nabla\hat{f}_i\left(\boldsymbol{x}^{(k)}\right)&=\nabla f_{i}\left(\boldsymbol{x}^{(k)}\right), &&\forall i.\end{aligned}$$ If I obtain a solution that I am sure satisfies the KKT conditions of the original problem (a necessary condition for optimality), can I conclude that it is a locally optimal solution?
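To make the setup concrete, here is a minimal numerical sketch of the iteration on a toy instance of my own (a hypothetical one-dimensional problem: the constraint $f_1(x)=-x^2\leq-1$ is concave, so its tangent at $x^{(k)}$ is an affine global upper bound that is tight and gradient-matching there, satisfying all three conditions above):

```python
# Toy sequential convex programming (SCP) sketch -- an illustrative
# instance, not the problem from the question.
#     minimize (x - 0.5)^2   s.t.   f1(x) = -x^2 <= -1
# The feasible set x^2 >= 1 is non-convex.  Since f1 is concave, its
# first-order expansion at x_k majorizes it everywhere:
#     f1(x) <= f1(x_k) + f1'(x_k)(x - x_k) = -x_k^2 - 2*x_k*(x - x_k),
# and equality/gradient-matching hold at x_k.  Each subproblem is a
# one-dimensional convex QP with a closed-form solution.

def scp(x0, iters=50):
    x = x0
    for _ in range(iters):
        # Surrogate constraint: -x_k^2 - 2*x_k*(x - x_k) <= -1.
        # For x_k > 0 this rearranges to x >= (1 + x_k^2) / (2*x_k).
        lower = (1.0 + x * x) / (2.0 * x)
        # Minimize (x - 0.5)^2 over x >= lower: project the
        # unconstrained minimizer 0.5 onto the halfline.
        x = max(0.5, lower)
    return x

print(scp(2.0))  # converges to x = 1.0, a KKT point of the original problem
```

Starting from $x_0=2$ the iterates are $2,\ 1.25,\ 1.025,\ldots$, converging to $x^\star=1$, which lies on the boundary of the non-convex feasible set and satisfies the KKT conditions of the original problem.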
My viewpoint is this: suppose $\boldsymbol{x}^\star$ is the accumulation point of the iterates obtained from sequential convex programming. Then I know that, at $\boldsymbol{x}^\star$, the original problem is locally equivalent to a convex optimization problem (i.e., the Hessian is locally positive semidefinite). Since the objective function and the constraints are differentiable, it must be locally optimal as well. My argument is a bit informal, but that's the idea.
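One way to make the first part of this argument precise (under a standard constraint qualification): at a fixed point $\boldsymbol{x}^\star$ of the iteration, the tightness and gradient-matching conditions make the KKT system of the convex surrogate subproblem coincide with that of the original problem, $$\begin{aligned}\nabla f_0\left(\boldsymbol{x}^\star\right)+\sum_{i=1}^{M}\lambda_i\nabla\hat{f}_i\left(\boldsymbol{x}^\star\right)=\boldsymbol{0}\;&\Longleftrightarrow\;\nabla f_0\left(\boldsymbol{x}^\star\right)+\sum_{i=1}^{M}\lambda_i\nabla f_i\left(\boldsymbol{x}^\star\right)=\boldsymbol{0},\\ \lambda_i\left(\hat{f}_i\left(\boldsymbol{x}^\star\right)-a_i\right)=0\;&\Longleftrightarrow\;\lambda_i\left(f_i\left(\boldsymbol{x}^\star\right)-a_i\right)=0,\qquad\lambda_i\geq 0,\end{aligned}$$ since $\hat{f}_i\left(\boldsymbol{x}^\star\right)=f_i\left(\boldsymbol{x}^\star\right)$ and $\nabla\hat{f}_i\left(\boldsymbol{x}^\star\right)=\nabla f_i\left(\boldsymbol{x}^\star\right)$ at the point of approximation. So the multipliers carry over and $\boldsymbol{x}^\star$ is a KKT point of the original problem, which is the premise of the question; whether KKT points are always locally optimal is the remaining step.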