Consider a nonlinear program (specifically, not convex): $$ \underset{ x \in \mathbb{R}^{n} }{ \text{min} } f(x) $$ $$ \text{s.t. } g(x) \leq 0 $$ where $f:\mathbb{R}^{n} \to \mathbb{R}$, $g:\mathbb{R}^{n} \to \mathbb{R}^{m}$, and $f,g_{1},g_{2},\dots ,g_{m}$ are continuously differentiable. Suppose strong duality holds, i.e. the primal and dual optimal values are both attained and equal, and that $x^{*} \in \text{int}(S)$, the interior of the feasible region, is the optimal solution to this primal program. If $\lambda ^{*}$ satisfies the KKT gradient condition, is $\lambda ^{*}$ an optimal solution to the dual program?
By the KKT gradient condition (as opposed to the saddle-point form), I mean: $\nabla_{x}L(x^{*},\lambda ^{*}) = 0$, $g(x^{*})\leq 0$, and $\lambda_{i}^{*}g_{i}(x^{*}) = 0$ for all $1\leq i\leq m$.
I know that if $f(x^{*}) = v(\lambda ^{*})$, then $\lambda ^{*}$ is a sensitivity vector, so it should satisfy the KKT gradient condition. Hence I would guess that there exists a $\lambda ^{*}$ satisfying the KKT gradient form. However, is every $\lambda ^{*}$ that satisfies the KKT gradient form necessarily a dual optimal solution?
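To make the setup concrete, here is a small numerical sketch on a toy nonconvex instance of my own choosing (the specific $f$, $g$, and optimum are hypothetical, just for illustration): since $x^{*}$ is interior, $g(x^{*}) < 0$, and complementary slackness then forces $\lambda^{*} = 0$, so the gradient condition reduces to $\nabla f(x^{*}) = 0$.

```python
import numpy as np

# Toy nonconvex instance (hypothetical, for illustration only):
#   min  f(x) = x^4 - x^2        (nonconvex in x)
#   s.t. g(x) = x^2 - 4 <= 0
# A minimizer x* = 1/sqrt(2) lies in the interior of the feasible set,
# so complementary slackness forces lambda* = 0.

def f(x):      return x**4 - x**2
def grad_f(x): return 4*x**3 - 2*x
def g(x):      return x**2 - 4
def grad_g(x): return 2*x

x_star = 1 / np.sqrt(2)
lam_star = 0.0

# KKT gradient condition: grad_x L(x*, lam*) = grad_f(x*) + lam* . grad_g(x*)
stationarity = grad_f(x_star) + lam_star * grad_g(x_star)

print(abs(stationarity) < 1e-12)      # stationarity
print(g(x_star) <= 0)                 # primal feasibility (strict, interior)
print(lam_star * g(x_star) == 0)      # complementary slackness
```

In this toy case all three checks pass with $\lambda^{*} = 0$; my question is whether such a $\lambda^{*}$ is guaranteed to be dual optimal in general.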
I have read some posts that mention the term "regular optimal point", but I cannot find an exact definition on the web; it would be helpful if you could explain what it means as well.