I have an optimization problem of the form
$$\begin{aligned} \min_u \quad & f(u)\\ \text{subject to} \quad & \tilde{y} = \mathop{\text{argmin}}_y g(u,y), \end{aligned}$$ where $\tilde{y}$ is provided as a constant.
What are the best algorithms to solve this kind of non-linear optimization?
I found this previous post about differentiating through $\mathop{\text{argmin}}$, but I'm having trouble translating that into something I can hand to a sequential quadratic programming solver, a sensitivity analysis, or a similar approach.
I'm happy to take first (and second) derivatives of $f$ and $g$ with respect to their arguments, but $f$ and $g$ are not necessarily convex.
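To make the question concrete, here is a minimal sketch of one standard reformulation: replace the $\mathop{\text{argmin}}$ constraint with the first-order stationarity condition $\nabla_y g(u, \tilde{y}) = 0$ and hand the result to an SQP solver. The functions $f$, $g$, and the matrix $A$ below are made-up toy examples, and the stationarity condition is only a necessary condition when $g$ is non-convex in $y$:

```python
import numpy as np
from scipy.optimize import minimize

# Toy instance (purely illustrative):
#   f(u)    = ||u||^2
#   g(u, y) = ||y - A u||^2, so argmin_y g(u, y) = A u
A = np.array([[1.0, 2.0], [0.5, -1.0]])
y_tilde = np.array([1.0, 1.0])  # the given constant \tilde{y}

def f(u):
    return u @ u

def grad_g_wrt_y(u, y):
    # Analytic inner gradient: ∇_y g(u, y) = 2 (y - A u)
    return 2.0 * (y - A @ u)

# Replace "y_tilde = argmin_y g(u, y)" with the stationarity condition
# ∇_y g(u, y_tilde) = 0. This is necessary but not sufficient when g is
# non-convex in y; second-order conditions on g should be checked afterwards.
cons = {"type": "eq", "fun": lambda u: grad_g_wrt_y(u, y_tilde)}

res = minimize(f, x0=np.zeros(2), method="SLSQP", constraints=cons)
print(res.x)  # here the constraint A u = y_tilde pins down u ≈ [1.5, -0.25]
```

In this toy case the equality constraint fully determines $u$, but in general (e.g. when $y$ lives in a higher-dimensional space than the constraint allows) the SQP solver trades off $f$ against feasibility, which is where the derivatives of $f$ and the mixed second derivatives of $g$ enter.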