Gradient of extremum with respect to constraints


I have a smooth convex function $h:\mathbb{R}^n\to\mathbb{R}$ that I would like to minimize, but I want to restrict the optimization to the set of constrained minimizers of a different function. More precisely, given smooth convex functions $f:\mathbb{R}^n\to\mathbb{R}$ and $g_i:\mathbb{R}^n\to\mathbb{R}$ for $i = 1,\ldots,m$ with $m < n$, define $$D = \left\{\operatorname*{argmin}_{x\in\mathbb{R}^n} f(x)\ \text{ subject to } \vec{g}(x) = \vec{c}\ \middle|\ \vec{c} \in \mathbb{R}^m\right\}.$$ I wish to find $$\operatorname*{argmin}_{x\in D}h(x).$$

To reparameterize the problem, define $$x^*(\vec{c}) = \operatorname*{argmin}_{x\in\mathbb{R}^n} f(x)\quad \text{subject to } g_i(x) = c_i,\quad i = 1,\ldots,m.$$
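For what it's worth, here is one standard route I am aware of (not claiming it is the only one): under regularity assumptions (e.g. the $\nabla g_i(x^*)$ are linearly independent and the KKT matrix below is nonsingular), $x^*(\vec{c})$ together with multipliers $\vec{\lambda}(\vec{c})$ satisfies the first-order conditions, and differentiating those identities with respect to $\vec{c}$ yields a linear system for the Jacobian:

```latex
% First-order (KKT) conditions at x^*(\vec{c}) with multipliers \vec{\lambda}(\vec{c}):
%   \nabla f(x^*) + \textstyle\sum_i \lambda_i \nabla g_i(x^*) = 0, \qquad g_i(x^*) = c_i.
% Differentiating both identities with respect to \vec{c}:
\begin{pmatrix}
  \nabla^2 f(x^*) + \sum_i \lambda_i \nabla^2 g_i(x^*) & J_g(x^*)^{\mathsf T} \\
  J_g(x^*) & 0
\end{pmatrix}
\begin{pmatrix}
  \partial x^*/\partial \vec{c} \\
  \partial \vec{\lambda}/\partial \vec{c}
\end{pmatrix}
=
\begin{pmatrix} 0 \\ I_m \end{pmatrix}
```

where $J_g$ is the $m\times n$ Jacobian of $\vec{g}$. Solving this system with the $m$ right-hand-side columns and taking the top $n$ rows gives $\partial x^*/\partial \vec{c}$ in terms of first and second derivatives of $f$ and the $g_i$.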

I now have an $m$-parameter optimization problem $$\min_{\vec{c} \in \mathbb{R}^m} h(x^*(\vec{c})).$$

I would like to do a local search using Newton's method. I am requesting help in computing the Jacobian and the Hessian of $x^*(\vec{c})$ with respect to $\vec{c}$ in terms of derivatives of $f$ and the $g_i$. I would also appreciate alternative approaches to this optimization problem.
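As a numerical illustration of the implicit-differentiation idea, the following sketch computes $\partial x^*/\partial \vec{c}$ by solving the KKT linear system directly. The function name `kkt_jacobian` and the toy problem ($f(x) = \tfrac12\|x\|^2$ with one linear constraint $a^{\mathsf T}x = c$, whose minimizer is $x^*(c) = c\,a/\|a\|^2$) are my own choices for checking the formula, not part of the original question:

```python
import numpy as np

def kkt_jacobian(H, J):
    """Return dx*/dc (n x m) by solving the KKT system
        [H   J^T] [dx/dc]   [0  ]
        [J   0  ] [dl/dc] = [I_m]
    where H = grad^2 f + sum_i lambda_i grad^2 g_i at x*(c),
    and J is the (m x n) Jacobian of g at x*(c)."""
    n, m = H.shape[0], J.shape[0]
    K = np.block([[H, J.T], [J, np.zeros((m, m))]])
    rhs = np.vstack([np.zeros((n, m)), np.eye(m)])
    sol = np.linalg.solve(K, rhs)
    return sol[:n, :]  # top n rows are dx*/dc

# Toy check: f(x) = 0.5 ||x||^2, g(x) = a^T x = c  =>  x*(c) = c a / ||a||^2.
a = np.array([1.0, 2.0, 2.0])
H = np.eye(3)            # Hessian of f; the constraint is linear, so its Hessian is 0
J = a.reshape(1, -1)     # Jacobian of g
dxdc = kkt_jacobian(H, J)
expected = (a / (a @ a)).reshape(-1, 1)
assert np.allclose(dxdc, expected)
```

With the Jacobian available column-by-column like this, the chain rule gives the gradient of $h(x^*(\vec{c}))$; a Hessian would additionally require differentiating the KKT matrix itself (third derivatives of $f$ and $g_i$), which is why a quasi-Newton method on $\vec{c}$ may be more practical than full Newton.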