Optimization with constrained gradient


I am trying to use Lagrange multipliers to solve a constrained optimization problem. The problem looks like:

$$\text{Minimize} \quad R(b_1(x), b_2(x))$$ $$\text{subject to} \quad c_1 \nabla b_1 + c_2 \nabla b_2 = 0.$$
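For what it's worth, since the constraint has to hold at every point $x$, the multiplier would naturally be a function $\lambda(x)$ rather than a single scalar. Assuming $R$ is an integral functional over the domain (which may not match your setting), the Lagrangian would look something like

$$\mathcal{L}[b_1, b_2, \lambda] = R(b_1, b_2) + \int \lambda(x) \cdot \big(c_1 \nabla b_1(x) + c_2 \nabla b_2(x)\big)\,dx,$$

which turns the problem into a calculus-of-variations one rather than ordinary finite-dimensional Lagrange multipliers.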

Naively applying the usual vector calculus method, I need the derivatives of $R$ with respect to $b_1$ and $b_2$, which are easy enough to compute in my problem. But I also need the derivatives of the constraint with respect to $b_1$ and $b_2$, and since the constraint involves $\nabla b_1$ and $\nabla b_2$ rather than $b_1$ and $b_2$ directly, I am not sure how to compute those.
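One way I've tried to make the difficulty concrete (this is a hypothetical 1-D discretization, not part of my actual problem): if $b_1$ and $b_2$ are sampled on a grid and $\nabla$ is approximated by a finite-difference matrix $D$, then the constraint is linear in $(b_1, b_2)$ and its Jacobian with respect to $b_1$ is just the constant matrix $c_1 D$:

```python
import numpy as np

# Hypothetical 1-D grid; c1, c2 are arbitrary example constants.
n, h = 50, 0.1
c1, c2 = 2.0, -1.0

# Forward-difference matrix approximating d/dx on the grid.
D = (np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n))) / h
D[-1, :] = 0.0  # crude one-sided handling of the last boundary row

def constraint(b1, b2):
    # Discrete version of c1 * grad(b1) + c2 * grad(b2)
    return c1 * D @ b1 + c2 * D @ b2

# Because the constraint is linear, its Jacobian w.r.t. b1 is c1 * D
# (and c2 * D w.r.t. b2); check against a finite-difference directional
# derivative in a random direction v.
rng = np.random.default_rng(0)
b1, b2, v = rng.random(n), rng.random(n), rng.random(n)
eps = 1e-6
fd = (constraint(b1 + eps * v, b2) - constraint(b1, b2)) / eps
print(np.allclose(fd, c1 * D @ v, atol=1e-6))
```

In other words, in the discretized picture the "derivative of the constraint with respect to $b_1$" is the differential operator itself, which is what makes the naive pointwise approach break down in the continuous setting.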

Is there a simple way to compute the derivatives that I need? If not, is there a standard formulation that will work for this type of problem?