How does one minimize/maximize the Lagrangian if its gradient is non-linear?


If one is trying to maximize (or minimize) the Lagrangian

$$\mathcal{L}(x,y,\lambda) = f(x,y) - \lambda \cdot g(x,y)$$

it's fairly straightforward that this is achieved by solving:

$$\nabla_{x,y,\lambda} \mathcal{L}(x , y, \lambda)=0. $$

In the examples I have seen, the Lagrangian never has a degree higher than 2, so its gradient is always linear and the above equation can be solved as a system of linear equations. The few examples I have seen with non-linear terms in this equation were solved by hand.

Is there a go-to method for optimizing with equality constraints for an arbitrary function, where the gradients may not be linear? Is some iterative method (e.g., gradient descent/ascent) the standard approach?
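For concreteness, here is the kind of thing I have in mind: a sketch that hands the nonlinear stationarity system $\nabla \mathcal{L} = 0$ to a generic root finder (SciPy's `fsolve`). The example problem, maximizing $f(x,y) = x + y$ subject to $g(x,y) = x^2 + y^2 - 1 = 0$, is my own choice, not from any textbook.

```python
import numpy as np
from scipy.optimize import fsolve

def grad_lagrangian(v):
    """Gradient of L(x, y, lam) = f(x, y) - lam * g(x, y)
    for f = x + y and g = x^2 + y^2 - 1 (nonlinear in x, y, lam)."""
    x, y, lam = v
    return [
        1.0 - 2.0 * lam * x,    # dL/dx
        1.0 - 2.0 * lam * y,    # dL/dy
        -(x**2 + y**2 - 1.0),   # dL/dlam = -g(x, y)
    ]

# fsolve runs a Newton-like iteration from an initial guess; different
# guesses can converge to different stationary points (max, min, saddle).
x, y, lam = fsolve(grad_lagrangian, [1.0, 1.0, 1.0])
print(x, y, lam)  # all approximately 1/sqrt(2) ≈ 0.7071
```

This finds a stationary point but says nothing about whether it is a maximum or a minimum, and it depends on the starting guess, which is part of why I am asking whether there is a more principled go-to method.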