I'm reviewing Non-linear Programming and sometimes get a little confused about the dimensionality of the objective function. Observe:
min $z = (x-3)^2 + (y-3)^2$
s.t.
$x^2/9 + y^2/4 \le 1$
$x^2 + 3y = 3$
Thus the objective is a paraboloid, subject to a constraint set in $\mathbb{R}^2$ consisting of a parabola (the equality constraint) intersected with the closed interior of an ellipse (the inequality constraint).
The textbook claims this can be solved graphically, given the constraint boundaries in the $\mathbb{R}^2$ plane.
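As a numerical cross-check of the graphical claim (a stdlib-only sketch; the grid resolution and names are my own): since the equality constraint determines $y$ from $x$, the 2-D problem collapses to a one-variable search along the parabola, restricted to the ellipse's interior.

```python
# The equality constraint x^2 + 3y = 3 pins the feasible set to the curve
# y = (3 - x^2)/3, so the 2-D NLP reduces to a 1-D search along that curve,
# keeping only points that also satisfy the ellipse x^2/9 + y^2/4 <= 1.
# A crude grid search (standard library only) suffices to locate the minimizer.

def objective(x, y):
    return (x - 3) ** 2 + (y - 3) ** 2

def in_ellipse(x, y):
    return x ** 2 / 9 + y ** 2 / 4 <= 1  # inequality constraint

best = None
N = 200001
for i in range(N):
    x = -3 + 6 * i / (N - 1)     # scan x over the ellipse's full x-range
    y = (3 - x ** 2) / 3         # equality constraint determines y
    if in_ellipse(x, y):
        z = objective(x, y)
        if best is None or z < best[0]:
            best = (z, x, y)

z, x, y = best
print(f"min z = {z:.4f} at (x, y) = ({x:.4f}, {y:.4f})")
```

The search agrees with the picture: the minimum lies on the parabola strictly inside the ellipse, so only the equality constraint is active at the optimum.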
More critically, I'm trying to understand the relevance of constant gradient vectors of the constraint boundaries, and what role they play in our ability to solve NLPs. (The gradient vectors above are non-constant; the question is mainly about LPs, whose constraint gradients are constant.) My understanding is that if the constraint-boundary gradients are not constant, or the constraint set is potentially non-convex, then solving NLPs becomes exponentially harder, but I'm unclear as to why.
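To make the role of non-constant gradients concrete, here is a small sketch of the stationarity condition at the minimizer (the coordinates below are values I obtained numerically for this problem, not from the textbook). With the ellipse inactive, optimality requires $\nabla f = \lambda \nabla h$ for the active equality $h(x,y) = x^2 + 3y - 3$; because $\nabla h = (2x, 3)$ varies along the curve, this alignment singles out one point, whereas in an LP every constraint gradient is a fixed normal vector.

```python
# Gradient-alignment (Lagrange/KKT stationarity) check.
# grad f must be parallel to grad h at the optimum, and only there,
# precisely because grad h = (2x, 3) changes as we move along the curve.

def grad_f(x, y):
    return (2 * (x - 3), 2 * (y - 3))   # gradient of (x-3)^2 + (y-3)^2

def grad_h(x, y):
    return (2 * x, 3.0)                  # gradient of x^2 + 3y - 3

# Numerically obtained minimizer (assumption: computed for this example).
x_star, y_star = 1.1434, 0.5642
gf, gh = grad_f(x_star, y_star), grad_h(x_star, y_star)
lam1, lam2 = gf[0] / gh[0], gf[1] / gh[1]
print(lam1, lam2)  # componentwise ratios agree => gradients are parallel

# A different feasible point on the parabola (x = 1.5, y = (3 - 2.25)/3):
x0, y0 = 1.5, 0.25
gf0, gh0 = grad_f(x0, y0), grad_h(x0, y0)
print(gf0[0] / gh0[0], gf0[1] / gh0[1])  # ratios disagree => not optimal
```

In an LP the analogous condition is that the (constant) objective gradient lies in the cone of the (constant) active-constraint gradients, which is a purely combinatorial question over vertices; once the gradients vary with the point, the condition becomes a system of nonlinear equations, and without convexity a point satisfying it need not be a global minimum.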