I'm looking for a more up-to-date book covering material similar to the second half of Luenberger's Optimization by Vector Space Methods. That book covers Lagrange multiplier necessary conditions in Banach spaces, and it also sketches how the Pontryagin maximum principle from optimal control theory can be viewed as a special case of such Lagrange multiplier conditions. But Luenberger's book feels a bit dated and leaves me with a lot of questions. Is there a more modern book covering this material?
I've looked at Clarke's Functional Analysis, Calculus of Variations, and Optimal Control, but it seems to focus only on the calculus of variations and optimal control, without the emphasis on general infinite-dimensional optimization in normed spaces that Luenberger has. In particular, it doesn't appear to show how the maximum principle can be derived using Lagrange multipliers with a continuum of constraints, which is what I'd like to see.
Finally, I found the paper "The infinite dimensional Lagrange multiplier rule for convex optimization problems", but as the title suggests it applies only to convex problems.
Thank you.
OP here. Jahn's Introduction to the Theory of Nonlinear Optimization seems to be a good place to start, specifically Ch. 5, Generalized Multiplier Rules.
That said, his proof of the maximum principle is pretty hairy and would benefit from some high-level description or signposting. It also would have been nice if he had discussed more explicitly the special case where one of the constraints is that one decision-variable function is the derivative of another.
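For anyone reading later, the special case I mean is roughly the following (this is my own informal sketch, and sign conventions vary by author). Treat the dynamics as a continuum of equality constraints, one for each time $t$:

$$\min_{x,\,u}\ \int_0^T L\big(x(t),u(t)\big)\,dt \quad\text{subject to}\quad \dot x(t)-f\big(x(t),u(t)\big)=0\ \ \text{for all } t\in[0,T].$$

Pairing the constraint with a multiplier function $p(t)$ (the costate) gives the Lagrangian

$$\mathcal{L}(x,u,p)=\int_0^T \Big[L\big(x(t),u(t)\big)+p(t)^\top\big(f(x(t),u(t))-\dot x(t)\big)\Big]\,dt,$$

and stationarity in $x$ (after integrating the $p^\top\dot x$ term by parts) yields the adjoint equation $\dot p=-\partial H/\partial x$ with Hamiltonian $H(x,u,p)=L(x,u)+p^\top f(x,u)$, while the multiplier condition in $u$ becomes the pointwise optimality condition on $H$. Seeing this worked out rigorously as an instance of a general multiplier rule in normed spaces is exactly what I was after.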
https://link.springer.com/book/10.1007/978-3-030-42760-3