I have read several times that Pontryagin's Maximum (or Minimum) Principle provides a "necessary" condition for a control $u(t)$ to be optimal for a system of ODEs $\dot x = f(t,x,u)$. I have always understood this to mean that an optimal control, if one exists, is merely one of possibly many functions satisfying Pontryagin's conditions.
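To fix notation (using the sign convention I have seen most often, where the Hamiltonian is maximized): for minimizing a cost $J = \int_0^T L(t,x,u)\,dt$ subject to $\dot x = f(t,x,u)$ with $u(t) \in U$, my understanding is that the principle asserts the existence, along an optimal pair $(x^*, u^*)$, of an adjoint $p(t)$ satisfying

$$H(t,x,p,u) = p^\top f(t,x,u) - L(t,x,u), \qquad \dot p = -\frac{\partial H}{\partial x},$$

$$H\bigl(t, x^*(t), p(t), u^*(t)\bigr) = \max_{u \in U} H\bigl(t, x^*(t), p(t), u\bigr) \quad \text{for a.e. } t.$$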
My question is: can we say anything more about controls that satisfy these conditions? Fleming and Rishel refer to such controls as "extremals," which suggests an analogy with local maxima/minima that are not necessarily global ones. That would be a stronger statement than the one above. Am I missing something?
For a little more context, F & R show that the optimal control for the classic lunar lander problem is to let the lander free-fall first and then fire its thrusters at maximum force. They do this by first showing that such a control satisfies Pontryagin's conditions, and then showing that any other control would contradict those conditions. Based on my reading, this shows that there is exactly one candidate for the optimal control, but it does not prove that an optimum exists.
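For concreteness, here is how I understand that argument in a simplified constant-mass model (my simplification, not exactly F & R's setup): with height $h$, velocity $v$, thrust $u \in [0, u_{\max}]$, dynamics $\dot h = v$, $\dot v = u - g$, and fuel cost $\int_0^T u\,dt$, the Hamiltonian

$$H = p_1 v + p_2 (u - g) - u$$

is linear in $u$, so maximizing it forces the bang-bang law $u^* = u_{\max}$ when the switching function $\sigma(t) = p_2(t) - 1$ is positive and $u^* = 0$ when it is negative. The adjoint equations $\dot p_1 = 0$ and $\dot p_2 = -p_1$ make $p_2$, and hence $\sigma$, affine in $t$, so $\sigma$ changes sign at most once; since a soft landing must end with thrust on, the only extremal profile is free-fall followed by full thrust.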
Apologies if this question is basic or redundant — I have not found another StackExchange thread that answers it explicitly.
Nice question(s)! Here are some pointers.
[1] Liberzon, Daniel. *Calculus of Variations and Optimal Control Theory: A Concise Introduction*. Princeton University Press, 2011.