Importance of stability in optimal control


This question is quite general, so even answers pointing to particular papers or books would be very helpful.

Given an optimal control problem of the form $$\min_{u\in\mathbb{R}^k} \operatorname{Loss}(x,u) \quad \text{s.t.} \quad \dot{x}(t) = f(t,x,u),$$

when is stability of the plant equation $\dot{x}(t) = f(t,x,u)$ under small perturbations of the initial data relevant? And in those cases, why is it relevant?

For simplicity, let us consider a problem with terminal cost only (a Mayer problem), so $\operatorname{Loss}(x,u) = f_u(\Phi^T(x_0))$, where $\Phi^T$ is the flow of the ODE at time $T>0$ (which depends on the control $u$).
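To make the role of stability in this terminal-cost setting concrete, here is a minimal numerical sketch (the plant $\dot{x} = a x$, the cost $x_T^2$, and all numbers are my own illustrative choices, not from the question): for an unstable plant, a perturbation $\varepsilon$ in $x_0$ is amplified by roughly $e^{aT}$ before it reaches the terminal cost, while a stable plant damps it.

```python
import math

def flow(x0, a, T, steps=10_000):
    """Forward-Euler approximation of the flow Phi^T(x0) of x' = a*x
    (control held at u = 0 for simplicity)."""
    x, dt = x0, T / steps
    for _ in range(steps):
        x += dt * a * x
    return x

def terminal_cost(xT):
    """A toy terminal (Mayer) cost evaluated at the endpoint of the flow."""
    return xT ** 2

x0, eps, T = 1.0, 1e-3, 5.0
for a in (-1.0, 1.0):  # a < 0: stable plant, a > 0: unstable plant
    dx = abs(flow(x0 + eps, a, T) - flow(x0, a, T))
    dJ = abs(terminal_cost(flow(x0 + eps, a, T)) - terminal_cost(flow(x0, a, T)))
    print(f"a={a:+.0f}: |Phi^T(x0+eps) - Phi^T(x0)| ~ {dx:.2e}, cost change ~ {dJ:.2e}")
```

The printed gap between the two cases ($\approx \varepsilon e^{-T}$ versus $\approx \varepsilon e^{T}$) is exactly the sensitivity of $\operatorname{Loss}$ to the initial data that the question is asking about.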

I am quite new to control problems: I know the importance of stability for dynamical systems in general, and even the various available notions of stability, but I don't see why it should be important for control problems in particular.