Regarding Lyapunov stability, we check whether a nonlinear system, when disturbed, stays near an equilibrium point or even approaches it as time goes to infinity.
Let's assume we have a nonlinear system (an automobile) and have designed an optimal controller. The controller executes the driver's input for safe driving by controlling the braking forces and the steering angle. A typical case might be a lane change at high speed.
In this case, or in similar cases such as the control of an airplane/AUV/ship/etc., the system is moving (not at rest); the controller takes the system from one state to another. How can we talk about stability in this case? If I designed a controller using some unknown method, say my own method, how can I check that the closed-loop system is stable?
Edit: It is an autonomous system. My point is how to prove that stability is guaranteed for the controlled system. How will I know that my controller will not make the dynamic system unstable?
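One standard way to answer "how do I know my controller does not destabilize the system" near an equilibrium is to linearize the closed loop there and look for a quadratic Lyapunov function. A minimal sketch, assuming a hypothetical linearization (the matrices `A`, `B`, and the gain `K` below are purely illustrative, not from any particular vehicle model): solve the Lyapunov equation for the closed-loop matrix and check that the solution is positive definite.

```python
import numpy as np

# Hypothetical linearized plant x_dot = A x + B u and a state-feedback
# gain K, so the closed loop is x_dot = (A - B K) x.
A = np.array([[0.0, 1.0],
              [2.0, -1.0]])      # open loop is unstable here
B = np.array([[0.0],
              [1.0]])
K = np.array([[12.0, 4.0]])      # some candidate stabilizing gain
Acl = A - B @ K

# Solve the Lyapunov equation  Acl^T P + P Acl = -Q  with Q = I,
# written as a linear system via Kronecker products:
# (I kron Acl^T + Acl^T kron I) vec(P) = -vec(Q).
n = Acl.shape[0]
Q = np.eye(n)
M = np.kron(np.eye(n), Acl.T) + np.kron(Acl.T, np.eye(n))
P = np.linalg.solve(M, -Q.flatten()).reshape(n, n)

# If P is symmetric positive definite, V(x) = x^T P x is a Lyapunov
# function with V_dot = -x^T Q x < 0, so the closed loop is (locally)
# asymptotically stable around this equilibrium.
eigP = np.linalg.eigvalsh((P + P.T) / 2)
print("closed-loop eigenvalues:", np.linalg.eigvals(Acl))
print("P positive definite:", bool(np.all(eigP > 0)))
```

This only certifies local stability around the linearization point; for guarantees over a whole operating region you would need a Lyapunov function for the nonlinear closed loop itself (e.g. via sum-of-squares or backstepping-style constructions).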
Let me collect the comments above into an answer.
There is a difference between the stability of a control law, the stability of an autonomous system, and the stability of a non-autonomous system. If you couple a controller that follows a time-varying reference to an autonomous system, the result is a non-autonomous system, and the stability you are looking for concerns that whole non-autonomous closed loop.
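A common way to make such a tracking problem amenable to Lyapunov analysis is to rewrite it in error coordinates: stability "along a trajectory" becomes stability of the error system at its origin. A minimal sketch, with an entirely hypothetical plant (a pendulum tracking theta_ref(t) = sin(t), with feedback-linearizing control and illustrative gains kp, kd):

```python
import numpy as np

# Hypothetical plant: pendulum  theta_ddot = -sin(theta) + u,
# tracking the reference theta_ref(t) = sin(t).
# The control u cancels the nonlinearity and adds PD feedback on the
# tracking error e = theta - theta_ref, so the error dynamics become
# e_ddot + kd*e_dot + kp*e = 0 : an autonomous linear system whose
# origin can be analyzed with a standard Lyapunov argument.
kp, kd = 4.0, 3.0
dt, T = 1e-3, 10.0

theta, omega = 1.0, 0.0          # start well away from the reference
for k in range(int(T / dt)):
    t = k * dt
    ref, dref, ddref = np.sin(t), np.cos(t), -np.sin(t)
    e, de = theta - ref, omega - dref
    u = np.sin(theta) + ddref - kp * e - kd * de   # feedback linearization + PD
    # explicit Euler step
    dtheta = omega
    domega = -np.sin(theta) + u
    theta += dt * dtheta
    omega += dt * domega

print("final tracking error:", abs(theta - np.sin(T)))
```

The simulation only illustrates convergence; the actual guarantee comes from the error dynamics being a Hurwitz linear system, for which a quadratic Lyapunov function exists.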
In general this is its own stack of theory; see here >>>
And on StackExchange, here >>>