In control and systems theory, is a simple integrator considered a static system? I.e.: $$\dot{x}=u, \qquad y=x$$ whereas a general nonlinear dynamical system can be described by
$$\dot{x}=f(x,u), \qquad y=h(x)$$ where $x$, $u$, and $y$ are the state, input, and output, respectively.
If my understanding is correct, can I have a formal definition of a static system?
A system is dynamic if its future behavior depends on its current state, i.e., it has memory and evolves with time. For example, if there is a derivative term in the system model, then it is a (continuous-time) dynamical system.
A static (memoryless) system, by contrast, has an output that depends only on the current input, $y(t)=h(u(t))$, with no state. So the simple integrator $\dot{x}=u$, $y=x$ is dynamic, not static: its output is the accumulated history of the input.
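To make the distinction concrete, here is a small sketch (illustrative names, forward-Euler discretization assumed) that feeds the same input pulse through a static map and through the integrator. After the input returns to zero, the static output returns to zero immediately, while the integrator holds the accumulated value.

```python
def static_system(u):
    # memoryless: y(t) depends only on the current input u(t)
    return 2.0 * u  # example static map y = h(u) = 2u

def simulate_integrator(us, dt=0.1, x0=0.0):
    # dynamic: x' = u, y = x; the state x carries memory of past inputs
    x = x0
    ys = []
    for u in us:
        x += dt * u  # forward-Euler step
        ys.append(x)
    return ys

us = [1.0] * 5 + [0.0] * 5      # input: short pulse, then zero
static_out = [static_system(u) for u in us]
dynamic_out = simulate_integrator(us)
# static_out ends at 0.0 (no memory);
# dynamic_out ends at 0.5 (accumulated pulse area, 5 steps * 0.1 * 1.0)
```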
Also see http://www.scholarpedia.org/article/Dynamical_systems.