I am self-studying control theory and I have a doubt. Can someone please help?
The question is: find examples showing that even if the linearized system is stable, the corresponding nonlinear system may or may not be stable.
So I need to construct two examples: 1. one in which the nonlinear system is unstable, and 2. another in which the nonlinear system is stable.
Can somebody please help?
Note that if the linearization at an equilibrium is asymptotically stable (all eigenvalues in the open left half-plane), then the equilibrium of the nonlinear system is locally asymptotically stable (Lyapunov's indirect method). So "unstable" in your second case can only mean *not globally* stable. Two examples for your cases with that in mind:
First case: Linearization stable, Nonlinear system globally stable:
$$ \dot{x} = -x -x^3 \tag{1} $$
With the Lyapunov function $V(x) = \frac{1}{2}x^2$ we get $\dot{V}(x) = x\dot{x} = -x^2 - x^4 < 0$ for $x \neq 0$, so the nonlinear system $(1)$ is globally asymptotically stable. Its linearization at $0$ is $\dot{x} = -x$, eigenvalue at $-1$, stable.
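A quick numerical sanity check of this claim (a sketch only: the hand-rolled RK4 integrator, the step size, and the time horizon are my own choices, not part of the problem):

```python
def rk4(f, x0, t_end, h=1e-3):
    """Classical RK4 for the scalar autonomous ODE x' = f(x); returns x(t_end)."""
    n = max(1, round(t_end / h))
    h = t_end / n  # adjust the step so n steps cover [0, t_end] exactly
    x = x0
    for _ in range(n):
        k1 = f(x)
        k2 = f(x + 0.5 * h * k1)
        k3 = f(x + 0.5 * h * k2)
        k4 = f(x + h * k3)
        x += (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

f = lambda x: -x - x**3  # system (1)

# Trajectories starting both far from and close to the origin decay to 0,
# consistent with global asymptotic stability.
for x0 in (-5.0, -0.5, 0.5, 5.0):
    print(x0, "->", rk4(f, x0, 5.0))
```

Since $|x(t)| \le |x(0)|e^{-t}$ here (the cubic term only speeds up the decay), every printed final value is essentially zero.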
Second case: Linearization stable, Nonlinear system not globally stable (only locally stable):
$$ \dot{x} = -x + x^3 \tag{2} $$
With initial condition $x(0) = 2$ the solution of $(2)$ is $x(t) = \sqrt{\frac{1}{1 - \exp(2 t + \log(3/4))}}$, which has finite escape time $t^* = \frac{1}{2}\log(4/3)$, so $(2)$ is unstable for some initial conditions. The linearization at $0$ is again $\dot{x} = -x$, which is stable. Also, $(2)$ cannot be globally stable because it has three equilibrium points: $0$, $1$, and $-1$.
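The finite escape time can also be checked numerically. The sketch below (again with an illustrative hand-rolled RK4; the names `exact` and `t_star` are mine) compares the integrator against the closed-form solution just before the escape time, and shows that an initial condition inside $|x| < 1$ still converges to the origin:

```python
import math

def rk4(f, x0, t_end, h=1e-4):
    """Classical RK4 for the scalar autonomous ODE x' = f(x); returns x(t_end)."""
    n = max(1, round(t_end / h))
    h = t_end / n  # adjust the step so n steps cover [0, t_end] exactly
    x = x0
    for _ in range(n):
        k1 = f(x)
        k2 = f(x + 0.5 * h * k1)
        k3 = f(x + 0.5 * h * k2)
        k4 = f(x + h * k3)
        x += (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

f = lambda x: -x + x**3  # system (2)

# Closed-form solution for x(0) = 2; it blows up at t* = (1/2) log(4/3).
exact = lambda t: math.sqrt(1.0 / (1.0 - math.exp(2.0 * t + math.log(3.0 / 4.0))))
t_star = 0.5 * math.log(4.0 / 3.0)

# Just before the escape time the numerical trajectory tracks the formula
# and is already large and growing fast.
print(exact(0.9 * t_star), rk4(f, 2.0, 0.9 * t_star))

# Inside the region of attraction (|x(0)| < 1) the trajectory decays to 0.
print(rk4(f, 0.5, 10.0, h=1e-3))
```

The same separation-of-variables computation gives, for $x(0) = 1/2$, the bounded solution $x(t) = 1/\sqrt{1 + 3e^{2t}} \to 0$, matching the second printed value.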