I want to obtain the necessary condition(s) such that the equilibrium (i.e., $x=-\frac{b}{a}$) of the following nonlinear (single and not a system of) ODE is asymptotically stable.
$$\frac{dx}{dt}=e^t (ax+b), \qquad a,b \in \mathbb{R}$$
Can I use the Lyapunov stability theorems? If so, how? I know that I can solve this specific ODE analytically and then discuss its asymptotic behavior, but suppose that I do not want to solve it.
Intuitively, it seems clear that this first-order system is asymptotically stable when
$$a < 0,$$
because $b$ only shifts the equilibrium and can be treated as a constant external input.
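As a quick numerical sanity check of this intuition (a sketch, not part of the argument; the helper `simulate` and the chosen values $a=-1$, $b=2$, $x_0=5$ are illustrative), a forward-Euler integration of the ODE with $a<0$ should settle at the equilibrium $x^* = -b/a$:

```python
import math

def simulate(a, b, x0, t_end=3.0, dt=1e-4):
    """Forward-Euler integration of dx/dt = e^t (a x + b)."""
    x, t = x0, 0.0
    while t < t_end:
        x += dt * math.exp(t) * (a * x + b)
        t += dt
    return x

# a < 0: the trajectory should approach the equilibrium x* = -b/a = 2
x_final = simulate(a=-1.0, b=2.0, x0=5.0)
print(x_final)  # close to 2.0
```

The step size must shrink as $t$ grows, since the effective rate $e^t |a|$ makes the problem increasingly stiff; $dt = 10^{-4}$ is small enough on $[0, 3]$.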
Let $a = -\alpha$ with $\alpha > 0$; then the solution is given by
$$x(t) = \frac{\alpha x_{0} - b}{\alpha} e^{\alpha - \alpha e^{t}} + \frac{b}{\alpha},$$
where $x_{0}$ is the initial value. If we set $\alpha = 1$, the solution simplifies to
$$x(t) = \left(x_{0} - b\right) e^{1 - e^{t}} + b.$$
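One can verify the $\alpha = 1$ closed form against the ODE with a finite-difference check (a sketch; the helper name `x_exact` and the sample values $x_0 = 5$, $b = 2$, $t = 0.7$ are illustrative):

```python
import math

def x_exact(t, x0, b):
    # closed form for alpha = 1: x(t) = (x0 - b) e^{1 - e^t} + b
    return (x0 - b) * math.exp(1.0 - math.exp(t)) + b

x0, b, t, h = 5.0, 2.0, 0.7, 1e-6
# central-difference approximation of dx/dt
lhs = (x_exact(t + h, x0, b) - x_exact(t - h, x0, b)) / (2 * h)
# right-hand side e^t (a x + b) with a = -1
rhs = math.exp(t) * (-x_exact(t, x0, b) + b)
print(abs(lhs - rhs))  # tiny: the closed form satisfies the ODE
```

Since the exponent $1 - e^{t} \to -\infty$, the transient term vanishes and $x(t) \to b$, consistent with the equilibrium $-b/a = b/\alpha$.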