Suppose that $$ \lim_{t\rightarrow \infty}\left(\dot{x}(t)+\gamma x(t)\right)=0,\quad \gamma>0. $$ How can I prove that $$ \lim_{t\rightarrow \infty}x(t)=0~? $$ Please give a rigorous proof. Thanks!
Prove a limit with condition specified at infinity
60 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Let $z(t)=\dot x(t)+\gamma x(t)$. Then $(x(t)\mathrm e^{\gamma t})'=z(t)\mathrm e^{\gamma t}$, hence $$x(t)=\mathrm e^{-\gamma t}x(0)+\mathrm e^{-\gamma t}\int_0^tz(s)\mathrm e^{\gamma s}\,\mathrm ds.$$ Now, using the hypothesis that $\gamma\gt0$, a direct epsilon-delta proof (or, in this case, an epsilon-$t_0$ proof...) shows that if $z(t)\to0$ when $t\to\infty$ then $x(t)\to0$ when $t\to\infty$.
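As a numerical sanity check (not a proof), one can integrate $\dot x=-\gamma x+z(t)$ for a sample forcing $z(t)\to0$ and watch $x(t)$ decay. The choices $\gamma=0.5$, $z(t)=1/(1+t)$, $x(0)=1$, and forward Euler stepping below are all illustrative, not from the question:

```python
import math

gamma = 0.5  # illustrative decay rate gamma > 0

def z(t):
    # sample forcing term with z(t) -> 0 as t -> infinity
    return 1.0 / (1.0 + t)

def simulate(T, dt=1e-3):
    """Forward-Euler integration of x' = -gamma*x + z(t) up to time T."""
    x, t = 1.0, 0.0  # arbitrary initial condition x(0) = 1
    while t < T:
        x += dt * (-gamma * x + z(t))
        t += dt
    return x

for T in (10, 100, 1000):
    print(T, simulate(T))
```

As $T$ grows, $x(T)$ shrinks roughly like $z(T)/\gamma$, consistent with the claim that $x(t)\to0$ whenever $z(t)\to0$.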
So, let $\varepsilon$ be positive. There exists a finite $t_0$ such that $|z(t)|\leqslant\varepsilon$ for every $t\geqslant t_0$, hence $|x(t)|\leqslant\ldots$
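For reference, one way to complete the bound left to the reader is to split the integral at $t_0$ (a sketch, using only the formula above and $|z(s)|\leqslant\varepsilon$ for $s\geqslant t_0$): for $t\geqslant t_0$, $$|x(t)|\leqslant\mathrm e^{-\gamma t}|x(0)|+\mathrm e^{-\gamma t}\int_0^{t_0}|z(s)|\mathrm e^{\gamma s}\,\mathrm ds+\varepsilon\,\mathrm e^{-\gamma t}\int_{t_0}^t\mathrm e^{\gamma s}\,\mathrm ds\leqslant\mathrm e^{-\gamma t}\bigl(|x(0)|+C\bigr)+\frac{\varepsilon}{\gamma},$$ where $C=\int_0^{t_0}|z(s)|\mathrm e^{\gamma s}\,\mathrm ds$ is a fixed constant once $t_0$ is chosen. Letting $t\to\infty$ gives $\limsup_{t\to\infty}|x(t)|\leqslant\varepsilon/\gamma$, and since $\varepsilon$ is arbitrary, $x(t)\to0$.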