Prove a limit from a condition specified at infinity


Suppose that $$ \lim_{t\rightarrow \infty}\left(\dot{x}(t)+\gamma x(t)\right)=0,\quad \gamma>0. $$ How can I prove that $$ \lim_{t\rightarrow \infty}x(t)=0~? $$ Please give a rigorous proof. Thanks!



Best answer:

Let $z(t)=\dot x(t)+\gamma x(t)$. Then $(x(t)\mathrm e^{\gamma t})'=z(t)\mathrm e^{\gamma t}$, hence $$x(t)=\mathrm e^{-\gamma t}x(0)+\mathrm e^{-\gamma t}\int_0^tz(s)\mathrm e^{\gamma s}\,\mathrm ds.$$ Now, using the hypothesis that $\gamma\gt0$, a direct epsilon-delta proof (or, in this case, an epsilon-$t_0$ proof...) shows that if $z(t)\to0$ when $t\to\infty$, then $x(t)\to0$ when $t\to\infty$.

So, let $\varepsilon$ be positive, there exists $t_0$ finite such that $|z(t)|\leqslant\varepsilon$ for every $t\geqslant t_0$ hence $|x(t)|\leqslant\ldots$
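One way the estimate can be completed (a sketch of my own, following the integral representation above; it may differ from the bound the answerer had in mind): for $t\geqslant t_0$, split the integral at $t_0$ and use $|z(s)|\leqslant\varepsilon$ on $[t_0,t]$, which gives $$|x(t)|\leqslant\mathrm e^{-\gamma t}|x(0)|+\mathrm e^{-\gamma t}\int_0^{t_0}|z(s)|\mathrm e^{\gamma s}\,\mathrm ds+\mathrm e^{-\gamma t}\int_{t_0}^{t}\varepsilon\,\mathrm e^{\gamma s}\,\mathrm ds.$$ The first two terms tend to $0$ as $t\to\infty$ (the integral over $[0,t_0]$ is a fixed constant), and the last term is at most $\varepsilon/\gamma$. Hence $\limsup_{t\to\infty}|x(t)|\leqslant\varepsilon/\gamma$, and since $\varepsilon>0$ was arbitrary, $x(t)\to0$.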

Another answer:

Hint: Let $y(t) = e^{\gamma t} x(t)$. Then $\lim_{t \to \infty} e^{-\gamma t} \dot{y}(t) = 0$. If $-\epsilon < e^{-\gamma t} \dot{y}(t) < \epsilon$ for $t > T$, what can you say about $y$?
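A sketch of how the hint can be pushed through (my own continuation, not part of the original answer): since $\dot y(t)=\mathrm e^{\gamma t}\left(\dot x(t)+\gamma x(t)\right)$, the bound $|\dot y(t)|<\epsilon\,\mathrm e^{\gamma t}$ for $t>T$ yields, after integrating from $T$ to $t$, $$|y(t)-y(T)|\leqslant\frac{\epsilon}{\gamma}\left(\mathrm e^{\gamma t}-\mathrm e^{\gamma T}\right),$$ so $$|x(t)|=\mathrm e^{-\gamma t}|y(t)|\leqslant\mathrm e^{-\gamma t}|y(T)|+\frac{\epsilon}{\gamma}.$$ Letting $t\to\infty$ and then $\epsilon\to0$ gives $x(t)\to0$.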