$\lim\limits_{t\to\infty}t-x(t)=0\ ?$


Let $\displaystyle\cases{ x'=\frac{t-x}{1+t^2+x^2} & \cr x(1)=1 }$ be the initial value problem. Prove or disprove that $\lim\limits_{t\to\infty}t-x(t)=0$.

We have already proved that $x(t)<t$ for $t>1$; moreover $x'>0\iff x<t$, so the question is well posed. We are not allowed to solve the equation explicitly and read the answer off the solution. WolframAlpha suggests the answer is no; I hope you can help.
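As a sanity check on WolframAlpha's "no", one can integrate the IVP numerically. The sketch below uses a hand-rolled classical RK4 step (the step size and the horizons $t=10,100$ are arbitrary choices, not part of the problem) and watches how $t-x(t)$ evolves:

```python
def f(t, x):
    # Right-hand side of the IVP: x' = (t - x) / (1 + t^2 + x^2)
    return (t - x) / (1.0 + t * t + x * x)

def rk4(f, t0, x0, t_end, h=1e-2):
    """Classical 4th-order Runge-Kutta integration from t0 to t_end."""
    t, x = t0, x0
    while t < t_end - 1e-12:
        step = min(h, t_end - t)
        k1 = f(t, x)
        k2 = f(t + step / 2, x + step * k1 / 2)
        k3 = f(t + step / 2, x + step * k2 / 2)
        k4 = f(t + step, x + step * k3)
        x += step * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += step
    return x

x10 = rk4(f, 1.0, 1.0, 10.0)
x100 = rk4(f, 1.0, 1.0, 100.0)
print(f"t - x at t=10: {10 - x10:.3f}, at t=100: {100 - x100:.3f}")
```

Numerically $t-x(t)$ keeps growing rather than tending to $0$, consistent with $x$ growing only slowly (heuristically $x'\approx 1/t$ for large $t$, suggesting logarithmic-like growth of $x$).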



Best answer:

Let $u(t) = t - x(t)$. Then the problem becomes

$$ \left\{ \begin{array}{c} \frac{du}{dt} = 1 - \frac{u}{1+2t^2 - 2tu + u^2}\\ u(1) = 0 \end{array} \right. $$ since $1+t^2+x^2 = 1+t^2+(t-u)^2 = 1+2t^2-2tu+u^2$. The denominator is always positive (in fact greater than $1$, being $1+t^2+(t-u)^2 \ge 1+t^2$), so if $u(t)\to 0$ the fraction tends to $0$ and $$\lim_{t \rightarrow \infty} u(t) = 0 \ \Rightarrow \lim_{t \rightarrow \infty} \frac{du}{dt} = 1,$$ which contradicts the standard fact that if $\lim_{t\to\infty} u(t)$ exists and is finite and $\lim_{t\to\infty} u'(t)$ exists, then $\lim_{t\to\infty} u'(t)=0$.

So the limit of $t-x$ cannot be zero.
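The calculus fact invoked above can be stated and proved in one line (a standard mean value theorem argument, not specific to this problem):

$$\lim_{t\to\infty} u(t) = L \in \mathbb{R} \quad\text{and}\quad \lim_{t\to\infty} u'(t) = m \quad\Longrightarrow\quad m = 0,$$

since by the mean value theorem $u(t+1)-u(t) = u'(\xi_t)$ for some $\xi_t \in (t,t+1)$; as $t\to\infty$ the left side tends to $L - L = 0$ while the right side tends to $m$.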

Another answer:

Hint: if $\lim\limits_{t\to\infty} t-x(t)=0$, then there exist arbitrarily large values of $t$ with $x'(t)\geq 1/2$. However, for sufficiently large $t$, if $t-x(t)\leq 1$ (which forces $x(t)\geq t/2$), then $x'(t)\leq 1/(1+t^2+t^2/4)\ll 1/2$...
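One way to spell out the hint (filling in the mean value theorem step it relies on): if $t-x(t)\to 0$, then $\frac{x(2t)-x(t)}{t}\to 1$, so the mean value theorem gives arbitrarily large $\xi$ with $x'(\xi)\geq 1/2$. But for all large $t$ with $t-x(t)\leq 1$ we have $x(t)\geq t-1\geq t/2$, hence

$$x'(t) = \frac{t-x(t)}{1+t^2+x(t)^2} \leq \frac{1}{1+t^2+t^2/4} \xrightarrow[t\to\infty]{} 0,$$

contradicting $x'(\xi)\geq 1/2$ at arbitrarily large $\xi$.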