Consider the initial value problem $$\begin{cases} x'=\dfrac{t-x}{1+t^2+x^2}, \\ x(1)=1. \end{cases}$$ Prove or disprove that $\lim\limits_{t\to\infty}\bigl(t-x(t)\bigr)=0$.
We have already proved that $x(t)<t$ for all $t>1$; moreover $x'>0\iff x<t$, so the question is meaningful. We are not allowed to solve the equation explicitly and read the answer off the solution. WolframAlpha suggests the limit is not zero; I hope you can help.
Let $u(t) = x(t) - t$, so that $t - x = -u$ and $x = t + u$. Since $\frac{du}{dt} = \frac{dx}{dt} - 1$ and $1+t^2+x^2 = 1+2t^2+2tu+u^2$, the problem becomes
$$ \left\{ \begin{array}{l} \dfrac{du}{dt} = -\dfrac{u}{1+2t^2 + 2tu + u^2} - 1,\\[2mm] u(1) = 0. \end{array} \right. $$ The denominator $1+2t^2 + 2tu + u^2 = 1+t^2+x^2$ is always at least $1$, so if $u(t)\to 0$ then the fraction tends to $0$ as well, and
$$\lim_{t \rightarrow \infty} u(t) = 0 \ \Rightarrow \ \lim_{t \rightarrow \infty} \frac{du}{dt} = -1,$$ which is impossible: by the mean value theorem, $u(t+1) - u(t) = u'(\xi_t)$ for some $\xi_t \in (t, t+1)$, and the left-hand side would tend to $0$ while the right-hand side tends to $-1$. (Any of the standard results relating the limit of a function to the limit of its derivative gives the same contradiction; pick the one from your course.)
So the limit of $t-x(t)$ cannot be zero.
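As a numerical sanity check (not a proof), here is a minimal hand-rolled classical RK4 integration of the original IVP in Python; the step size and the horizons are arbitrary choices of mine. It shows $t - x(t)$ growing rather than shrinking, consistent with the argument above (heuristically, $x' \approx 1/t$ for large $t$, so $x$ grows only like $\ln t$).

```python
# Numerical sanity check (not a proof): integrate
#   x' = (t - x)/(1 + t^2 + x^2),  x(1) = 1
# with classical 4th-order Runge-Kutta and watch t - x(t).

def f(t, x):
    """Right-hand side of the ODE."""
    return (t - x) / (1.0 + t * t + x * x)

def rk4(t0, x0, t_end, h=1e-3):
    """Classical RK4 from t0 to t_end with fixed step h; returns x(t_end)."""
    t, x = t0, x0
    n = int(round((t_end - t0) / h))
    for _ in range(n):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h * k1 / 2)
        k3 = f(t + h / 2, x + h * k2 / 2)
        k4 = f(t + h, x + h * k3)
        x += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return x

for t_end in (2, 5, 10, 50):
    x = rk4(1.0, 1.0, t_end)
    print(f"t = {t_end:4}:  t - x(t) = {t_end - x:.4f}")
```

The printed values of $t - x(t)$ increase steadily (close to $t - 1 - \ln t$), so the difference diverges instead of tending to $0$, in agreement with the proof.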