Let $f$ be differentiable on $(a,b)$. Suppose $$f'+f^2+1 \ge 0,$$ $$\lim_{x\to a^+}f(x)=+\infty \quad\text{and}\quad \lim_{x\to b^-}f(x)=-\infty.$$ Find an interval in which $b-a$ must lie.
Find interval of definition
51 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) at 2026-04-29 20:19:04
We have
$$\frac{df}{dx} + f^2 + 1 \ge 0 \implies \frac{f'(x)}{f(x)^2+1} \ge -1.$$
Integrating this over any $[s,t] \subset (a,b)$ gives
$$\arctan f(t) - \arctan f(s) \ge -(t-s),$$
which says exactly that
$$g(x) = \arctan f(x) + x$$
is nondecreasing on $(a,b)$.
From the boundary conditions,
$$\lim_{x\to a^+} g(x) = \frac\pi2 + a, \qquad \lim_{x\to b^-} g(x) = -\frac\pi2 + b,$$
and since $g$ is nondecreasing, the limit at $b^-$ cannot be smaller than the limit at $a^+$:
$$-\frac\pi2 + b \ge \frac\pi2 + a.$$
Hence $b-a \ge \pi$.
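A quick numerical sanity check of the key monotonicity fact: for any $f$ with $f'+f^2+1\ge 0$, the function $g(x)=\arctan f(x)+x$ has $g'=f'/(1+f^2)+1\ge 0$, so $g$ is nondecreasing. The test case below is my own choice (not from the problem): $f(x)=-\tan x$ on $(-\pi/2,\pi/2)$, the equality case $f'+f^2+1=0$, which has the required limits $f\to+\infty$ at $a=-\pi/2$ and $f\to-\infty$ at $b=\pi/2$.

```python
import math

# Test case (my choice): f(x) = -tan(x) on (-pi/2, pi/2), which satisfies
# f' + f^2 + 1 = -sec^2 x + tan^2 x + 1 = 0, with f -> +inf at the left
# endpoint and f -> -inf at the right endpoint, and b - a = pi.

def f(x):
    return -math.tan(x)

def f_prime(x):
    return -1.0 / math.cos(x) ** 2  # d/dx(-tan x) = -sec^2 x

def g(x):
    return math.atan(f(x)) + x      # g' = f'/(1+f^2) + 1 >= 0

# Stay slightly away from the endpoints to avoid overflow in sec^2.
n = 2000
a, b = -math.pi / 2 + 0.01, math.pi / 2 - 0.01
xs = [a + k * (b - a) / (n - 1) for k in range(n)]

# For this extremal f the ODE residual should vanish up to rounding noise.
max_residual = max(abs(f_prime(x) + f(x) ** 2 + 1) for x in xs)
print(max_residual)  # floating-point noise only

# g should be nondecreasing; for this extremal f it is in fact constant (= 0),
# since arctan(-tan x) = -x on (-pi/2, pi/2).
gs = [g(x) for x in xs]
nondecreasing = all(y2 >= y1 - 1e-9 for y1, y2 in zip(gs, gs[1:]))
print(nondecreasing)
```

This is only a spot check of one solution, of course; the monotonicity of $g$ is what the integration step proves in general.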
I believe this is the most we can say, and the bound is sharp: $f(x) = -\tan x$ on $(-\pi/2, \pi/2)$ attains $b-a = \pi$; more generally $f(x) = -\tan(cx)$ with $0 < c \le 1$ satisfies $f'+f^2+1 = (1-c)\sec^2(cx) \ge 0$ and gives $b-a = \pi/c$; and $f(x) = -x$ on $\mathbb{R}$ (where $f'+f^2+1 = x^2 \ge 0$) shows $b-a = \infty$ is possible. So $b-a$ can be any value in $[\pi, \infty]$.
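The scaled family can be checked numerically as well. The construction $f_c(x) = -\tan(cx)$ for $0 < c \le 1$ and the helper `min_residual` below are my own (not from the original post); the claim being verified is that $f_c' + f_c^2 + 1 = (1-c)\sec^2(cx) \ge 0$ on $(-\pi/(2c), \pi/(2c))$, an interval of length $\pi/c$.

```python
import math

# Family (my construction): f_c(x) = -tan(c*x) with 0 < c <= 1, defined on
# (-pi/(2c), pi/(2c)).  Analytically,
#   f_c' + f_c^2 + 1 = -c*sec^2(cx) + tan^2(cx) + 1 = (1 - c) * sec^2(cx) >= 0,
# with f_c -> +inf at the left endpoint and -inf at the right, so every
# interval length b - a = pi/c in [pi, inf) is realized.

def min_residual(c, n=2000):
    """Minimum of f_c' + f_c^2 + 1 over a grid in the domain of f_c."""
    half = math.pi / (2 * c) - 0.01 / c  # keep away from the asymptotes
    xs = [-half + k * (2 * half) / (n - 1) for k in range(n)]
    res = []
    for x in xs:
        fp = -c / math.cos(c * x) ** 2   # f_c'(x) = -c * sec^2(cx)
        res.append(fp + math.tan(c * x) ** 2 + 1)
    return min(res)

for c in (1.0, 0.5, 0.25):
    length = math.pi / c                 # b - a for this value of c
    print(c, length, min_residual(c) >= -1e-9)
```

For $c = 1$ the residual is identically zero (the extremal case), so the grid minimum is only floating-point noise; for $c < 1$ it is bounded below by $1-c > 0$.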