How do I show that $f(x) = { 1\over x^2 + \ln^2x}$ is bounded?
Let $g(x) = x^2 + \ln^2x$.
The domain of $f$ is $x > 0$. For the denominator we have $x^2 > 0$ and $\ln^2 x \ge 0$, hence $g(x) > 0$ and so $f(x) = {1 \over g(x)} > 0$ for every $x$ in the domain. For large $x$, $g(x)$ tends to infinity, so ${1 \over g(x)}$ tends to $0$. From these observations we get the lower bound $f(x) > 0$.
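As a quick numerical sanity check of this (plain Python, no external libraries; the sample points are arbitrary choices), $f$ is positive everywhere it was evaluated and becomes small both as $x \to 0^+$ and as $x \to \infty$:

```python
import math

def f(x):
    # f(x) = 1 / (x^2 + (ln x)^2), defined for x > 0
    return 1.0 / (x**2 + math.log(x)**2)

# positive on the whole sampled range, small at both ends of the domain
xs = [1e-6, 0.01, 0.5, 1.0, 10.0, 1e6]
for x in xs:
    assert f(x) > 0
print(f(1e-6), f(1.0), f(1e6))
```

Note that $f(1) = 1$ exactly, since $\ln 1 = 0$.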
But how do I show that the function has an upper bound? After plotting the graph it's clear that one exists, but I couldn't find a way to show it analytically. I'm not supposed to use derivatives, but even if I could, differentiation gives:
$$ f'(x) = -\frac{2(x^2 + \ln(x))}{x(x^2 + \ln^2(x))^2} $$
Then I would need to solve $f'(x) = 0$, i.e. $x^2 + \ln x = 0$, which I assume is solvable only with Lambert's $W$-function.
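For what it's worth, the critical equation is indeed solvable with Lambert's $W$: from $x^2 + \ln x = 0$ we get $\ln(x^2) = -2x^2$, i.e. $2x^2 e^{2x^2} = 2$, so $2x^2 = W(2)$ and $x^* = \sqrt{W(2)/2}$. A sketch in plain Python (the `lambert_w` helper below is my own Newton iteration, not a library function; SciPy's `scipy.special.lambertw` would do the same job):

```python
import math

def lambert_w(z, tol=1e-12):
    # Newton iteration for the principal branch: solve w * e^w = z, z > 0
    w = math.log(1.0 + z)  # rough starting guess
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - z) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

# critical point of f: x^2 + ln x = 0  =>  x* = sqrt(W(2)/2)
x_star = math.sqrt(lambert_w(2.0) / 2.0)
M = 1.0 / (x_star**2 + math.log(x_star)**2)
print(x_star, M)
```

This puts the critical point near $x \approx 0.65$ and the maximum value near $1.64$, consistent with a plot of $f$.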
You can define a new continuous function $$ f(x)=\begin{cases} \dfrac{1}{x^2+(\ln x)^2} & x>0 \\[4px] 0 & x=0 \end{cases} $$ (continuous at $0$ because $(\ln x)^2\to\infty$ as $x\to0^+$) and note that $f(1)=1$. Since $$ \lim_{x\to\infty}f(x)=0 $$ there exists $a\ge1$ so that $f(x)<1$ for every $x>a$.
The function $f$, restricted to the interval $[0,a]$, is continuous, so by the extreme value theorem it attains a maximum value $M\ge f(1)=1$, which is also the maximum of $f$ over $(0,\infty)$.
Thus $0\le f(x)\le M$ for every $x\in(0,\infty)$ and so the function is bounded.
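A crude numerical sanity check of this bound (a sketch in plain Python; the grid window $(0, 20]$ and step size are arbitrary choices, justified by the fact that $f$ is tiny outside that window):

```python
import math

def f(x):
    return 1.0 / (x**2 + math.log(x)**2)

# grid search for the supremum of f on (0, 20]; f decays to 0 beyond this window
best = max(f(0.001 + 0.0001 * k) for k in range(200000))
print(best)
```

The sampled maximum comes out around $1.64$, so $M$ is a modest constant rather than anything exotic.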