Under which assumptions on $f$ can we deduce that $g\in L^\infty$?


Let $f:\mathbb{R}^*_+\to\mathbb R$ be of class $C^\infty$ satisfying $$|f(x)|\le c_0 |\log(x)| \quad\text{ for } 0<x\ll 1,$$ for some constant $c_0>0$.

Let $c_1>0$ be a constant. As an exercise for my Calculus II class, I need to find suitable assumption(s) on $f$ which guarantee that $$g(x):=1+\frac{1}{x^2} + [c_1 + \log(x)+ f(x)]^2 \in L^\infty.$$

$\textbf{MY ATTEMPT:}$ In my opinion, the problem arises when approaching the point $0$. I think that if $f\equiv 0$, then $g\not\in L^{\infty}$, since both $1/x^2$ and $\log(x)^2$ diverge as $x$ approaches $0$.

Also, I think that under the assumption in the first line we still have $g\not\in L^{\infty}$, since $$c_1 + \log(x)+ f(x) \le c_1 + \log(x) + c_0 |\log(x)|\quad \text{for } 0<x\ll 1,$$ so one runs into the same problem. I am pretty confident that the actual question should be: "under which assumptions on $f$ do we have $g\in L^\infty$?"
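In fact, here is a one-line way to see the divergence near $0$ (my own sketch, so please double-check): all three terms of $g$ are nonnegative, hence for *every* choice of $f$,

```latex
g(x) \;=\; 1 + \frac{1}{x^2} + \bigl[c_1 + \log(x) + f(x)\bigr]^2
\;\ge\; \frac{1}{x^2} \;\xrightarrow[x\to 0^+]{}\; +\infty,
```

so no assumption on $f$ can tame the behavior at the origin; assumptions on $f$ can only control the bracketed term, e.g. away from $0$ or as $x\to\infty$.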

Thus my questions are: does my argument above hold? And under which assumptions on $f$ is it possible to say that $g\in L^\infty$?

Thank you in advance.