I saw in a paper that if $A=\mathcal{O}(p^2)$ and $B=\mathcal{O}(p)$, then $A^{-1/2}B$ diverges. How can we say that?
The way I thought about it is: if $A = \mathcal{O}(p^2)$, then $A^{-1/2} = \mathcal{O}(p^{-1})$, so $A^{-1/2}B = \mathcal{O}(p^{-1})\,\mathcal{O}(p) = \mathcal{O}(1)$. If that is right, how can we say it diverges?
Any help is greatly appreciated.
We say that $f = O(g)$ (as $x \to \infty$) if there exist a constant $M > 0$ and a point $x_0 \in \mathbb{R}$ such that $$|f(x)| \le M | g(x) |$$ for all $x > x_0$. This only gives upper bounds. In particular, $A = O(p^2)$ does not imply $A^{-1/2} = O(p^{-1})$: an upper bound on $A$ becomes a *lower* bound on $A^{-1/2}$, not an upper one.

Take for instance $A(p) = 1$ (a constant function) and $B(p) = 1/p$. Then trivially $A = O(p^2)$ and $B = O(p)$, yet $A^{-1/2} B = 1/p$, which converges to $0$ as $p \to \infty$. So from these assumptions alone we cannot conclude that $A^{-1/2}B$ diverges.
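The counterexample above can be checked numerically. A minimal sketch (the function names `A` and `B` are just labels for the counterexample, not anything from the paper):

```python
def A(p):
    # Constant function: trivially A = O(p^2), since 1 <= p^2 for p >= 1.
    return 1.0

def B(p):
    # B(p) = 1/p: trivially B = O(p), since 1/p <= p for p >= 1.
    return 1.0 / p

# A^{-1/2} * B = 1 * (1/p) = 1/p, which shrinks toward 0 as p grows,
# so it certainly does not diverge.
for p in [10, 100, 1000, 10000]:
    print(p, A(p) ** -0.5 * B(p))
```

Running this prints values $0.1, 0.01, 0.001, 0.0001$ — the product tends to $0$, illustrating that the big-$O$ hypotheses alone cannot force divergence.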