In my textbook, there is the following theorem:
For all $x>0$, we have $$\psi(x)=\sum_{\alpha=1}^\infty\theta(x^{1/\alpha})$$ and hence $$\psi(x)=\theta(x)+O(\sqrt x\log x).$$
Here $\theta(x)=\sum_{p\le x}\log p$ and $\psi(x)=\sum_{p^\alpha\le x}\log p$ are the Chebyshev functions. The first formula is proven with a little sum manipulation, but the second formula is not explained.
My guess is the author is writing $\sum_{\alpha=1}^\infty\theta(x^{1/\alpha})=\theta(x)+\sum_{\alpha=2}^\infty\theta(x^{1/\alpha})$, and the leading term of the remainder is $\theta(\sqrt x)$. Since this is introductory material we don't yet have the PNT, so we can't use $\theta(x)\sim x$, but a simple estimate gives $\theta(x)\le x\log x$, so the leading term is indeed $O(\sqrt x\log x)$, since $\sqrt x\log(\sqrt x)=\frac12\sqrt x\log x$. But it is not obvious to me that the other terms, of order $x^{1/3}\log x$, $x^{1/4}\log x$, etc., don't overwhelm the first term if there are enough of them. A naive bound on $\alpha$ gives $\alpha\le \log_2 x$ (since $\theta(t)=0$ for $t<2$), but this is much too weak: bounding each of those $\log_2 x$ terms by the largest, $x^{1/2}\log x$, only gives $O(\sqrt x\log^2 x)$. And $\sum_\alpha x^{1/\alpha}$ does not converge, since its terms tend to $1$.
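As a numerical sanity check of this guess (a sketch only, not part of any proof; `primes_up_to` and `theta` are helper names I made up), one can compare the leading term $\theta(\sqrt x)$ with the full tail $\sum_{\alpha\ge2}\theta(x^{1/\alpha})$:

```python
from math import log, sqrt

def primes_up_to(n):
    """Primes up to n via a simple sieve of Eratosthenes."""
    if n < 2:
        return []
    flags = [True] * (n + 1)
    flags[0] = flags[1] = False
    for i in range(2, int(n**0.5) + 1):
        if flags[i]:
            flags[i*i::i] = [False] * len(flags[i*i::i])
    return [i for i, f in enumerate(flags) if f]

def theta(x):
    """Chebyshev theta: sum of log p over primes p <= x."""
    return sum(log(p) for p in primes_up_to(int(x)))

x = 10**6
# Tail of the identity: sum over alpha >= 2. The sum stops at
# alpha = log_2(x), since theta(t) = 0 for t < 2.
tail = sum(theta(x ** (1.0 / a)) for a in range(2, int(log(x, 2)) + 1))
leading = theta(sqrt(x))
print(leading, tail, sqrt(x) * log(x))
```

For $x=10^6$ the tail is only slightly larger than its leading term $\theta(\sqrt x)$, and comfortably below $\sqrt x\log x$, consistent with the theorem.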
We can do the following elementary estimate: \begin{align*} \psi(x)-\theta(x) & =\sum_{p^m\le x}\log(p)-\sum_{p\le x}\log(p)\\ & =\sum_{p\le \sqrt{x}}\log(p)\sum_{2\le m\le\frac{\log(x)}{\log(p)}}1 \\ & \le \sum_{p\le \sqrt{x}}\log(p)\left\lfloor\frac{\log(x)}{\log(p)}\right\rfloor \\ & \le \sum_{p\le \sqrt{x}}\log(x)=\pi(\sqrt{x})\log(x) \\ & \le \sqrt{x}\log(x). \end{align*} (In the second line, a prime $p$ contributes a power $p^m\le x$ with $m\ge 2$ only when $p\le\sqrt x$.)
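This bound can be checked numerically (a sketch, assuming nothing beyond the definitions above; `primes_up_to`, `theta`, and `psi` are names I chose for the helpers):

```python
from math import log, sqrt

def primes_up_to(n):
    """Primes up to n via a simple sieve of Eratosthenes."""
    if n < 2:
        return []
    flags = [True] * (n + 1)
    flags[0] = flags[1] = False
    for i in range(2, int(n**0.5) + 1):
        if flags[i]:
            flags[i*i::i] = [False] * len(flags[i*i::i])
    return [i for i, f in enumerate(flags) if f]

def theta(x):
    """Chebyshev theta: sum of log p over primes p <= x."""
    return sum(log(p) for p in primes_up_to(int(x)))

def psi(x):
    """Chebyshev psi: sum of log p over prime powers p^m <= x."""
    total = 0.0
    for p in primes_up_to(int(x)):
        pk = p
        while pk <= x:  # p, p^2, ..., each contributes log p
            total += log(p)
            pk *= p
    return total

for x in [10**3, 10**4, 10**5, 10**6]:
    print(x, psi(x) - theta(x), sqrt(x) * log(x))
```

In each case $\psi(x)-\theta(x)$ stays well below $\sqrt x\log x$, matching the estimate.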