I've been going over some topics in number theory that are at times quite difficult to grasp: I understand full well where the main terms of the equations come from, but I don't understand how the authors arrived at the error terms, or why.
Big O, little o, and a couple of other well-recognized asymptotic relations are all defined in the opening of Hardy and Wright's Introduction to the Theory of Numbers, but the authors don't spend much time on them. Other authors are similar in their treatment of error terms: the reader is expected to understand immediately the source of the error, its magnitude, and the justification for using such a bound.
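For concreteness, the definitions as commonly stated (e.g. in Hardy and Wright) are: $f(x) = O(g(x))$ means there is a constant $C > 0$ such that $|f(x)| \le C\,g(x)$ for all sufficiently large $x$, while $f(x) = o(g(x))$ means
$$\lim_{x \to \infty} \frac{f(x)}{g(x)} = 0.$$
A typical instance of an error term is the estimate for the harmonic sum,
$$\sum_{n \le x} \frac{1}{n} = \log x + \gamma + O\!\left(\frac{1}{x}\right),$$
where the $O(1/x)$ records that the difference between the sum and $\log x + \gamma$ is bounded by a constant multiple of $1/x$.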
So I was wondering if the community could point me toward a good reference on error bounds, so I can learn how they are properly derived and used.
Apostol's Introduction to Analytic Number Theory is quite good. The first several chapters serve as an extended introduction to asymptotic notation and arguments with error terms. Note: the book does expect a certain familiarity with calculus.