In an assignment, I was asked to numerically compute the value of an integral up to "six digits of accuracy".
What precisely does this accuracy requirement mean, mathematically speaking? What, for example, is the difference between 5 and 6 digits of accuracy?
It cannot just be the number of correct digits, because this would be impossible to determine with certainty without knowing the true value of the integral (which I do not know).
You are right, if you take the assignment literally. Different people might mean different things by "up to $x$ digits of accuracy", though. The literal definition relies on decimal notation, as you pointed out in some comments. Indeed, $(1)_{10} = (1)_2$ and $(3)_{10} = (11)_2$ (decimal and binary notation), so in decimal notation $1$ and $3$ have no digit in common, but they do have one digit in common if you look at them in binary notation. Therefore, counting matching digits cannot be the "correct" definition. To me, the definition that makes more sense is that an approximation $\tilde I$ of the true value $I$ has $d$ digits of accuracy if $|I - \tilde I| \le \frac{1}{2}\cdot 10^{-d}$, i.e. the error is at most half a unit in the $d$-th decimal place. This is a bound on the error itself, not a comparison of digit strings, so it does not suffer from the base-dependence above.
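As a sanity check of this error-bound definition, here is a minimal sketch (the function name `digits_of_accuracy` is my own, for illustration) that computes the largest $d$ for which a given approximation satisfies $|I - \tilde I| \le \frac{1}{2}\cdot 10^{-d}$:

```python
import math

def digits_of_accuracy(true_value, approx):
    """Largest integer d with |true_value - approx| <= 0.5 * 10**(-d)."""
    err = abs(true_value - approx)
    if err == 0:
        return float("inf")  # exact match: every digit is correct
    # err <= 0.5 * 10**(-d)  <=>  d <= -log10(2 * err)
    return math.floor(-math.log10(2 * err))  # may be negative for large errors

# Truncating pi after 5 decimals gives 5 digits of accuracy,
# rounding it to 6 decimals gives 6.
print(digits_of_accuracy(math.pi, 3.14159))    # 5
print(digits_of_accuracy(math.pi, 3.141593))   # 6
```

Note that under this definition you still need an error bound for your quadrature rule (not the true value of the integral) to certify six digits; that is exactly what error estimates for numerical integration provide.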
Remarks: