What precisely are digits of accuracy when computing a numerical quantity?


In an assignment, I was asked to numerically compute the value of an integral up to "six digits of accuracy".

What precisely does this accuracy requirement mean, mathematically speaking? What, for example, is the difference between 5 and 6 digits of accuracy?

It cannot just be the number of correct digits, because this would be impossible to determine with certainty without knowing the true value of the integral (which I do not know).


3 Answers

BEST ANSWER

You are right, if you take the assignment literally. Different people mean different things by "up to $x$ digits of accuracy", though. The literal definition depends on the base of the notation, as you pointed out in the comments: $(1)_{10} = (1)_2$ and $(3)_{10} = (11)_2$ (decimal and binary notation), so in decimal notation $1$ and $3$ have no digit in common, yet they do share a digit in binary notation. Counting matching digits therefore cannot be the "correct" definition. To me, the definition that makes the most sense is:

The approximation is said to be correct up to the $x$-th decimal digit if the absolute value of the difference between the true and the approximate result is smaller than $10^{-x}$, with $x \ge 0$.

Remarks:

  • This is something that can indeed be verified (via an error bound on the method), regardless of what the digits of the true result happen to be. And it is the only thing that really matters: we care about how large the error is, not about which digits of the approximation match.
  • Decimal notation somehow "appears" in my definition too, but only because people conventionally work in base ten. It is a matter of convention: computing in another base would not change the verdict, once we have agreed on the maximal accepted error.
  • This definition agrees with the intuitive one in most cases, but also works for your specific example.
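The definition above can be sketched in a few lines of code. This is a minimal illustration, not part of the answer; the integral value and approximation below are made-up placeholders.

```python
# Hypothetical example values: pretend true_value is the exact result
# of the integral and approx is the numerical estimate.
true_value = 0.7468241328124271
approx = 0.746824

def correct_to_x_digits(approx, true, x):
    """The definition above: correct up to the x-th decimal digit
    means |approx - true| < 10**(-x)."""
    return abs(approx - true) < 10.0 ** (-x)

print(correct_to_x_digits(approx, true_value, 6))  # True: error is about 1.3e-7
print(correct_to_x_digits(approx, true_value, 7))  # False: error exceeds 1e-7
```

Note that the check depends only on the size of the error, not on any digit-by-digit comparison, which is exactly the point of the definition.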

ANSWER

To be sure, you should ask your instructor to confirm what they mean by their request. However, in my experience, asking for a certain number of digits of accuracy usually means the number of digits counted from the left, with the value often written in scientific notation (which avoids ambiguity for numbers over $10^6$, say, whose trailing digits lie to the left of the decimal point). For example, $2.74691 \times 10^{8}$, $5.39871 \times 10^{1}$, $6.21091 \times 10^{-12}$, etc., are all expressed using $6$ digits of accuracy.

As for not being able to determine these digits with certainty without knowing the true value: unless the approximation ends in boundary digits such as $0$ or $9$ (which can flip with a tiny change in value), this is generally an issue only for the last digit, and even that digit is settled if you can bound the error below one-half of the place value of the last digit, i.e., the printed last digit is then more likely correct than any other. For example, for $3.87215 \times 10^{-3}$ to be accurate to $6$ digits, the possible error should be less than $5.0 \times 10^{-3-6} = 5.0 \times 10^{-9}$.
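The half-of-the-last-digit rule above is easy to compute. This is a sketch under the answer's convention ($d$ significant digits of a value written in scientific notation); the function name is my own.

```python
import math

def sig_digit_tolerance(value, d):
    """Allowed error for `value` to be accurate to d significant digits:
    half the place value of the d-th digit, i.e. 5 * 10**(e - d),
    where e is the decimal exponent in scientific notation."""
    e = math.floor(math.log10(abs(value)))  # exponent in scientific notation
    return 5.0 * 10.0 ** (e - d)

# The answer's example: 3.87215e-3 to 6 significant digits.
print(sig_digit_tolerance(3.87215e-3, 6))  # ≈ 5.0e-9
# One of the earlier examples: 2.74691e8 to 6 significant digits.
print(sig_digit_tolerance(2.74691e8, 6))   # ≈ 5.0e2
```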

ANSWER

I think the crux of your confusion is in the last paragraph, where you claim that the phrase "digits of accuracy"

...cannot just be the number of correct digits, because this would be impossible to determine with certainty without knowing the true value of the integral...

But this is simply wrong. When you were asked to compute the integral correct to six digits of accuracy, you were asked to approximate or estimate it with a specified error tolerance. In other words, you want to bound the integral above and below by values that agree to at least six digits; the estimated value of the integral then lies within that range.

There are many methods for estimating an integral this way; one, for example, is Simpson's rule.
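As a sketch of the idea (not part of the answer): below is composite Simpson's rule combined with a common practical stopping criterion, where the step is halved until two successive estimates agree to within the target tolerance. This illustrates how the error can be controlled without knowing the true value; the test integral and tolerance are my own choices.

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule on [a, b] with n subintervals (n even)."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

def integrate_to_tolerance(f, a, b, tol=1e-6):
    """Keep doubling n until two successive Simpson estimates
    differ by less than tol, then return the finer estimate."""
    n, prev = 2, simpson(f, a, b, 2)
    while True:
        n *= 2
        cur = simpson(f, a, b, n)
        if abs(cur - prev) < tol:
            return cur
        prev = cur

# Example: the integral of sin over [0, pi] is exactly 2.
approx = integrate_to_tolerance(math.sin, 0.0, math.pi, tol=1e-6)
print(abs(approx - 2.0) < 1e-6)  # True
```

Because Simpson's rule converges quickly, agreement of successive refinements is in practice a conservative bound on the remaining error, which is what "six digits of accuracy" asks you to certify.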