How would I go about calculating the precision and accuracy of a given number?
For example, 0.05 has an accuracy of 2 and a precision of 3, while 1 has an accuracy of 0 and a precision of 1.
Is there an algorithm for calculating this?
You need to know the correct (or reference) value to get the accuracy of a number, which is presumably a measurement of something. So in your question you've actually just stated the accuracy of those numbers; I'll take your word for it.
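If you did have a reference value, accuracy in the measurement sense is just the size of the error against it. A minimal sketch, assuming a known reference (the function name and arguments here are just illustrative):

```python
def absolute_error(measured: float, reference: float) -> float:
    # Accuracy in the measurement sense: how far the measured
    # value lies from the known reference value.
    return abs(measured - reference)

print(absolute_error(0.05, 0.048))  # ~0.002 (up to float rounding)
```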
Precision is usually expressed as the finest gradation of your measurement. So the precision for $0.05$ could be $0.01$, $0.05$, or even $0.025 = 1/40$. Again, it depends on how you're measuring.
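If what you actually want is an algorithm that reads these quantities off the written form of the number (as in your examples), here's a minimal Python sketch using the standard `decimal` module. It assumes the number arrives as a string, since a float no longer remembers how many digits it was written with; `written_precision` is just an illustrative helper name:

```python
from decimal import Decimal

def written_precision(s: str):
    """Read precision-related quantities off a number exactly as written."""
    d = Decimal(s)
    exponent = d.as_tuple().exponent            # Decimal("0.05") -> -2
    gradation = Decimal(1).scaleb(exponent)     # finest step: 10**exponent
    decimal_places = max(0, -exponent)          # your "accuracy": 0.05 -> 2
    total_digits = sum(c.isdigit() for c in s)  # your "precision": 0.05 -> 3
    return gradation, decimal_places, total_digits

print(written_precision("0.05"))  # (Decimal('0.01'), 2, 3)
print(written_precision("1"))     # (Decimal('1'), 0, 1)
```

Note that this only recovers the gradation implied by the last written digit; as said above, the true gradation of the instrument could just as well be $0.05$ or $0.025$, and no algorithm can recover that from the number alone.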