Metric for precision of a decimal number


I am cleansing a database of cities and trying to increase the precision of their latitudes and longitudes. I am comparing our values against an outside data source and want to determine, for any given city, whether its latitude and longitude are likely more precise in our database or in the outside data source.

So, for example:

  • 1.54323 is likely more precise than 1.5
  • 1.66591 is likely more precise than 1.6667
  • 1.81839 is likely more precise than 1.818181

Is there a simple mathematical way to determine the likely precision of a decimal number?
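One sketch of such a test (my own heuristic, not a standard method): call a value "suspiciously simple" if a fraction with a small denominator lies within one unit of its last printed decimal place, so the value could plausibly be that fraction rounded or truncated. Python's `fractions.Fraction.limit_denominator` finds the best such fraction; the denominator cutoff of 100 below is an arbitrary choice, not something from the question.

```python
from fractions import Fraction

def looks_rounded(s: str, max_den: int = 100) -> bool:
    """True if the value is within one unit in its last printed
    decimal place of a fraction with denominator <= max_den, i.e.
    it could plausibly be a rounded or truncated simple fraction."""
    digits = len(s.partition(".")[2])       # decimal places shown
    ulp = Fraction(1, 10 ** digits)         # one unit in the last place
    x = Fraction(s)                         # exact value of the string
    simple = x.limit_denominator(max_den)   # closest "nice" fraction
    return abs(x - simple) <= ulp
```

On the three example pairs above this classifies 1.5, 1.6667, and 1.818181 as suspiciously simple (they sit within one last-place unit of 3/2, 5/3, and 20/11 respectively) and 1.54323, 1.66591, and 1.81839 as not, matching the intuition about which member of each pair is more precise.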

Clarified below

Maybe another way to approach this would be to ask whether there is a simple way to reverse a decimal expansion, i.e., to recover the simple fraction a decimal was likely rounded from:

  • .5 -> 1/2
  • .667 -> 2/3
  • .8181 -> 9/11

If I could get this far, then I could probably establish a rough precision metric simply by summing or multiplying the numerator and denominator.
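This reversal can be done with continued fractions: run the Euclidean algorithm on the decimal's exact numerator and denominator, and stop at the first convergent that reproduces the decimal to within one unit in its last place (covering both rounding and truncation). A minimal sketch in Python; the stopping tolerance and the `complexity` score are my illustrative choices, not established definitions:

```python
from fractions import Fraction

def reverse_decimal(s: str) -> Fraction:
    """Return the simplest fraction that agrees with the decimal
    string to within one unit in its last printed decimal place."""
    digits = len(s.partition(".")[2])   # decimal places shown
    tol = Fraction(1, 10 ** digits)     # one unit in the last place
    x = Fraction(s)                     # exact, e.g. "0.8181" -> 8181/10000
    n, d = x.numerator, x.denominator
    h0, k0, h1, k1 = 0, 1, 1, 0         # convergent recurrence seeds
    while d:
        a, r = divmod(n, d)             # next continued-fraction term
        h0, k0, h1, k1 = h1, k1, a * h1 + h0, a * k1 + k0
        c = Fraction(h1, k1)            # current convergent
        if abs(c - x) <= tol:
            return c
        n, d = d, r
    return x

def complexity(s: str) -> int:
    """Rough precision metric: numerator + denominator of the
    recovered fraction (smaller = more likely a rounded value)."""
    f = reverse_decimal(s)
    return f.numerator + f.denominator
```

For the examples above, `reverse_decimal("0.5")` gives 1/2, `reverse_decimal("0.667")` gives 2/3, and `reverse_decimal("0.8181")` gives 9/11, so the summing variant of the metric scores them 3, 5, and 20; a genuinely measured coordinate yields a fraction with a far larger score.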

(There might be a better tag for this. Feel free to add one)