I have the following situation.
I am scanning a circular target with a known radius and location. I receive back points that hit the target, each with a reported location. However, because my scan is not perfectly precise, the reported locations do not necessarily fall within the target. All I know is that they actually do come from hits on the target. I'd like to calculate the precision of a single point based on my returned points. I can assume that the precision is the same in both the x and y directions.
My assumption is that I should do the following:
a) Calculate the mean of the returned points
b) Calculate a variance based on the distance of each point from the mean
c) Subtract from the variance of my observed points the variance the points would have if they hit every location on the target with equal probability
d) The resulting variance will represent the variance of a single point.
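To make the steps above concrete, here is a minimal sketch of how I imagine it working, assuming the true hits are uniform on the disk and the noise is isotropic Gaussian. (A uniform distribution on a disk of radius R has per-axis variance R²/4, which is the quantity I'd subtract in step c. The function name and the clamping at zero are my own choices, not something from a library.)

```python
import numpy as np

def estimate_point_sigma(points, R):
    """Estimate the per-axis standard deviation of the measurement noise.

    Assumes the true hit positions are uniform on a disk of radius R
    and the noise is the same in x and y. `points` is an (N, 2) array
    of measured x, y positions.
    """
    pts = np.asarray(points, dtype=float)
    # Per-axis sample variance around the mean, pooled over x and y
    # since the precision is assumed equal in both directions.
    var_obs = pts.var(axis=0, ddof=1).mean()
    # Per-axis variance of a uniform distribution on a disk of radius R.
    var_disk = R**2 / 4.0
    var_noise = var_obs - var_disk
    if var_noise < 0:
        # Sampling error can push the estimate below zero; clamp it.
        return 0.0
    return float(np.sqrt(var_noise))
```

The subtraction in step c works because the observed points are the sum of two independent contributions (true hit position plus noise), so their variances add; with many points the estimate should converge to the true noise sigma.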
Is this correct, or is there a better way to do this?