I'm working with a system where calculations like square roots are expensive in time and introduce floating-point errors.
The idea is that I have a single reference 2-dimensional point and am fed a list of hundreds of points; I need to return the closest $n$ points, but the actual distances themselves are irrelevant.
The points are always in the third quadrant $(x<0, y<0)$.
Using the Pythagorean distance:
$dist(a,b)=\sqrt{(x_a-x_b)^2 + (y_a-y_b)^2}$
Doing away with the square root and comparing the squares of the distances was a no-brainer, but I was wondering whether there are any further simplifications I could make.
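For concreteness, here is a minimal sketch (in Python, with hypothetical names) of the squared-distance approach: since squaring is monotonic on nonnegative values, ordering by squared distance matches ordering by true Euclidean distance.

```python
import heapq

def n_closest(ref, points, n):
    """Return the n points nearest to ref, comparing squared distances.

    Squaring is monotonic for nonnegative values, so sorting by
    squared distance gives the same order as the true distance,
    with no square root needed.
    """
    rx, ry = ref
    return heapq.nsmallest(
        n, points,
        key=lambda p: (p[0] - rx) ** 2 + (p[1] - ry) ** 2,
    )
```

Using a heap-based selection like `heapq.nsmallest` also avoids fully sorting the hundreds of candidates when only $n$ of them are needed.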
EDIT: I'm aware this has algorithm-related answers that could help me, but I posted here for information on the purely geometrical and numerical analysis side of things; if it still seems out of place I'll delete it.
Using squared distances is a good idea: squaring preserves the ordering of nonnegative distances, so no information relevant to ranking is lost. If your coordinates are integers, you could precompute a table of squares and replace each multiplication with a lookup. And if the difference in one dimension is below some threshold, you may be able to ignore that dimension's contribution entirely.
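A sketch of the table-of-squares idea for integer coordinates (the bound on the coordinate differences is an assumption here):

```python
MAX_DELTA = 1024  # assumed bound on |coordinate difference|
SQUARES = [d * d for d in range(MAX_DELTA + 1)]

def sq_dist(a, b):
    """Squared distance via precomputed squares.

    Each multiplication is replaced by an indexed lookup, which can
    pay off when the same deltas recur across many comparisons.
    """
    return SQUARES[abs(a[0] - b[0])] + SQUARES[abs(a[1] - b[1])]
```

Whether the lookup actually beats a hardware multiply depends on the platform; on modern CPUs the table mainly helps when multiplication is genuinely slow or unavailable.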