This type of function comes up every now and then in my projects, often enough (and in performance-critical parts) that I'd like to learn more about it and see how others have implemented it.
Essentially, the purpose is to find out how linearly close two values are on a scale from 0 to 1 that wraps around.
For example:
0.5 and 0.5 = 1, maximum close
0.5 and 0.4 = 0.8, pretty close
0.5 and 0 = 0, least close
0.5 and 1 = 0, also least close
0.8 and 0.5 = 0.4
0.8 and 0.3 = 0
etc.
My implementation in js is as follows:
function find_closeness(pivot, value) {
  // Smallest of the direct gap and the two wrapped-around gaps,
  // scaled so that a gap of 0.5 (the farthest point on the circle) gives 0.
  return 1 - Math.min(
    Math.abs(value - pivot),
    Math.abs(value - pivot - 1),
    Math.abs(value - pivot + 1)
  ) * 2;
}
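For what it's worth, if both inputs are assumed to lie in [0, 1], the raw gap is at most 1, so the three-way min can be collapsed to a single comparison between the gap and its complement. A sketch of that equivalent form:

```javascript
// Sketch of an equivalent form, assuming pivot and value are both in [0, 1].
// The wrapped (circular) distance is the smaller of the raw gap and 1 minus it.
function find_closeness(pivot, value) {
  const d = Math.abs(value - pivot);  // raw gap, in [0, 1]
  return 1 - 2 * Math.min(d, 1 - d); // 1 when d = 0 or d = 1, 0 when d = 0.5
}
```

This reproduces the examples above, e.g. find_closeness(0.5, 0.4) is 0.8 and find_closeness(0.8, 0.5) is 0.4.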
But I'm sure there's a more efficient way to do it; math isn't my strong suit. I call the variables pivot and value because that's how I visualize it.
What is this operation called?
EDIT: the previously posted implementation had a bug; this older version still works, though.
For inputs in $[0,1]$, this seems to just be equal to $$1-2\times\text{Math.min}(d,\,1-d),\quad d=\text{Math.abs}(\text{pivot}-\text{value}),$$ i.e. one minus twice the circular (wrap-around) distance. Your example of $0.8,0.5$ checks out: $d=0.3$, so $1-2\times0.3=0.4$. Note that $\text{Math.min}(d,0.5)$ alone would not work, since it ignores the wrap-around for gaps larger than $0.5$.
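A quick numeric sketch (assuming both inputs lie in [0, 1]) checking that the three-way min from the question agrees with the two-term wrapped-distance form over a grid of inputs:

```javascript
// Original three-way min from the question.
function closenessOriginal(pivot, value) {
  return 1 - Math.min(
    Math.abs(value - pivot),
    Math.abs(value - pivot - 1),
    Math.abs(value - pivot + 1)
  ) * 2;
}

// Two-term wrapped-distance form.
function closenessWrapped(pivot, value) {
  const d = Math.abs(value - pivot);
  return 1 - 2 * Math.min(d, 1 - d);
}

// Compare the two on a 101 x 101 grid of [0, 1] x [0, 1] inputs.
let maxDiff = 0;
for (let i = 0; i <= 100; i++) {
  for (let j = 0; j <= 100; j++) {
    const p = i / 100, v = j / 100;
    const diff = Math.abs(closenessOriginal(p, v) - closenessWrapped(p, v));
    maxDiff = Math.max(maxDiff, diff);
  }
}
```

The maximum difference over the grid comes out to zero (up to floating-point noise), since for a gap $d_0 = \text{value}-\text{pivot} \in [-1,1]$ the three candidates reduce to $|d_0|$ and $1-|d_0|$.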