I have a 2D array (latitude, longitude) which I would like to convert/map to a 1D integer (i.e. for machine learning regression purposes).
I would then like to be able to convert back from the 1D integer to the latitude/longitude pair (i.e. at the ML testing phase). The conversion doesn't need to be exact; some approximation would be fine.
Since -90 < lat < 90 and -180 < long < 180, I had the idea of computing:
(lat + 90) * (long + 180)
in order to get some non-negative pseudo-unique integer, however, mapping back to the original (latitude, longitude) pair is problematic...
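For concreteness, here is the formula above in Python, along with two distinct coordinate pairs (chosen as an illustration) that produce the same integer, which is why the inverse mapping breaks down:

```python
def encode(lat, lon):
    # The proposed mapping: (lat + 90) * (lon + 180)
    return (lat + 90) * (lon + 180)

# Two different (lat, lon) pairs collide because multiplication
# destroys the information needed to separate the factors:
a = encode(-88, -177)  # (2) * (3) = 6
b = encode(-87, -178)  # (3) * (2) = 6
print(a, b)
```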
Is this possible, or is it mathematically infeasible?