I had previously asked this question on https://gis.stackexchange.com/, but it seems like they don't usually handle "pure math" questions, and it was poorly received, so I closed it.
PostGIS and SpatiaLite can compute the distance between a point $P$ and polygon $A$ using the ST_Distance() function, as described in https://gis.stackexchange.com/q/266555/57394 (among many other places).
My question is: how is this achieved, mathematically?
My general understanding of the algorithm is:
- Find the shortest line $L$ between $P$ and the boundary of $A$
- Compute the length of $L$
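In the planar case, at least, I can see how step 1 reduces to taking the minimum of point-to-segment distances over the polygon's edges. Here's a rough sketch of my understanding (all function names are mine, not SpatiaLite's):

```python
import math

def point_segment_distance(px, py, ax, ay, bx, by):
    """Planar distance from point P to segment AB via orthogonal projection."""
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    denom = abx * abx + aby * aby
    if denom == 0.0:                      # degenerate segment: A == B
        return math.hypot(apx, apy)
    # Clamp the projection parameter to [0, 1] so the closest
    # point stays on the segment rather than its supporting line.
    t = max(0.0, min(1.0, (apx * abx + apy * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby   # closest point on AB
    return math.hypot(px - cx, py - cy)

def point_ring_distance(p, ring):
    """Minimum distance from p to a closed ring given as [(x, y), ...]."""
    return min(
        point_segment_distance(p[0], p[1], *a, *b)
        for a, b in zip(ring, ring[1:] + ring[:1])
    )
```

But this brute-force scan over edges is presumably not what a spatial database does on an ellipsoid.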
In a geographic coordinate system like WGS84, $L$ is presumably an arc of a geodesic (a great circle, under a spherical approximation), so finding its length is relatively straightforward.
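For instance, under a spherical approximation the length of such an arc follows from the haversine formula (this sketch uses a mean Earth radius of 6371 km; the names are mine):

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle arc length via the haversine formula (spherical model)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))
```

(On the actual WGS84 ellipsoid one would use geodesic formulas instead, but the length computation is still the easy part.)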
What's not clear to me, however, is how to find $L$ in the first place. I would assume the natural approach is to express the length of $L$ as a parametric function of its endpoint on the boundary of $A$, and then minimize that length subject to the constraint that the endpoint lies on the boundary. But that would mean the length of $L$ is already known as a byproduct of finding $L$, so the technique used by SpatiaLite must be different.
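To make the parametric idea concrete, here is roughly what I had in mind, in the planar case for a single boundary edge: the distance from $P$ to a point moving along a segment is unimodal in the parameter $t$, so a ternary search over $t$ converges to the minimum. (This is entirely my own sketch of the idea, not anything from the SpatiaLite or GEOS code.)

```python
import math

def min_distance_along_segment(p, a, b, iters=100):
    """Minimize |P - (A + t(B - A))| over t in [0, 1] by ternary search."""
    def length(t):
        x = a[0] + t * (b[0] - a[0])
        y = a[1] + t * (b[1] - a[1])
        return math.hypot(p[0] - x, p[1] - y)
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        # Keep the third of [lo, hi] that cannot contain the minimum out.
        if length(m1) < length(m2):
            hi = m2
        else:
            lo = m1
    return length((lo + hi) / 2)
```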
I tried looking at the SpatiaLite source code to see if I could figure it out. I did find what seemed to be the implementation (specifically gaiaShortestLineCommon() in src/gaiageo/gg_relations_ext.c in libspatialite v5.1.0). But I had a hard time following the logic, which is fairly long and uses a large number of GEOS library routines.
It would be great to have an explanation of how this is accomplished in mathematical terms, without getting distracted by computational details, memory management, learning the GEOS library, etc.