I have $n$ points $x_1,\dots,x_n\in\Bbb R^d$. Given a new point $y$, how can I verify that the distance from $y$ to the convex hull of the $x_i$ is less than a given $\varepsilon$? The exact metric is not important to me; any $p$-norm would work. I am interested in efficient ways to compute this.
As requested, some details:
- I start building this cluster from a single point, and add points to it as soon as they lie within $\varepsilon$ distance from its convex hull. Otherwise, I start a new cluster and close the first one. Usually clusters are of reasonably small size (about $10^2$ or $10^3$ points), and the total number of unclustered points is $10^5$ or more.
- The points distribution is naturally clustered, so there are decent gaps between the data points.
- The new point may lie either inside or outside the cluster's convex hull.
- The distance evaluation does not really have to be precise; that is one of the reasons I mentioned that the choice of $p$-norm does not really matter to me.
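For concreteness, the membership test being asked about can be phrased as a small quadratic program: minimize $\|\sum_i \lambda_i x_i - y\|_2$ over $\lambda_i \ge 0$, $\sum_i \lambda_i = 1$, and compare the optimum against $\varepsilon$. A sketch (assuming NumPy/SciPy; the function name and the SLSQP solver choice are illustrative, not prescribed):

```python
import numpy as np
from scipy.optimize import minimize

def dist_to_hull(points, y):
    """Euclidean distance from y to conv(points), as a small QP
    over convex-combination weights lam (a sketch, not tuned)."""
    X = np.asarray(points, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(X)

    def objective(lam):
        diff = lam @ X - y          # lam @ X is a point of conv(points)
        return diff @ diff

    res = minimize(
        objective,
        np.full(n, 1.0 / n),        # start from the centroid weights
        method="SLSQP",
        bounds=[(0.0, 1.0)] * n,
        constraints=[{"type": "eq", "fun": lambda lam: lam.sum() - 1.0}],
    )
    return np.sqrt(max(res.fun, 0.0))
```

The $\varepsilon$-test is then `dist_to_hull(cluster_points, y) <= eps`. For clusters of $10^2$–$10^3$ points this QP is small, though a general-purpose solver will be slower than a dedicated projection routine.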
Without more indication of the constraints, a basic solution would be to precompute a decomposition of the convex hull into a tiling of simplices, then compute the distance to the convex hull as the minimum of the distances from $y$ to each simplex.
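A sketch of that approach, assuming SciPy: a Delaunay triangulation of the points is one valid tiling of the hull into simplices, and the distance to each simplex can be computed as a tiny QP over its barycentric coordinates (function names here are illustrative):

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.optimize import minimize

def dist_to_simplex(verts, y):
    # Project y onto the simplex whose vertices are the rows of
    # `verts`: a tiny QP over barycentric coordinates lam.
    k = len(verts)
    obj = lambda lam: np.sum((lam @ verts - y) ** 2)
    res = minimize(obj, np.full(k, 1.0 / k), method="SLSQP",
                   bounds=[(0.0, 1.0)] * k,
                   constraints=[{"type": "eq",
                                 "fun": lambda lam: lam.sum() - 1.0}])
    return np.sqrt(max(res.fun, 0.0))

def dist_via_tiling(points, y):
    points = np.asarray(points, dtype=float)
    y = np.asarray(y, dtype=float)
    tri = Delaunay(points)            # tiles conv(points) with simplices
    if tri.find_simplex(y) >= 0:      # y falls inside some simplex
        return 0.0
    return min(dist_to_simplex(points[s], y) for s in tri.simplices)
```

In practice one would precompute the triangulation once per cluster and prune simplices with a spatial index rather than scanning all of them.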
If $y$ cannot lie inside the convex hull, simplifications are possible, such as considering only the distance to the boundary facets of the convex hull.
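Since the question allows an imprecise distance, one cheap variant of this is to use the facet hyperplanes directly: the largest signed distance from $y$ to the facet planes is a lower bound on the true distance for exterior points (exact when $y$ projects onto a facet's interior). A sketch assuming SciPy's `ConvexHull` (the function name is illustrative):

```python
import numpy as np
from scipy.spatial import ConvexHull

def facet_signed_distance(points, y):
    """Largest signed distance from y to the hull's facet hyperplanes.
    Negative means y is inside the hull; for exterior y this is a
    lower bound on the true distance, which may suffice for an
    approximate epsilon test."""
    hull = ConvexHull(np.asarray(points, dtype=float))
    # Each row of hull.equations is (unit outward normal, offset),
    # with normal @ x + offset <= 0 for every x inside the hull.
    return float(np.max(hull.equations[:, :-1] @ np.asarray(y, dtype=float)
                        + hull.equations[:, -1]))
```

This needs only one pass over the facets per query, at the cost of underestimating the distance near edges and corners of the hull.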
Many algorithms exist to build such a tiling, with different efficiency depending on your case.
Many optimizations are possible when evaluating the distance, e.g., precomputing a binary space partitioning (BSP) of the simplices.