Note: I'm a programmer, not a mathematician - please be gentle. I'm not even really sure how to tag this question; feel free to re-tag as appropriate.
I'm using the Douglas-Peucker algorithm to reduce the number of points in polygons (in a mapping application). The algorithm takes a tolerance parameter that indicates how far I'm willing to deviate from the original polygon.
For practical reasons, I sometimes need to ensure that the reduced polygon doesn't exceed a predetermined number of points. Is there a way to predict in advance the tolerance value that will reduce a polygon with N points to one with N' points?
Here is a somewhat nontraditional variation of the Douglas-Peucker algorithm.
We will divide a given curve into pieces which are well approximated by line segments (within tolerance $\varepsilon$). Initially, there is only one piece, which is the entire curve.

1. Find the piece that is worst approximated by the line segment joining its endpoints, i.e. the piece containing the point of greatest deviation from that segment.
2. If this greatest deviation is within tolerance $\varepsilon$, stop. Otherwise, split the piece into two at the point of greatest deviation, and go back to step 1.

It should be easy to see how to modify step 2 so that the algorithm produces exactly $n-1$ pieces, i.e. $n$ points, for any given $n$.
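Here is a sketch of that modification in Python (my choice of language; the original question didn't specify one). Instead of stopping when the worst deviation drops below $\varepsilon$, it keeps the pieces in a max-heap keyed by deviation and stops splitting once exactly $n$ points have been kept. All function names are my own, not from any library:

```python
import heapq
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment from a to b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection parameter to stay on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def worst_point(points, i, j):
    """Index and deviation of the interior point farthest from segment (i, j)."""
    best_k, best_d = None, -1.0
    for k in range(i + 1, j):
        d = point_segment_distance(points[k], points[i], points[j])
        if d > best_d:
            best_k, best_d = k, d
    return best_k, best_d

def simplify_to_n(points, n):
    """Reduce a polyline to exactly n points by always splitting the worst piece."""
    if n >= len(points):
        return list(points)
    heap = []  # max-heap via negated deviation: (-d, i, j, split index k)
    def push(i, j):
        if j - i > 1:
            k, d = worst_point(points, i, j)
            heapq.heappush(heap, (-d, i, j, k))
    push(0, len(points) - 1)
    kept = {0, len(points) - 1}  # endpoints are always retained
    while len(kept) < n and heap:
        _, i, j, k = heapq.heappop(heap)
        kept.add(k)  # split the worst-approximated piece at its farthest point
        push(i, k)
        push(k, j)
    return [points[i] for i in sorted(kept)]
```

For example, `simplify_to_n([(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)], 3)` keeps both endpoints plus the single point of greatest deviation. The same loop recovers the ordinary tolerance-based behaviour if you instead pop until the top of the heap has deviation below $\varepsilon$, which is why this formulation answers the original question: the deviation at which the $n$-th point is added is exactly the tolerance that would have produced $n$ points.
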
Exercises (things I am too lazy to do myself):