Convergence of piecewise linear approximation of Bezier curve


I have a cubic Bezier curve, and I approximate it with the line segment joining its first and last control points. I also have a way of measuring how well this segment approximates the original curve; call it the error, deviation, or "flatness".

If I split the curve into two halves by De Casteljau's algorithm at $t = 0.5$, by how much does the error improve?

If I recursively split the curve into halves, at what asymptotic rate does the approximation improve?

BEST ANSWER

If $C(t)$ is the Bezier curve and $L(t)$ is the line, then an argument using a Taylor series with remainder shows that $$ \max\big\{\|C(t) - L(t)\|: a \le t \le b\big\} \le \frac18(b-a)^2M $$ where $$ M = \max\big\{\|C''(t)\|: a \le t \le b\big\} $$ Of course, $C''(t)$ is a Bezier curve of degree $1$, so its maximum norm is easy to estimate.
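To make the estimate concrete: for a cubic with control points $P_0,\dots,P_3$, the second derivative is the degree-$1$ Bezier curve with control points $6(P_2 - 2P_1 + P_0)$ and $6(P_3 - 2P_2 + P_1)$, and since the norm of a linear curve is a convex function of $t$, its maximum on $[a,b]$ is attained at an endpoint. A minimal sketch in Python with NumPy (the function name `flatness_bound` is mine, not standard):

```python
import numpy as np

def flatness_bound(p0, p1, p2, p3, a=0.0, b=1.0):
    """Upper bound on max ||C(t) - L(t)|| over [a, b] for a cubic Bezier,
    where L is the linear interpolant of C at t = a and t = b."""
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    # Control points of C''(t), a degree-1 Bezier curve:
    d0 = 6.0 * (p2 - 2.0 * p1 + p0)
    d1 = 6.0 * (p3 - 2.0 * p2 + p1)
    # ||C''(t)|| is convex in t, so its maximum lies at an endpoint of [a, b].
    c2 = lambda t: (1.0 - t) * d0 + t * d1
    M = max(np.linalg.norm(c2(a)), np.linalg.norm(c2(b)))
    return (b - a) ** 2 * M / 8.0
```

For the symmetric cubic with control points $(0,0), (1,1), (2,1), (3,0)$ this returns $0.75$ on $[0,1]$, which happens to equal the true maximum deviation for that curve.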

The first relation above says that the maximum error is reduced by a factor of $4$ when you halve the interval $[a,b]$; after $k$ recursive splits it is $O(4^{-k})$. However, note that this maximum error might not be exactly the one you want: it measures the parametric deviation $\|C(t) - L(t)\|$, not the geometric distance between the curve and the segment. Since the distance from $C(t)$ to the segment is at most $\|C(t) - L(t)\|$, the bound still applies, but it can overestimate the true geometric deviation.
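The factor-of-$4$ behavior can be checked numerically: split $[0,1]$ into $2^k$ pieces and measure the largest parametric deviation on each piece by sampling. A small experiment, assuming the same example curve as above (the helper names `bezier` and `max_error` are illustrative, not standard):

```python
import numpy as np

def bezier(p, t):
    """Evaluate a cubic Bezier with control points p (shape 4 x d) at parameters t."""
    t = np.asarray(t, float)[:, None]
    return ((1 - t)**3 * p[0] + 3 * (1 - t)**2 * t * p[1]
            + 3 * (1 - t) * t**2 * p[2] + t**3 * p[3])

def max_error(p, levels, samples=65):
    """Max sampled ||C(t) - L(t)|| after splitting [0, 1] into 2**levels pieces,
    where L linearly interpolates C at each subinterval's endpoints."""
    n = 2 ** levels
    err = 0.0
    for i in range(n):
        a, b = i / n, (i + 1) / n
        t = np.linspace(a, b, samples)
        C = bezier(p, t)
        Ca, Cb = bezier(p, [a])[0], bezier(p, [b])[0]
        u = ((t - a) / (b - a))[:, None]        # normalized parameter on this piece
        L = (1 - u) * Ca + u * Cb               # chord, parametrized to match C
        err = max(err, np.linalg.norm(C - L, axis=1).max())
    return err

p = np.array([[0, 0], [1, 1], [2, 1], [3, 0]], float)
errs = [max_error(p, k) for k in range(5)]
# Each successive error is roughly a quarter of the previous one.
```

For this particular curve the deviation is quartered exactly at every level (the $y$-component is quadratic in $t$, so the interpolation-error bound is tight); for a general cubic the ratio approaches $4$ asymptotically.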