Understanding error measure for 2D curve offsetting


I'm reading a paper on 2D curve offsetting and can't wrap my head around a measure used to describe an error between $C_r(t)$, the offset curve of $C(t)$, and $C_a^r(t)$, the curve that approximates it.

It states that, given $C_a^r(t)$, the approximation error can be computed as follows:

$$e(t) = (C_a^r(t)\, - \,C_r(t)) \,\cdot \, N(t)$$

where $N(t)$ is the normal vector of $C(t)$ at parameter $t$ and $\cdot$ denotes the dot product.
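To make the definition concrete, here is a minimal numerical sketch (mine, not from the paper): it assumes the base curve is the unit circle, and picks a hypothetical offset distance $r$ and a hypothetical approximation with a small radial perturbation, then evaluates $e(t)$ at a few parameter values.

```python
import numpy as np

# Base curve: unit circle C(t) = (cos t, sin t); its unit normal is N(t) = (cos t, sin t).
def C(t):
    return np.array([np.cos(t), np.sin(t)])

def N(t):
    return np.array([np.cos(t), np.sin(t)])

r = 0.5  # hypothetical offset distance

# Exact offset curve C_r(t) = C(t) + r * N(t)
def C_r(t):
    return C(t) + r * N(t)

# Hypothetical approximation: the offset with a small radial perturbation
def C_a_r(t):
    return C(t) + (r + 0.01 * np.sin(3 * t)) * N(t)

# Error measure e(t) = (C_a^r(t) - C_r(t)) . N(t)
def e(t):
    return np.dot(C_a_r(t) - C_r(t), N(t))

for t in np.linspace(0.0, 2.0 * np.pi, 5):
    print(f"t = {t:.2f}, e(t) = {e(t):+.4f}")
```

In this contrived case the difference vector is purely radial, so $e(t)$ recovers exactly the signed radial perturbation.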

Now, if we want $C_a^r(t)$ to be as close as possible to $C_r(t)$, and to have a normal vector $N_a^r(t)$ as close as possible to $N(t)$, then I would expect the error measure to shrink when either the magnitude of $C_a^r(t) - C_r(t)$ shrinks and/or its angle with $N(t)$ gets smaller. However, what I get from $e(t)$ is that it equals the (signed) length of the projection of $C_a^r(t) - C_r(t)$ onto $N(t)$, which suggests that, for example, if the angle grows while the magnitude stays the same, then $e(t)$ decreases.
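Here is a tiny sketch of that observation (my own, not from the paper): keeping the magnitude of the difference vector fixed and rotating it away from $N(t)$ makes the dot product, and hence $e(t)$, shrink.

```python
import numpy as np

N = np.array([0.0, 1.0])   # normal direction at some fixed parameter t (assumed)
magnitude = 0.1            # fixed |C_a^r(t) - C_r(t)|

for angle_deg in (0, 30, 60, 85):
    angle = np.radians(angle_deg)
    # difference vector of constant length, rotated away from N by `angle`
    d = magnitude * np.array([np.sin(angle), np.cos(angle)])
    e = np.dot(d, N)       # e(t) = (C_a^r(t) - C_r(t)) . N(t)
    print(f"angle = {angle_deg:2d} deg, |d| = {np.linalg.norm(d):.3f}, e = {e:.4f}")
```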

I'm sure I'm missing something obvious here, so any help is appreciated. Thank you!