Integrate angle subtended by a cubic spline


Given a parametric curve defined by:

$$ \mathbf{r}(t)= \begin{bmatrix} x(t)\\y(t)\\z(t) \end{bmatrix} =\begin{bmatrix}a_0+a_1t+a_2t^2+a_3t^3 \\ b_0+b_1t+b_2t^2+b_3t^3 \\ 1 \end{bmatrix} $$

Let $d\theta$ be the infinitesimal angle at the origin subtended by $d\mathbf{r}$.

(i.e. angle between $\mathbf{r}$ and $\mathbf{r}+d\mathbf{r}$.)

I would like to evaluate $\int d\theta$ as an integral over $t$.

It seems to me that, since to first order $d\theta = \dfrac{|\mathbf{r}\times d\mathbf{r}|}{|\mathbf{r}|^2}$ and here $\mathbf{r}=(x,y,1)$ with $dz=0$,

$$ d\theta = \frac{\sqrt{(dx)^2+(dy)^2+(x\,dy-y\,dx)^2}}{1+x^2+y^2} $$

so the integral I seek would be

$$ \int \frac{\sqrt{x'(t)^2+y'(t)^2+\bigl(x(t)\,y'(t)-y(t)\,x'(t)\bigr)^2}}{1+x(t)^2+y(t)^2} \, dt $$

where $x$, $y$, $x'$, $y'$ are polynomials in $t$ whose coefficients are expressed in terms of $a_0$, $a_1$, $a_2$, $a_3$, $b_0$, $b_1$, $b_2$, $b_3$. I am hoping for a trick or technique that makes this integral tractable.
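In the meantime, the swept angle can always be evaluated numerically from the first-order relation $d\theta = |\mathbf{r}\times d\mathbf{r}|/|\mathbf{r}|^2$. Here is a minimal sketch using composite Simpson's rule; the function name `swept_angle` and the sample coefficients are my own, not part of the question.

```python
import math

def swept_angle(a, b, t0=0.0, t1=1.0, n=2000):
    """Numerically integrate dtheta = |r x dr| / |r|^2 along
    r(t) = (x(t), y(t), 1), where x has cubic coefficients a = (a0..a3)
    and y has b = (b0..b3), via composite Simpson's rule (n even)."""
    def poly(c, t):   # c0 + c1 t + c2 t^2 + c3 t^3, Horner form
        return c[0] + t * (c[1] + t * (c[2] + t * c[3]))
    def dpoly(c, t):  # derivative of the cubic
        return c[1] + t * (2 * c[2] + t * 3 * c[3])
    def f(t):
        x, y = poly(a, t), poly(b, t)
        dx, dy = dpoly(a, t), dpoly(b, t)
        # |r x dr|^2 = dx^2 + dy^2 + (x dy - y dx)^2 when r = (x, y, 1)
        return math.sqrt(dx * dx + dy * dy + (x * dy - y * dx) ** 2) \
            / (1.0 + x * x + y * y)
    h = (t1 - t0) / n
    s = f(t0) + f(t1)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(t0 + i * h)
    return s * h / 3.0
```

As a sanity check, the degenerate "cubic" $x(t)=t$, $y(t)=1$ sweeps in a plane through the origin, so the result should equal the exact angle between $\mathbf{r}(0)=(0,1,1)$ and $\mathbf{r}(1)=(1,1,1)$, namely $\arctan(1/\sqrt{2}) \approx 0.6155$.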