Find the length of a curve specified by a series of polar co-ordinates.


I have a curve defined by a series of polar co-ordinates, $P_a(r_a,\theta_a)$ through $P_b(r_b,\theta_b)$. I would like to determine the length of this curve.

Because the points come from semi-random, real-world measurements, I have the complication that $r$ is not an obvious function of $\theta$. Also, the measurements are simply numbered $a = 1 \ldots b = n$ and are not known to be evenly spaced, so neither $r$ nor $\theta$ is a known function of the numbering parameter $t$.

I thought of treating the curve as a vector-valued function $\mathbf{r}(t) = \langle f(t), g(t) \rangle$, but there is no known relationship between the equivalent Cartesian coordinates and the numbering parameter $t$.
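A minimal fallback sketch (my own assumption, not an established method for this data): if the samples are at least ordered along the curve, the measurement index can serve as the parameter and the polygonal (chord-by-chord) length gives a lower bound on the curve length. Each chord length follows directly from the law of cosines in polar form, $d_k = \sqrt{r_k^2 + r_{k+1}^2 - 2 r_k r_{k+1}\cos(\theta_{k+1}-\theta_k)}$:

```python
import math

def polyline_length_polar(points):
    """Sum the straight-line chord lengths between consecutive
    (r, theta) samples, using the law of cosines for each chord.
    Assumes the samples are ordered along the curve."""
    total = 0.0
    for (r1, t1), (r2, t2) in zip(points, points[1:]):
        total += math.sqrt(r1 * r1 + r2 * r2
                           - 2.0 * r1 * r2 * math.cos(t2 - t1))
    return total
```

This underestimates the true length (chords are shorter than arcs), but it needs no functional relationship between $r$ and $\theta$ at all.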

Best answer:

I suppose you could convert your $(r_k,\theta_k)$ measurements into Cartesian values $(x_k, y_k) = (r_k\cos\theta_k, r_k\sin\theta_k)$ and then interpolate an appropriate polynomial, or some other function that makes sense for the specific problem you're trying to solve. Once you have the interpolating function $y = f(x)$, you can work with that instead to obtain an approximation to the length of the original curve.
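One way to sketch that suggestion (a hypothetical implementation, not the answerer's code): convert to Cartesian, then interpolate. Here I swap in a Catmull-Rom spline parameterized by the measurement index instead of fitting $y = f(x)$, since the measured curve may loop back and fail the vertical-line test. The arc length is then approximated by finely resampling each spline segment and summing chords:

```python
import math

def polar_to_cartesian(points):
    """Convert (r, theta) samples to (x, y) pairs."""
    return [(r * math.cos(t), r * math.sin(t)) for r, t in points]

def catmull_rom_length(pts, samples_per_segment=50):
    """Approximate the arc length of a uniform Catmull-Rom spline
    through pts (which the spline interpolates exactly) by summing
    chords of a fine resampling of each segment."""
    padded = [pts[0]] + list(pts) + [pts[-1]]  # clamp the endpoints
    total = 0.0
    for i in range(len(pts) - 1):
        p0, p1, p2, p3 = padded[i], padded[i + 1], padded[i + 2], padded[i + 3]
        prev = p1
        for k in range(1, samples_per_segment + 1):
            s = k / samples_per_segment
            # Standard uniform Catmull-Rom basis, applied per coordinate.
            cur = tuple(
                0.5 * (2 * b + (-a + c) * s
                       + (2 * a - 5 * b + 4 * c - d) * s * s
                       + (-a + 3 * b - 3 * c + d) * s ** 3)
                for a, b, c, d in zip(p0, p1, p2, p3)
            )
            total += math.dist(prev, cur)
            prev = cur
    return total
```

For example, `catmull_rom_length(polar_to_cartesian(samples))` for samples taken along a quarter of the unit circle comes out close to $\pi/2$, noticeably better than the raw chord sum.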