I am trying to implement a function that fits an elliptic Fourier curve to the border points of a detected object. I am using cv2.findContours to extract the border points from a binary image. Next, I would like to calculate the elliptic Fourier coefficients via the following equations (for the sake of simplicity I will only address the x axis):
$$ a_n = \frac{1}{n^2 \pi} \sum_{p = 1}^q \frac{\Delta x_p}{\Delta t_p} \left[ \cos(n t_p) - \cos(n t_{p-1}) \right] $$ and $$ b_n = \frac{1}{n^2 \pi} \sum_{p = 1}^q \frac{\Delta x_p}{\Delta t_p} \left[ \sin(n t_p) - \sin(n t_{p-1}) \right] $$
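So far I have sketched the summation directly in NumPy. This is a minimal sketch under two assumptions: the contour is closed (the first point is appended again at the end, so `x[0] == x[-1]`), and `t` already holds whatever parameter values I end up choosing on [0, 2π] — which is exactly the part I am unsure about. The function name `x_coefficients` is just my placeholder:

```python
import numpy as np

def x_coefficients(x, t, n):
    """a_n and b_n for the x coordinate, per the equations above.

    x : x coordinates of the contour, closed so that x[0] == x[-1]
        (e.g. contour[:, 0, 0] from cv2.findContours with the first
        point appended again at the end)
    t : parameter values increasing from 0 to 2*pi, same length as x
    n : harmonic number (n >= 1)
    """
    dx = np.diff(x)  # Δx_p = x_p - x_{p-1}
    dt = np.diff(t)  # Δt_p = t_p - t_{p-1}
    c = 1.0 / (n ** 2 * np.pi)
    # t[1:] is t_p, t[:-1] is t_{p-1}; each np.sum runs over p = 1..q
    a_n = c * np.sum(dx / dt * (np.cos(n * t[1:]) - np.cos(n * t[:-1])))
    b_n = c * np.sum(dx / dt * (np.sin(n * t[1:]) - np.sin(n * t[:-1])))
    return a_n, b_n
```

As a sanity check, sampling x(t) = cos(t) on a uniform grid should give a_1 ≈ 1 and b_1 ≈ 0, which it does for me.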
And here comes my question: the idea is to parametrise the contour over $t \in [0, 2\pi]$. Should $\Delta t_p$ be constant, or should it depend on $\Delta x_p$ (the bigger the change in the x coordinate, the bigger $\Delta t_p$)?