Hello! I'm in the process of making a very basic racetrack simulator. As part of it, I currently fit a piecewise polynomial to a set of points representing the "raceline" of the track; this lets me calculate the minimum radius of curvature of the track, and thus the minimum speed at which I can take a corner. It has been working fine so far, but I've hit a snag: my radius of curvature varies significantly depending on how finely I interpolate my track. Ideally, I want something as true to real life (and with continuous curvature) as possible, so I was hoping that if I could fit a degree-$n$ bounded (circular) polynomial to my data, I'd get a more accurate estimate of the minimum radius of curvature. It would also simplify my calculations significantly, since such a curve is easy to differentiate. How would I go about fitting this type of polynomial to a set of $(x, y)$ coordinate data? I feel out of my depth.
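For concreteness, here's a minimal sketch of the kind of curvature calculation I'm describing, done on an interpolated track with SciPy's parametric spline routines (`splprep`/`splev`). The toy circle data and the smoothing setting `s=0` are just assumptions for illustration, not my real track data:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Toy "raceline": points on a circle of radius 50, so the true radius
# of curvature is 50 everywhere (assumed test data, not real track data).
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
x = 50 * np.cos(t)
y = 50 * np.sin(t)

# Fit a periodic (closed) parametric cubic spline through the points.
tck, u = splprep([x, y], s=0, per=True)

# First and second derivatives with respect to the spline parameter.
uu = np.linspace(0, 1, 1000)
dx, dy = splev(uu, tck, der=1)
ddx, ddy = splev(uu, tck, der=2)

# Curvature of a parametric curve:
#   kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)
kappa = np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Minimum radius of curvature = 1 / max curvature.
min_radius = 1.0 / kappa.max()
print(min_radius)  # close to 50 for this circle
```

The snag I mention is that when I refine or coarsen the sampling of the real track, `min_radius` changes noticeably, which is what makes me want a single smooth fit instead.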
$$ f(x, y) = \sum_{k=0}^{n} \left( a_{k1}\,x^{k} + a_{k2}\,y^{k} \right) $$
Am I needlessly over-complicating things? I feel like this is something I would only calculate once per track, and it would also smooth out any errors in my track data.
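To clarify what I mean by "fitting" $f(x, y)$: since the expression is linear in the unknown coefficients, I imagine an ordinary linear least-squares solve over the monomial terms. Here's a sketch of that idea; normalizing the level set to $f(x, y) = 1$ and fixing the degree at 2 are assumptions I'm unsure about:

```python
import numpy as np

# Toy data: points on a circle of radius 50, so x^2 + y^2 = 2500,
# i.e. (1/2500) x^2 + (1/2500) y^2 = 1.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
x = 50 * np.cos(t)
y = 50 * np.sin(t)

n = 2  # polynomial degree (assumed)
# Design matrix with columns x^k and y^k for k = 1..n; the constant
# term is moved to the right-hand side by asking f(x, y) = 1 on the data.
A = np.column_stack([x**k for k in range(1, n + 1)] +
                    [y**k for k in range(1, n + 1)])
coeffs, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
print(coeffs)  # expect roughly [0, 1/2500, 0, 1/2500] for this circle
```

My worry is whether minimizing the algebraic residual $f(x_i, y_i) - 1$ like this actually gives a curvature-faithful fit, or whether the curve should really be parametric rather than implicit.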
Track processing image:
