For a robotics project, we have sensors that give us an accurate real-world XYZ position. I've recorded this data to measure the sag of our telescoping boom over its 50 feet of travel. The raw data is plotted below.
The data is non-uniform, with a varying number of points depending on how fast the robot was moved. Data points may also repeat if motion stopped while recording. While it generally shows a very nice path, we'd like to smooth out some of the bumps.
We currently overlay a Bézier curve by hand to fit this data for use as sag compensation, but we'd like to automate the process in our application. I'm not looking for finished code, just the process behind getting the data.
The end goal is to break it down into 100 equally spaced points along the curve that we will linearly interpolate between. I need the distance between points along the curve to be equal, not the distance along the horizontal axis.
What would be a good way of going about this?
I'm thinking I could loop through the data, accumulating the distance between consecutive points until I find one that's close to my desired spacing, then take an average of the surrounding samples. I can see a lot of problems with that approach, though, and figure there's a better mathematical solution.
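To make the goal concrete, the usual mathematical route for equal spacing *along the curve* is to build the cumulative chord length of the samples and then interpolate both coordinates against it. A rough NumPy sketch of that idea, using a small made-up stand-in for the recorded data (the real arrays would come from the sensor log):

```python
import numpy as np

# Toy stand-in for the recorded (X, Y) samples.
X = np.array([0.0, -1.0, -2.0, -3.0, -4.0, -5.0])
Y = np.array([0.0, 0.0, -0.1, -0.3, -0.6, -1.0])

# Cumulative chord length along the sampled path.
ds = np.hypot(np.diff(X), np.diff(Y))
s = np.concatenate(([0.0], np.cumsum(ds)))

# Drop repeated points (zero-length segments, e.g. where the robot
# paused while recording) so s is strictly increasing.
keep = np.concatenate(([True], ds > 0))
s, X, Y = s[keep], X[keep], Y[keep]

# Resample at 100 stations equally spaced in arc length.
s_new = np.linspace(0.0, s[-1], 100)
x_new = np.interp(s_new, s, X)
y_new = np.interp(s_new, s, Y)
```

The spacing is equal along the chord-length approximation of the curve, which for densely sampled, nearly straight data like this is effectively equal arc length.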
Data:
X,0,0,0,0,-0.22,-0.76,-1.74,-3.25,-5.39,-8.17,-12.13,-16.05,-20.73,-25.53,-30.25,-34.87,-40.12,-45.16,-50.19,-55.23,-60.27,-65.53,-70.31,-75.35,-80.37,-85.42,-90.46,-95.47,-100.73,-105.51,-110.53,-115.3,-120.59,-125.61,-130.86,-135.87,-140.88,-145.65,-150.93,-155.72,-160.72,-165.75,-170.74,-176.02,-181.02,-186.06,-191.33,-196.37,-200.88,-206.16,-211.41,-215.92,-221.19,-226.22,-231.23,-235.99,-241.25,-246.26,-251.02,-256.28,-261.32,-266.33,-271.57,-276.32,-281.35,-286.62,-291.63,-296.66,-301.43,-306.7,-311.45,-316.73,-321.73,-326.75,-331.76,-336.53,-342.05,-346.57,-351.55,-356.56,-361.82,-366.83,-372.1,-376.87,-381.88,-386.89,-392.13,-396.91,-401.91,-406.94,-411.96,-417.22,-421.96,-426.98,-432.24,-437.02,-442.03,-447.29,-452.28,-457.27,-462.29,-467.3,-472.32,-477.31,-482.07,-487.34,-492.09,-497.1,-502.33,-507.11,-512.37,-517.39,-522.4,-527.41,-532.15,-537.4,-542.16,-547.43,-552.24,-556.51,-561.39,-565.58,-569.61,-573.36,-577.59,-581.44,-584.25,-587.1,-590.1,-592.49,-592.41,-592.4,-592.36,-592.36,-592.35,-592.35,-592.34,-592.34,-592.33,-592.32,-592.3,-592.3,-592.29,-592.29,-592.29,-592.27,-592.26,-592.27,-592.26,-592.25,-592.25,-592.25,-592.24,-592.24,-592.24,-592.24
Y,0,0,0,0,0,0,0,0,0.01,0.02,0.03,0.04,0.05,0.06,0.07,0.08,0.09,0.1,0.11,0.12,0.14,0.14,0.14,0.15,0.15,0.16,0.16,0.16,0.16,0.16,0.16,0.16,0.14,0.13,0.13,0.1,0.01,0,-0.01,-0.02,-0.04,-0.05,-0.06,-0.07,-0.1,-0.11,-0.13,-0.16,-0.18,-0.2,-0.23,-0.25,-0.28,-0.31,-0.34,-0.38,-0.42,-0.46,-0.5,-0.55,-0.61,-0.66,-0.72,-0.79,-0.87,-0.95,-1.03,-1.11,-1.19,-1.29,-1.37,-1.47,-1.56,-1.65,-1.75,-1.85,-1.96,-2.05,-2.16,-2.28,-2.41,-2.53,-2.68,-2.81,-2.96,-3.11,-3.27,-3.41,-3.55,-3.7,-3.85,-4.02,-4.16,-4.32,-4.5,-4.67,-4.86,-5.07,-5.29,-5.49,-5.69,-5.92,-6.12,-6.34,-6.57,-6.79,-7,-7.24,-7.48,-7.7,-7.99,-8.24,-8.49,-8.76,-9.01,-9.29,-9.58,-9.88,-10.17,-10.46,-10.74,-11.02,-11.31,-11.57,-11.85,-12.15,-12.39,-12.57,-12.78,-12.96,-13.01,-12.97,-12.98,-13.03,-13.01,-12.99,-13,-13.02,-13,-12.99,-13.02,-13,-12.99,-13.02,-13.01,-13,-13.01,-13.01,-13,-13.01,-13.01,-13.01,-13.01,-13.01,-13.01,-13

Your data is too erratic at the very beginning and end of the interval to permit a meaningful interpolation. Therefore, of the original 156 points, I used only points 5:130, for a total of 126. Then I applied a simple linear interpolation over a new, evenly spaced range of $x$-values to calculate the new $y$-values, conceptually $[X,Y]\mapsto[x,y]$. The figure below shows the original data $[X,Y]$ in red dots and the mapped data $[x,y+1]$ in blue dots (the $+1$ offset being for clarity).
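The trim-and-interpolate step above can be sketched in NumPy as follows; the arrays here are a short made-up excerpt standing in for the trimmed data set (real code would slice the logged arrays to points 5:130 instead):

```python
import numpy as np

# Toy stand-in for the trimmed (X, Y) data.
X = np.array([-0.22, -5.39, -12.13, -20.73, -30.25])
Y = np.array([0.0, 0.01, 0.03, 0.05, 0.07])

# np.interp expects increasing x, and X runs from 0 toward negative
# values, so reverse both arrays before interpolating.
x = np.linspace(X[-1], X[0], 100)      # evenly spaced x-values
y = np.interp(x, X[::-1], Y[::-1])     # linearly interpolated y-values
```

Note this spaces the points evenly in $x$; to get spacing that is equal along the curve itself, the same `np.interp` call can be driven by cumulative arc length instead of $x$.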