I'm trying to find out more about how to calculate the slope of a curve at a point given a table of points (e.g. temperature vs time or whatever it may be).
Mr Google is unfortunately defeating me. I have found a vague one-liner answer on Quora (https://www.quora.com/How-can-the-slope-of-a-curve-at-a-given-point-be-calculated) that simply says:
If you have something like a table of points, then form the interpolation version and differentiate that.
Which I'm sure is detailed enough if you've got a mathematics degree, but what I really need is more of a layman's answer (preferably something I can do using R or another free tool... although if it's really easier in Mathematica or MATLAB then I guess I'll have to consider my options).
My ultimate intended application is comparing slopes against one another (e.g. differences in slopes of temperature vs time in one year/location/whatever vs another). I'm sure this bit is easy once the initial stumbling block above is conquered!
It depends on how noisy your data is and whether you want to interpolate or, in some sense, average.
If your data is not noisy, the easiest way is to use the central-difference approximation $f'(x) = \dfrac{f(x+h)-f(x-h)}{2h}$, which has an error of $O(h^2)$.
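As a concrete sketch of the central-difference idea (in Python; R's `diff()` or `numDeriv` package can do the same), here is a hypothetical temperature-vs-time table where `numpy.gradient` applies exactly that formula at the interior points and one-sided differences at the two endpoints:

```python
import numpy as np

# Hypothetical table of points: time (hours) vs temperature (degrees C).
# Here temp happens to follow t**2 + t + 10, so the true slope is 2*t + 1.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
temp = np.array([10.0, 12.0, 16.0, 22.0, 30.0])

# np.gradient uses the central difference (f(x+h) - f(x-h)) / (2h) at
# interior points and one-sided differences at the endpoints.
slope = np.gradient(temp, t)
print(slope)  # interior values 3, 5, 7 match the true slope 2*t + 1 exactly
```

Central differences are exact for quadratics, which is why the interior estimates here hit the true slope; for general smooth data you get the $O(h^2)$ error mentioned above.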
If your data is noisy, you should probably use least squares of some form. Many years ago (over forty), I wrote a package for doing adaptive least squares cubic spline fitting, and it proved quite useful. I no longer have the source.
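For the noisy case, one simple least-squares variant (not the adaptive spline package mentioned above, which I don't have) is to fit a low-degree polynomial and differentiate the fit analytically. A minimal sketch, with made-up noisy data; in R the analogue would be `lm()` on polynomial terms, or `smooth.spline()`:

```python
import numpy as np

# Hypothetical noisy measurements of a quadratic trend t**2 + t + 10,
# whose true slope is 2*t + 1.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 41)
temp = t**2 + t + 10 + rng.normal(0.0, 0.5, t.size)

# Least-squares fit of a degree-2 polynomial, then differentiate the
# fitted coefficients (exact differentiation of the fitted model).
coeffs = np.polyfit(t, temp, deg=2)
dcoeffs = np.polyder(coeffs)
slope_at_2 = np.polyval(dcoeffs, 2.0)  # true slope at t=2 is 5
print(slope_at_2)
```

The polynomial degree is the smoothing knob here: too low underfits the trend, too high chases the noise. Spline-based least squares (e.g. `scipy.interpolate.UnivariateSpline` with a smoothing parameter, which also has a `derivative()` method) handles data that a single polynomial can't.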