I've been asked to calculate a linear fit for data that is approximately linear, but with the database I'm using the closest thing I can compute is the average derivative over time. For example, I have 10K points that, when plotted, look approximately linear. I'm looking for a way to explain in plain English the difference between these two calculations.
My understanding is that if the data is perfectly linear the two are the same, but once the data starts "curving" the average derivative becomes more sensitive and is therefore a worse fit.
Sample data (see picture):

| x | y | Derivative |
|---|---|------------|
| 1 | 1 | N/A        |
| 2 | 2 | 1          |
| 3 | 3 | 1          |
| 4 | 5 | 2          |
| 5 | 8 | 3          |

Average derivative: 1.75. Slope of linear fit (Excel): 1.7.
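For concreteness, here is a quick sketch that reproduces both numbers in plain Python (the least-squares formula is the standard one that Excel's trendline/`SLOPE()` uses; the variable names are my own):

```python
# Sample data from the question.
x = [1, 2, 3, 4, 5]
y = [1, 2, 3, 5, 8]
n = len(x)

# Average derivative: the mean of the slopes between consecutive points.
diffs = [(y[i + 1] - y[i]) / (x[i + 1] - x[i]) for i in range(n - 1)]
avg_derivative = sum(diffs) / len(diffs)

# Least-squares slope:
#   slope = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)^2)
x_mean = sum(x) / n
y_mean = sum(y) / n
slope = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y)) \
        / sum((xi - x_mean) ** 2 for xi in x)

print(avg_derivative)      # → 1.75
print(round(slope, 3))     # → 1.7
```

One useful observation for the plain-English explanation: with equally spaced x values the consecutive differences telescope, so the average derivative equals (y_last − y_first)/(x_last − x_first) = (8 − 1)/(5 − 1) = 1.75. It depends only on the two endpoints, whereas the least-squares slope weighs every point, which is why the average derivative is more sensitive to curvature (and to noise at the ends).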
