Good day mathematicians,
I'm a computer engineer working on an automation project. Right now I am measuring a surface area in cm², which I plot on the y-axis. On the x-axis I have time, which starts at 0 and increments by 1 second.
With my data, I want to determine the speed (area/s) over each x-interval and plot that on a new graph.
What I have tried: keep adding every y-value so far and divide the sum by the current x-value. So with:
x: 1, y: 5
x: 2, y: 8
x: 3, y: 10
my speed at x=3 would be (5 + 8 + 10)/3, but that does not seem quite right. The graph I am plotting slowly approaches 1 (I forgot the name of this phenomenon, but I know it has a name), even though the y-values are few and far between. What is going on, and how can I solve this? A colleague mentioned using differentials.
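To see concretely why this summing approach behaves oddly, here is a minimal sketch (variable names are mine, and the data is just the three sample points from the question). Note that it mixes a running total of areas with a division by elapsed time, which is neither an average speed nor an instantaneous one:

```python
xs = [1, 2, 3]
ys = [5, 8, 10]  # measured area in cm^2 at each time step

# The approach described in the question: sum of all y-values so far,
# divided by the current x-value.
cumulative = [sum(ys[:i + 1]) / xs[i] for i in range(len(xs))]
print(cumulative)  # [5.0, 6.5, 7.666...]
```

Each term adds the *total* areas measured so far, not the *changes* in area, so the result at x=3 is (5 + 8 + 10)/3 ≈ 7.67 rather than anything resembling a growth speed.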
I'm not sure I totally understood your problem, but let me give it a try.
From what I understand, you have a growing area that you keep measuring at regular intervals, and from this data you want to plot the speed¹ at which it grows.
Speed, in this context, can mean three things.
Please note that this is the *average* speed between t=2 and t=3, so you have to decide whether to place this data point at x=2 or x=3 in your new graph.
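The average speed over each interval is just the difference in area divided by the difference in time. A minimal sketch, using the sample data from the question (names are illustrative):

```python
xs = [1, 2, 3]
ys = [5, 8, 10]  # area in cm^2

# Average speed on each interval: change in area over change in time.
# Between t=2 and t=3 this gives (10 - 8) / (3 - 2) = 2 cm^2/s.
speeds = [(ys[i] - ys[i - 1]) / (xs[i] - xs[i - 1]) for i in range(1, len(xs))]
print(speeds)  # [3.0, 2.0]
```

With n measurements you get n-1 speed values, one per interval, which is why each value has to be assigned to one end (or the midpoint) of its interval when plotting.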
¹ Disclaimer: I'll avoid using "speed" for anything other than distance/time, but "growth rate" sounds economics-connoted; any suggestion is welcome.