Alright, I am super rusty on my calculus, but I am trying to help a friend with the following question. How do we get started?
You measure the velocity of an object by observing that it takes 1 second to travel 1.2 meters. The measurement error is 0.001 meters in distance and 0.01 seconds in time.
What is the absolute value of the error in the linear approximation for the velocity?
Velocity is $v = d/t$. What is the maximum possible distance? The minimum? The maximum possible time? The minimum? How do you combine these to get the maximum velocity? The minimum? Once you've tried that, compare with the linear-approximation setup sketched below.
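Since the question asks specifically for the linear approximation, here is one way to set it up (a sketch using the standard differential for error propagation; treating the given errors as worst-case bounds $|\Delta d| = 0.001$ m and $|\Delta t| = 0.01$ s is my reading of the problem):

$$v = \frac{d}{t}, \qquad \Delta v \approx \frac{\partial v}{\partial d}\,\Delta d + \frac{\partial v}{\partial t}\,\Delta t = \frac{\Delta d}{t} - \frac{d}{t^{2}}\,\Delta t.$$

In the worst case the two terms add, so

$$|\Delta v| \le \frac{|\Delta d|}{t} + \frac{d}{t^{2}}\,|\Delta t| = \frac{0.001}{1} + \frac{1.2}{1^{2}}(0.01) = 0.013 \ \text{m/s}.$$

As a sanity check against the max/min hint above: $v_{\max} = 1.201/0.99 \approx 1.2131$ and $v_{\min} = 1.199/1.01 \approx 1.1871$, deviations of roughly $+0.0131$ and $-0.0129$ from $v = 1.2$ m/s, so the linear estimate of $0.013$ agrees to leading order.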