Computing the error of an approximation


I had an idea to approximate a function $f(t)=\frac{v(t)}{k}$, and I now want to find out how big the error is. My idea was to approximate the function by a step function: the time axis is partitioned into equidistant intervals of length $\tau$, with nodes $\tau_i = i\tau$, so that on each interval $[\tau_i,\tau_{i+1}]$ I use the constant function $f_i= \frac{v_i}{k}$, where $v_i=\frac{v(\tau_{i+1})-v(\tau_i)}{\tau}$.

Now I would like to determine the error, i.e. the distance between $f$ and $f_i$ on each interval, and how to express it in terms of $\tau$.
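A minimal sketch of how that error could be measured numerically, assuming a concrete (hypothetical) choice $v(t)=\sin t$ and $k=1$; the function names `v`, `f`, and `max_error` are my own, not from the original setup. It evaluates the constant value $f_i = v_i/k$ on each interval exactly as defined above and records the largest deviation $|f(t)-f_i|$ over a sample grid:

```python
import math

K = 1.0  # assumed constant k


def v(t):
    """Hypothetical example function; swap in the actual v(t)."""
    return math.sin(t)


def f(t):
    return v(t) / K


def max_error(tau, t_end=1.0, samples_per_interval=50):
    """Largest |f(t) - f_i| over [0, t_end], where on each interval
    [tau_i, tau_{i+1}] the constant is f_i = (v(tau_{i+1}) - v(tau_i)) / (k * tau),
    matching the definition in the question."""
    n = int(round(t_end / tau))
    err = 0.0
    for i in range(n):
        t0, t1 = i * tau, (i + 1) * tau
        f_i = (v(t1) - v(t0)) / (K * tau)  # constant value on this interval
        for j in range(samples_per_interval + 1):
            t = t0 + j * (t1 - t0) / samples_per_interval
            err = max(err, abs(f(t) - f_i))
    return err


for tau in (0.1, 0.05, 0.025):
    print(tau, max_error(tau))
```

Printing the measured maximum for a few values of $\tau$ makes it easy to see empirically how (or whether) the error shrinks as $\tau \to 0$ for a given $v$, which can then be compared against any analytic bound in terms of $\tau$.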