I am developing an algorithm that approximates a curve using a series of linear* segments. The plot below is an example: the blue is the original curve, and the red and yellow are an example of a two-segment approximation. The x-axis is time and the y-axis is attenuation in dB. The ultimate goal is to use the fewest segments while keeping the maximum error below a certain value. Clearly, since some portions of the curve are more linear than others, some segments can be longer than others.
*The line segments are not actually linear; they are logarithmic. I am operating in log (dB) space, while the segments themselves are linear in voltage.
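For concreteness, the dB/voltage relationship behind that footnote is the standard amplitude conversion. A small Python sketch (the helper names here are mine, not from the question):

```python
import math

def db_to_lin(db):
    """Amplitude (voltage) ratio corresponding to a dB value."""
    return 10 ** (db / 20)

def lin_to_db(v):
    """dB value corresponding to an amplitude (voltage) ratio."""
    return 20 * math.log10(v)
```

A segment that is a straight line in voltage therefore traces a logarithmic curve once plotted in dB, which is why the error must be evaluated in the space where the tolerance is specified.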
My question is: how do I approach such an optimization problem? The method I am currently considering is to start at one end, select the longest segment that stays within the error bound, then move to the next segment and do the same. This will be good enough, in that it will keep me under my max error, but it may not be the best method. If it matters, I am working in Matlab.
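The greedy left-to-right method described above can be sketched as follows. This is a Python illustration (the question mentions Matlab, but the same structure translates directly); the function name and the assumption that `t` is strictly increasing and the error is measured at the sample points are mine:

```python
def greedy_segments(t, y, tol):
    """Greedy left-to-right fit: from the current start point, extend each
    segment as far as possible while the maximum deviation of the samples
    from the chord stays below tol. Returns the breakpoint indices."""
    breakpoints = [0]
    i, n = 0, len(t)
    while i < n - 1:
        j_best = i + 1
        for j in range(i + 1, n):
            # max deviation of samples i..j from the chord (i, j)
            slope = (y[j] - y[i]) / (t[j] - t[i])
            err = max(abs(y[k] - (y[i] + slope * (t[k] - t[i])))
                      for k in range(i, j + 1))
            if err <= tol:
                j_best = j
            else:
                break
        breakpoints.append(j_best)
        i = j_best
    return breakpoints
```

By construction every segment respects the tolerance, but because the method never reconsiders earlier choices it may use more segments than necessary.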

One approach would be this:
1) Start with a single segment that connects the start and end points of the curve.
2) Find the point on the curve where the approximation error is maximum.
3) If that error is below the tolerance, go to step 5; otherwise continue.
4) Add a breakpoint exactly where the error is maximum, thus breaking the corresponding segment into two new segments of lower error. Go to step 2.
5) Find the segment with the maximum height in y.
6) If that height is below the limit, STOP; otherwise break the segment in half and go to step 5.
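Steps 1 through 4 can be sketched as below. This is a Python illustration of the refinement loop only (the height-limit pass of steps 5 and 6 is omitted); the function name, the sampled representation of the curve, and the choice to measure error at the sample points are my assumptions:

```python
import bisect

def refine_fit(t, y, tol):
    """Steps 1-4: start with one chord over the whole curve and repeatedly
    insert a breakpoint at the sample of maximum error until every segment
    is within tol. Returns the sorted breakpoint indices."""
    bps = [0, len(t) - 1]          # step 1: one segment, start to end
    while True:
        worst_err, worst_k = 0.0, None
        for a, b in zip(bps[:-1], bps[1:]):
            slope = (y[b] - y[a]) / (t[b] - t[a])
            for k in range(a + 1, b):
                err = abs(y[k] - (y[a] + slope * (t[k] - t[a])))
                if err > worst_err:          # step 2: track the worst point
                    worst_err, worst_k = err, k
        if worst_err <= tol:                 # step 3: everything within tolerance
            return bps
        bisect.insort(bps, worst_k)          # step 4: split at the worst point
```

Each insertion splits exactly one segment at its worst point, so the loop terminates once no sample deviates by more than `tol`. Note that, per the footnote in the question, in the real application the chord would be linear in voltage while the error is measured in dB, so the interpolation step would need the conversion between the two spaces.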