I have a set of points $(x_1, y_1), (x_2, y_2), \dots, (x_n, y_n)$ and a function $y = f(x)$, where $f$ is an interpolation of the dataset. Given a target $y$-coordinate $y_t$ and a required precision $\epsilon$, how can I find a point $A(x_A, y_A = f(x_A))$ such that $\lvert f(x_A) - y_t \rvert < \epsilon$, where $x_k \le x_A \le x_{k+1}$ and $y_k \le y_t \le y_{k+1}$?
Context: I'm working with the chart.js library and need to change the line style when the line crosses a certain horizontal border, $y \ge y_t$. The library only allows styling of line segments, which are generated from the points in the input dataset. The solution is therefore to generate additional points "exactly" on the border.
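To illustrate the workaround, here is a minimal sketch (names like `withBorderPoints` are my own, not chart.js API) that inserts a point on $y = y_t$ into every dataset segment that crosses the border. It assumes straight-line segments between points, so linear interpolation gives the crossing exactly; for a curved interpolation the crossing would have to be located numerically instead.

```typescript
type Point = { x: number; y: number };

// Return a copy of the data with an extra point on y = yt wherever a
// segment crosses that border (assumes straight segments between points).
function withBorderPoints(data: Point[], yt: number): Point[] {
  const out: Point[] = [];
  for (let i = 0; i < data.length; i++) {
    const p = data[i];
    out.push(p);
    const q = data[i + 1];
    if (!q) break;
    // strict crossing: yt lies strictly between p.y and q.y
    if ((p.y - yt) * (q.y - yt) < 0) {
      const t = (yt - p.y) / (q.y - p.y); // linear parameter in [0, 1]
      out.push({ x: p.x + t * (q.x - p.x), y: yt });
    }
  }
  return out;
}
```

With the extra points in place, chart.js segment styling can then color the pieces above and below the border differently.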
I can think of at least two approaches.
- Binary search. Set $x_A = x_k + \frac{x_{k+1} - x_k}{2}$ and calculate $y_A$; if it is not precise enough, make $A$ the new upper or lower bound of the search interval, depending on the sign of the error, and repeat.
- Linearity assumption. The same as binary search, but instead of taking the midpoint I calculate $x_A = x_k + \frac{(x_{k+1}-x_k)(y_t-y_k)}{y_{k+1}-y_k}$ (this is the classical false-position / regula falsi update).
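The two approaches above can be sketched as follows, assuming only that $f$ is monotonic on the bracketing interval (function and variable names are illustrative, not from any library):

```typescript
type Fn = (x: number) => number;

// Approach 1: bisection on x until |f(x) - yt| < eps.
function bisect(f: Fn, xLo: number, xHi: number, yt: number, eps: number): number {
  // flip the comparison sign if f is decreasing on the interval
  const sign = f(xHi) >= f(xLo) ? 1 : -1;
  let lo = xLo, hi = xHi;
  for (let i = 0; i < 100; i++) { // hard cap on iterations
    const mid = lo + (hi - lo) / 2;
    const err = f(mid) - yt;
    if (Math.abs(err) < eps) return mid;
    if (sign * err < 0) lo = mid; else hi = mid;
  }
  return lo + (hi - lo) / 2;
}

// Approach 2: the "linearity assumption", i.e. false position (regula falsi):
// intersect the secant through the bracket endpoints with y = yt.
function falsePosition(f: Fn, xLo: number, xHi: number, yt: number, eps: number): number {
  let lo = xLo, hi = xHi;
  let yLo = f(lo), yHi = f(hi);
  for (let i = 0; i < 100; i++) {
    const x = lo + ((hi - lo) * (yt - yLo)) / (yHi - yLo);
    const y = f(x);
    if (Math.abs(y - yt) < eps) return x;
    // keep the endpoint on the opposite side of yt, so the bracket survives
    if ((y - yt) * (yLo - yt) < 0) { hi = x; yHi = y; } else { lo = x; yLo = y; }
  }
  return lo;
}
```

Bisection halves the error bound every step regardless of $f$; false position usually converges faster when $f$ is close to linear on the interval, but can be slower on strongly curved functions because one endpoint may stay fixed.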
The question is: is there a way to tell which approach is faster (requires fewer iterations)? Or are there other methods?
I can guarantee that $f$ is monotonic on the interval $[x_k, x_{k+1}]$.