In school, I learned numerous ways to compare a set of data points against a curve, such as linear regression, and ways to compare two sets of data against each other, such as the chi-squared test. What is the most accepted way to do the same for two curves? For example, in the attached picture I am comparing the blue ($y = \sin(2\pi x)$) and green ($y = \sin(\pi x) - 0.1$) curves to the black curve ($y = \sin(\pi x)$) to see which is a better fit. If I sampled only at integer values of $x$ and compared the blue and black curves there, I would find a perfect match, when in reality the green curve matches the black curve much more closely at most other sampling resolutions. I understand this might be an edge case, but what is a reliable way to decide how closely one curve matches another?

[Figure: the blue, green, and black curves plotted together]
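To make the pitfall concrete, here is a minimal sketch (my own illustration, not from the question) comparing the three curves with a naive sum of squared differences. The helper name `sse` and the choice of sample grids are mine; the point is that sampling at integers makes blue look like a perfect match, while a finer grid reveals green is far closer.

```python
import math

black = lambda x: math.sin(math.pi * x)        # y = sin(pi x)
blue  = lambda x: math.sin(2 * math.pi * x)    # y = sin(2 pi x)
green = lambda x: math.sin(math.pi * x) - 0.1  # y = sin(pi x) - 0.1

def sse(f, g, xs):
    """Sum of squared differences between f and g at the sample points xs."""
    return sum((f(x) - g(x)) ** 2 for x in xs)

ints = range(0, 5)                        # integer samples only
fine = [i / 100 for i in range(0, 401)]   # fine grid on [0, 4]

# Both sines vanish at every integer, so blue looks like a perfect match:
print(sse(blue, black, ints))   # ≈ 0 (up to floating-point noise)
print(sse(green, black, ints))  # ≈ 0.05

# A finer grid tells the opposite (and correct) story:
print(sse(blue, black, fine) > sse(green, black, fine))  # True
```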
How do I find the error of a curve rather than points?
36 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
One option is the Kullback–Leibler divergence, defined for non-negative (density-like) functions:
$$\int\limits_{-\infty}^\infty f(x) \log \left( \frac{f(x)}{g(x)}\right)\ dx$$
As far as I know, there is no corresponding measure for general continuous functions (which may take negative values). The continuous analogue of the sum of squared errors, though, is proportional to
$$\int\limits_{-\infty}^\infty (f(x) - g(x))^2\ dx,$$
suitably normalized.
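As a numerical sanity check of the KL formula above, here is a sketch (assumptions: midpoint-rule quadrature, my own helper names `kl` and `gauss`) that approximates $\int f \log(f/g)\,dx$ for two unit-variance Gaussians, where the closed form is known to be $\mu^2/2$.

```python
import math

def kl(f, g, a, b, n=200_000):
    """Midpoint-rule approximation of the KL divergence ∫ f log(f/g) dx on [a, b]."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        fx, gx = f(x), g(x)
        if fx > 0:
            total += fx * math.log(fx / gx) * h
    return total

def gauss(mu):
    """Density of a normal distribution with mean mu and unit variance."""
    return lambda x: math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

# KL(N(0,1) || N(1,1)) has the closed form mu^2 / 2 = 0.5
print(kl(gauss(0.0), gauss(1.0), -10, 10))  # ≈ 0.5
```

The truncation to $[-10, 10]$ is harmless here because the Gaussian tails beyond that range are negligible.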
The essence of least squares, or linear regression, is to minimize the $l^2$ (discrete case) or $L^2$ (continuous case) norm of the error. So a measure of how well a curve $f(x)$ approximates a curve $g(x)$ on an interval $[a,b]$ is given by $$\int_{a}^{b} |f(x)-g(x)|^2\, dx =: \|f-g\|^2_{[a,b]}.$$ This measures the error on $[a,b]$; the curve $f(x)$ with the smaller norm is the better approximation of $g(x)$.
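Applying this $L^2$ criterion to the curves in the question settles the matter. A minimal sketch (midpoint-rule quadrature; the helper name `l2_error_sq` is mine): on $[0, 2]$ the cross term $\int \sin(2\pi x)\sin(\pi x)\,dx$ vanishes by orthogonality, so $\|{\rm blue}-{\rm black}\|^2 = 2$ exactly, while $\|{\rm green}-{\rm black}\|^2 = (0.1)^2 \cdot 2 = 0.02$.

```python
import math

def l2_error_sq(f, g, a, b, n=100_000):
    """Midpoint-rule approximation of ||f - g||^2 = ∫_a^b |f(x) - g(x)|^2 dx."""
    h = (b - a) / n
    return sum((f(a + (i + 0.5) * h) - g(a + (i + 0.5) * h)) ** 2
               for i in range(n)) * h

black = lambda x: math.sin(math.pi * x)
blue  = lambda x: math.sin(2 * math.pi * x)
green = lambda x: math.sin(math.pi * x) - 0.1

print(l2_error_sq(blue,  black, 0, 2))   # ≈ 2.0
print(l2_error_sq(green, black, 0, 2))   # ≈ 0.02  -> green is the better fit
```

Unlike sampling at integers, this verdict is stable: refining `n` or shifting the interval does not change which curve wins.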