$L^2$ vs $L^{\infty}$ norm for interpolation


Under what circumstances should I consider using the $L^2$ norm rather than the $L^{\infty}$ norm when interpolating a function from sample points?

Probably related: Does the answer differ significantly depending on the class of approximating functions (polynomials, splines, etc.) I use?

While I know the underlying function is differentiable, the only sample data I have are pairs $x$ and $f(x)$; I have no derivative values at any point.
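To make the distinction concrete, here is a minimal sketch (assuming NumPy, and using the Runge function $f(x) = 1/(1+25x^2)$ as a stand-in for the unknown function) that fits a polynomial to samples by least squares, which minimizes the discrete $L^2$ error, and then measures both the $L^2$ and the $L^{\infty}$ (sup) error of the result:

```python
import numpy as np

# Stand-in for the unknown differentiable function (only x, f(x) are "known").
f = lambda x: 1.0 / (1.0 + 25.0 * x**2)

# Sample points on [-1, 1].
x = np.linspace(-1.0, 1.0, 50)

# Degree-8 least-squares polynomial fit: minimizes the discrete L^2 error
# at the sample points.
coeffs = np.polyfit(x, f(x), deg=8)
p = np.poly1d(coeffs)

# Measure both error norms of the fit on a fine grid.
grid = np.linspace(-1.0, 1.0, 2001)
err = np.abs(f(grid) - p(grid))
h = grid[1] - grid[0]
l2_err = np.sqrt(np.sum(err**2) * h)   # approximate continuous L^2 norm
sup_err = err.max()                    # approximate L^inf (sup) norm

print(f"L2 error  ~ {l2_err:.4f}")
print(f"sup error ~ {sup_err:.4f}")
```

A least-squares fit can have a small $L^2$ error while the pointwise error spikes near the interval ends, so the two norms can tell quite different stories about the same approximation; a minimax ($L^{\infty}$-optimal) fit would instead equalize the worst-case error.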

(Sorry if this is a bit open-ended; I will try to narrow the question down based on your responses.)