Lagrange error for vector-valued linear interpolation?


If $f:[a,b]\to\mathbb R$ is a $C^2$ function, and $p_1(x)$ is the linear interpolation of $f(x)$ between $a$ and $b$, then as far as I know, we get the following Lagrange form of the error between $f$ and $p_1$:

  • For all $x\in[a,b]$, we have $f(x)-p_1(x)=\frac12 f''(\zeta_x) (x-a)(x-b)$ for some $\zeta_x\in[a,b]$.

If $f:[a,b]\to\mathbb R^2$ instead, then what is the closest analogue to this statement that we can get?

Surely an equality of $\mathbb R^2$ vectors is too much to hope for? Is the best we can do something like this:

  • For all $x\in[a,b]$, we have $\|f(x)-p_1(x)\|\leq\frac12 \|f''(\zeta_x)\| (x-a)(b-x)$ for some $\zeta_x\in[a,b]$.
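As a sanity check (not a proof), here is a quick numerical experiment with the slightly weaker bound that uses $\sup_{[a,b]}\|f''\|$ in place of $\|f''(\zeta_x)\|$; the curve $f(x)=(\sin x,\cos 2x)$ on $[0,1]$ and the Euclidean norm are my own illustrative choices:

```python
import numpy as np

# Numerically test the conjectured bound
#   ||f(x) - p1(x)|| <= 1/2 * sup_{[a,b]} ||f''|| * (x - a)(b - x)
# for a sample curve f : [0,1] -> R^2 (illustrative choice, not canonical).

a, b = 0.0, 1.0
f  = lambda x: np.array([np.sin(x), np.cos(2 * x)])
f2 = lambda x: np.array([-np.sin(x), -4 * np.cos(2 * x)])  # f'' by hand

def p1(x):
    # linear interpolation of f between the endpoints a and b
    t = (x - a) / (b - a)
    return (1 - t) * f(a) + t * f(b)

xs = np.linspace(a, b, 1001)
M = max(np.linalg.norm(f2(x)) for x in xs)  # approximates sup ||f''|| on [a,b]

# largest violation of the bound over the grid (<= 0 means the bound holds)
worst = max(np.linalg.norm(f(x) - p1(x)) - 0.5 * M * (x - a) * (b - x)
            for x in xs)
print(worst <= 1e-12)
```

On this grid the inequality holds with room to spare, which at least makes the conjecture plausible for inner-product norms.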

Is that even true? If so, is there an easy proof or a standard reference that develops it?

Also, what conditions are there on the norm $\|\cdot\|$ to get this statement? Does it have to be an inner-product norm?

For comparison, there is a similar statement for a vector-valued mean value inequality in Rudin, *Principles of Mathematical Analysis*, Theorem 5.19, which is cited in the Wikipedia article on the mean value theorem.