Bounding linear interpolation error for bandlimited functions


Given a bandlimited signal $x$ with $\hat{x}(f) = 0$ for $|f|\geq B$, how can we bound the absolute error resulting from linear interpolation of its samples $x(kT)$, where $\frac{1}{T}\geq 2B$? By the sampling theorem, $$x(t) = \sum_k x(kT)\,\text{sinc}\left(\dfrac{t-kT}{T}\right).$$ So what are good upper bounds on $$e(t) = |x(t)-\tilde{x}(t)|,\quad \text{where } \tilde{x}(t)=x(kT)+\dfrac{x((k+1)T)-x(kT)}{T}(t-kT), \quad \text{for } t\in\left[ kT, (k+1)T\right]?$$ Can anyone help me here? Thanks in advance!
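As a minimal numerical sketch of the quantity being asked about, the following compares linear interpolation of Nyquist-rate samples against the true signal for an assumed two-tone test signal (the signal, band limit, and observation window are all illustrative choices, not from the question):

```python
import numpy as np

# Assumed bandlimited test signal: two tones below B = 1 Hz.
B = 1.0
def x(t):
    return np.sin(2 * np.pi * 0.7 * t) + 0.5 * np.cos(2 * np.pi * 0.9 * t)

T = 1.0 / (2 * B)                      # sampling period with 1/T >= 2B
t = np.linspace(0.0, 10.0, 20001)      # dense grid for evaluating e(t)

# Samples x(kT) covering the window, then piecewise-linear interpolation
k = np.arange(0, int(10 / T) + 2)
x_tilde = np.interp(t, k * T, x(k * T))

max_err = np.max(np.abs(x(t) - x_tilde))
print(f"max linear-interpolation error: {max_err:.4f}")
```

For a twice-differentiable interpolated function, the classical piecewise-linear bound is $e(t) \le \frac{T^2}{8}\max_t|x''(t)|$, which for this test signal evaluates to about $1.1$; the observed error should sit below that.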


Best answer:

Without more information (in particular, a bound on the signal's amplitude), the error bound is infinite.

Now, if the signal is bounded as well, you can use Bernstein's inequality, $|x'(t)| \leq 2\pi B \sup_t |x(t)|$, to get a bound on the derivative of the signal: https://dsp.stackexchange.com/questions/51617/bounds-of-the-derivative-of-a-bounded-band-limited-function?rq=1
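A quick numerical sanity check of that derivative bound, $|x'(t)| \leq 2\pi B \sup_t|x(t)|$, using an assumed pure sinusoid strictly inside the band (all parameter values here are illustrative):

```python
import numpy as np

B = 2.0          # band limit in Hz (assumed)
f = 1.5          # signal frequency, f < B (assumed)
A = 0.8          # amplitude, so sup|x| = A

t = np.linspace(0.0, 5.0, 100001)
x = A * np.sin(2 * np.pi * f * t)
dx = np.gradient(x, t)                 # numerical derivative

derivative_bound = 2 * np.pi * B * A   # 2*pi*B * sup|x|
print(np.max(np.abs(dx)), derivative_bound)
```

The observed maximum slope is $2\pi f A$, which stays below the bound $2\pi B A$ because $f < B$.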

Using that bound as a maximal rate of growth, you can draw two cones opening from the two sample points; their intersection gives upper and lower bounds on the signal value between the two points. From there, you can get lower and upper bounds on the deviation from your linear interpolation.
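The cone construction above can be sketched numerically. With slope bound $L = 2\pi B \sup|x|$, on $[kT,(k+1)T]$ the signal must lie between $\min\!\big(x(kT)+L(t-kT),\,x((k+1)T)+L((k+1)T-t)\big)$ and the mirror-image lower envelope; the gap between those envelopes and the linear interpolant bounds $e(t)$. The signal, band limit, and interval index below are assumed for illustration:

```python
import numpy as np

B, A = 1.0, 1.0
L = 2 * np.pi * B * A                  # slope bound from Bernstein's inequality
T = 1.0 / (2 * B)

def x(t):
    return A * np.sin(2 * np.pi * 0.8 * t)   # assumed example signal, f < B

k = 3                                  # examine the interval [kT, (k+1)T]
t = np.linspace(k * T, (k + 1) * T, 1001)
x0, x1 = x(k * T), x((k + 1) * T)

# Intersection of the two cones growing from the sample points
upper = np.minimum(x0 + L * (t - k * T), x1 + L * ((k + 1) * T - t))
lower = np.maximum(x0 - L * (t - k * T), x1 - L * ((k + 1) * T - t))

x_tilde = x0 + (x1 - x0) / T * (t - k * T)   # linear interpolant
err_bound = np.max(np.maximum(upper - x_tilde, x_tilde - lower))
print(f"pointwise error bound from the cones: {err_bound:.4f}")
```

The resulting bound can never exceed $LT/2$ (the cone half-height at the midpoint), and it tightens as the two samples move apart in value.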