Error estimation of a Taylor polynomial

Suppose $y : \mathbb{R} \rightarrow \mathbb{R}$ is a signal admitting a Taylor series expansion around zero (Maclaurin series) \begin{equation} y(t) = \sum_{n=0}^\infty \frac{y^{(n)}(0)}{n!}t^n \end{equation} and the Taylor polynomial \begin{equation} \widetilde{y}(t) = \sum_{n=0}^m \frac{y^{(n)}(0)}{n!}t^n \end{equation} is the truncated Taylor series of order $m$. Then the truncation error is \begin{equation} \widetilde{e}(t) = y(t) - \widetilde{y}(t) = \sum_{n=m+1}^\infty \frac{y^{(n)}(0)}{n!}t^n = \mathcal{O}(t^{m+1}) \end{equation} in big O notation.

Suppose the signal is also sampled at another point $( T,\,y(T) )$. The interpolation polynomial \begin{equation} \hat{y}(t) = \widetilde{y}(t) + \frac{y(T) - \widetilde{y}(T)}{T^{m+1}}t^{m+1} \end{equation} satisfies \begin{equation} \hat{y}(T) = y(T),\;\hat{y}^{(n)}(0)=y^{(n)}(0),\quad n=0,1,\dots,m \end{equation} I want to use this polynomial to estimate the error of the Taylor polynomial, \begin{equation} \lVert\widetilde{e}(t)\rVert\approx\lVert \hat{y}(t) - \widetilde{y}(t)\rVert \end{equation} This can be justified if the error $\hat{e}(t) = y(t) - \hat{y}(t)$ satisfies $\lVert\hat{e}(t)\rVert \ll \lVert\widetilde{e}(t)\rVert$, i.e. if \begin{equation} \hat{e}(t) = \mathcal{O}(t^{m+2}) \end{equation}

I think this statement is true, but I am not sure how to show it. It can be shown that \begin{equation} \hat{e}(t) = y(t) - \hat{y}(t) = \left(\frac{y^{(m+1)}(0)}{(m+1)!}-\frac{y(T) - \widetilde{y}(T)}{T^{m+1}}\right)t^{m+1} + \sum_{n=m+2}^\infty \frac{y^{(n)}(0)}{n!}t^n = \mathcal{O}(t^{m+1}) \end{equation} I am asking for help proving or disproving that the error of the latter polynomial is $\hat{e}(t) = \mathcal{O}(t^{m+2})$.
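For concreteness, here is a minimal numerical sketch of the setup in Python. The choices $y(t)=e^{t}$, $m=2$ and $T=0.5$ are illustrative assumptions, not part of the question; the script simply tabulates the true truncation error $\lVert\widetilde{e}(t)\rVert$ next to the proposed estimate $\lVert\hat{y}(t)-\widetilde{y}(t)\rVert$ on $[0,T]$.

```python
import math
import numpy as np

# Toy example (my own choice, not from the question): y(t) = exp(t),
# so y^(n)(0) = 1 for every n. Order m = 2, extra sample at T = 0.5.
m, T = 2, 0.5
y = np.exp

def y_tilde(t):
    """Order-m Maclaurin polynomial of y."""
    return sum(t**n / math.factorial(n) for n in range(m + 1))

# Corrected interpolant: add a t^(m+1) term so that y_hat(T) = y(T).
c = (y(T) - y_tilde(T)) / T**(m + 1)

def y_hat(t):
    return y_tilde(t) + c * t**(m + 1)

t = np.linspace(0.0, T, 6)
true_err = np.abs(y(t) - y_tilde(t))      # ||e~(t)||, the true truncation error
estimate = np.abs(y_hat(t) - y_tilde(t))  # proposed estimate ||y_hat(t) - y~(t)||
print(np.column_stack([t, true_err, estimate]))
```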

Best answer

I am going to assume for simplicity that $0 \leq t < T$. I can show that the error is $\mathcal{O}\left(T^{m+2}\right)$. This is a different statement from the one posed in the question, but it is very useful for me, and I hope it will be useful to somebody else reading this answer. I think the problem in my reasoning was that the latter interpolation has the extra parameter $T$ compared to the Taylor polynomial extrapolation.

I am going to accept my own answer, but I would appreciate any suggestions and critiques.

I am going to continue the calculation. Substituting $y(T)=\sum_{n=0}^\infty \frac{y^{(n)}(0)}{n!}T^n$ and $\widetilde{y}(T)=\sum_{n=0}^m\frac{y^{(n)}(0)}{n!}T^n$, so that $y(T)-\widetilde{y}(T)=\sum_{n=m+1}^\infty\frac{y^{(n)}(0)}{n!}T^n$, gives \begin{align} \hat{e}(t) &= y(t) - \hat{y}(t)\\ &= \left(\frac{y^{(m+1)}(0)}{(m+1)!}-\frac{y(T) - \widetilde{y}(T)}{T^{m+1}}\right)t^{m+1} + \sum_{n=m+2}^\infty \frac{y^{(n)}(0)}{n!}t^n\\ &= \left(\frac{y^{(m+1)}(0)}{(m+1)!}-\sum_{n=m+1}^\infty\frac{y^{(n)}(0)}{n!}\frac{T^n}{T^{m+1}}\right)t^{m+1} + \sum_{n=m+2}^\infty \frac{y^{(n)}(0)}{n!}t^n\\ &= \left(\frac{y^{(m+1)}(0)}{(m+1)!}-\frac{y^{(m+1)}(0)}{(m+1)!}\frac{T^{m+1}}{T^{m+1}} -\sum_{n=m+2}^\infty\frac{y^{(n)}(0)}{n!}\frac{T^n}{T^{m+1}}\right)t^{m+1} + \sum_{n=m+2}^\infty \frac{y^{(n)}(0)}{n!}t^n\\ &= \sum_{n=m+2}^\infty\left(\frac{y^{(n)}(0)}{n!}t^n - \frac{y^{(n)}(0)}{n!}T^n\left(\frac{t}{T}\right)^{m+1}\right) \end{align}

The error norm for all $0 \leq t < T$ then satisfies \begin{align} \left\lVert\hat{e}(t)\right\rVert &\leq \sum_{n=m+2}^\infty\left(\left\lVert\frac{y^{(n)}(0)}{n!}\right\rVert T^n\left(\frac{t}{T}\right)^{m+1} + \left\lVert\frac{y^{(n)}(0)}{n!}\right\rVert t^n\right)\\ &\leq \sum_{n=m+2}^\infty\left(\left\lVert\frac{y^{(n)}(0)}{n!}\right\rVert T^n + \left\lVert\frac{y^{(n)}(0)}{n!}\right\rVert T^n\right)\\ &= \sum_{n=m+2}^\infty 2\left\lVert\frac{y^{(n)}(0)}{n!}\right\rVert T^n = \mathcal{O}\left(T^{m+2}\right) \end{align} where the second inequality uses $t < T$, so that $(t/T)^{m+1} < 1$ and $t^n < T^n$, and the last equality holds provided the series $\sum_{n}\left\lVert\frac{y^{(n)}(0)}{n!}\right\rVert T^n$ converges, i.e. for $T$ inside the radius of convergence of the Maclaurin series.
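As an informal check of the $\mathcal{O}\left(T^{m+2}\right)$ behaviour, one can shrink $T$ and watch the worst-case error $\max_{0\leq t\leq T}\lVert\hat{e}(t)\rVert$: its ratio to $T^{m+2}$ should stay roughly constant. The sketch below reuses the illustrative choices $y(t)=e^{t}$ and $m=2$ from the question (again my own assumptions, not part of the original problem).

```python
import math
import numpy as np

# Check that max_{0 <= t <= T} |e_hat(t)| scales like T^(m+2) as T shrinks.
# Illustrative choices only: y(t) = exp(t), m = 2.
m = 2
y = np.exp

def y_tilde(t):
    """Order-m Maclaurin polynomial of y."""
    return sum(t**n / math.factorial(n) for n in range(m + 1))

def max_err_hat(T, samples=1001):
    c = (y(T) - y_tilde(T)) / T**(m + 1)   # coefficient of the added t^(m+1) term
    t = np.linspace(0.0, T, samples)
    return np.max(np.abs(y(t) - (y_tilde(t) + c * t**(m + 1))))

for T in (0.8, 0.4, 0.2, 0.1):
    e = max_err_hat(T)
    print(f"T = {T:4.2f}   max|e_hat| = {e:.3e}   e / T^(m+2) = {e / T**(m + 2):.3e}")
```

If the bound above is right, the printed ratio $e / T^{m+2}$ should settle to a roughly constant value as $T$ is halved, while the raw error shrinks by about $2^{m+2}$ per halving.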