Inverse Chebyshev Recurrence


The Chebyshev polynomials (of the first kind) are a sequence of polynomials defined recursively by $$ \begin{cases} T_{0}(x) = 1 \\ T_{1}(x) = x \\ T_{n}(x) = 2xT_{n-1}(x) - T_{n-2}(x) \end{cases} $$

For all other properties, which I haven't needed so far, I'll refer to Mathematica's documentation.

Using this recurrence relation, it is easy to expand given linear combinations of Chebyshev polynomials as linear combinations of powers of $x$. What I do in Matlab is choose an upper bound $n$ for the degree of polynomials I'll be working with and then write the conversion out explicitly in matrix form. For example, if $n = 4$, I'd have: $$ A = \begin{bmatrix} 1& 0& -1& 0& 1 \\ 0& 1& 0& -3& 0 \\ 0& 0& 2& 0& -8 \\ 0& 0& 0& 4& 0 \\ 0& 0& 0& 0& 8 \end{bmatrix} $$ where the columns are constructed from left to right using the recurrence relation. If $c$ is a vector of coefficients of a polynomial (of degree $\leq 4$) with respect to the Chebyshev basis, then its coefficients with respect to the monomial basis are simply given as $A \cdot c$.
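For concreteness, here is a sketch of this construction in Python/NumPy (standing in for the Matlab code described above): column $k$ of $A$ holds the monomial coefficients of $T_k$, and each new column is produced from the previous two via the recurrence.

```python
import numpy as np

def cheb_to_mono_matrix(n):
    """Build the (n+1)x(n+1) matrix A whose column k holds the monomial
    coefficients of T_k, using T_k = 2x*T_{k-1} - T_{k-2}."""
    A = np.zeros((n + 1, n + 1))
    A[0, 0] = 1.0          # T_0 = 1
    if n >= 1:
        A[1, 1] = 1.0      # T_1 = x
    for k in range(2, n + 1):
        # multiplying by x shifts the coefficient vector up by one degree
        A[1:, k] = 2 * A[:-1, k - 1]
        A[:, k] -= A[:, k - 2]
    return A

A = cheb_to_mono_matrix(4)
# c holds Chebyshev-basis coefficients; A @ c gives monomial coefficients
c = np.array([0.0, 0.0, 1.0, 0.0, 0.0])   # the polynomial T_2
print(A @ c)                               # -> [-1, 0, 2, 0, 0], i.e. 2x^2 - 1
```

For $n = 4$ this reproduces the matrix displayed above.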

If I want to go the other way around, that is, rewriting a linear combination of powers of x as a linear combination of Chebyshev polynomials, I can just invert this matrix. Needless to say, this fails numerically as soon as $n$ gets large enough (the elements on the main diagonal of $A$ are powers of 2). I was wondering if there is a better way to obtain this inverse matrix, perhaps by using some inverse recurrence formula?

Kind regards

Nicolas

There are 4 answers below.

Answer by Jack D'Aurizio

We may think of this problem in these terms: writing $T_n(x)$ as a linear combination of monomials is the same as expressing $\cos(nx)$ in terms of $\cos(x),\cos^2(x),\ldots$ Conversely, expressing $x^n$ in terms of $T_0(x),\ldots,T_n(x)$ is the same as expressing $\cos^n(x)$ in terms of $1,\cos(x),\cos(2x),\ldots$, i.e. computing the Fourier cosine series of $\cos^n(x)$. That is an easy task through the binomial theorem, since:

$$ \cos^n(x) = \left(\frac{e^{ix}+e^{-ix}}{2}\right)^n = \frac{1}{2^n}\sum_{j=0}^{n}\binom{n}{j}e^{(n-2j)ix}\tag{1}$$ gives: $$ \cos^{2n}(x)=\frac{1}{4^n}\binom{2n}{n}+\frac{2}{4^n}\sum_{j=1}^{n}\binom{2n}{n+j}\cos(2jx)\tag{2}$$ so: $$ x^{2n} = \frac{1}{4^n}\binom{2n}{n}+\frac{2}{4^n}\sum_{j=1}^{n}\binom{2n}{n+j} T_{2j}(x).\tag{3}$$ The odd case is similar.
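Formula $(3)$ can be checked numerically: the snippet below (a sketch, using $T_{2j}(x) = \cos(2j \arccos x)$ to evaluate the Chebyshev polynomials) compares the binomial expansion against $x^{2n}$ directly.

```python
import math

def x_even_power_via_cheb(n, x):
    """Evaluate x^(2n) through formula (3):
    x^(2n) = C(2n,n)/4^n + (2/4^n) * sum_j C(2n,n+j) * T_{2j}(x)."""
    total = math.comb(2 * n, n) / 4**n
    for j in range(1, n + 1):
        # T_{2j}(x) = cos(2j * arccos x) for x in [-1, 1]
        total += 2 / 4**n * math.comb(2 * n, n + j) * math.cos(2 * j * math.acos(x))
    return total

x = 0.3
print(x_even_power_via_cheb(3, x), x**6)  # both ≈ 0.000729
```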

The solution to the inverse problem is given by the formula $(15)$ here: it can be achieved through the binomial theorem, too. This sheds some light on the relation between the Lagrange inversion formula and the inversion of structured matrices (Toeplitz matrices, in this case) through Cramer's rule or Gaussian elimination.

Answer by KittyL

The matrix is in triangular form, so inverting the map amounts to a substitution step, as in Gaussian elimination. Look at it this way: $$\begin{pmatrix}1&0&0&0&0\\0&1&0&0&0\\-1&0&2&0&0\\0&-3&0&4&0\\1&0&-8&0&8\end{pmatrix}\begin{pmatrix}x_1\\x_2\\x_3\\x_4\\x_5\end{pmatrix}=\begin{pmatrix}T_0\\T_1\\T_2\\T_3\\T_4\end{pmatrix}$$

You solve this triangular system by forward substitution, starting from the first equation: $$x_1=T_0\\x_2=T_1\\x_3=\frac{1}{2}(T_2+T_0)\\\cdots $$

In Matlab, you can write a for loop to easily solve this.
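Such a loop might look like the following (a Python/NumPy sketch of the Matlab idea; it solves the upper-triangular system $Ac = m$ from the question by back substitution instead of forming $A^{-1}$ explicitly):

```python
import numpy as np

def mono_to_cheb(m):
    """Given monomial coefficients m of a degree-n polynomial, recover its
    Chebyshev coefficients by back substitution on the upper-triangular A,
    without forming inv(A)."""
    n = len(m) - 1
    # rebuild A column by column from the recurrence (as in the question)
    A = np.zeros((n + 1, n + 1))
    A[0, 0] = 1.0
    if n >= 1:
        A[1, 1] = 1.0
    for k in range(2, n + 1):
        A[1:, k] = 2 * A[:-1, k - 1]
        A[:, k] -= A[:, k - 2]
    # back substitution: A is upper triangular, so solve from the bottom up
    c = np.zeros(n + 1)
    for i in range(n, -1, -1):
        c[i] = (m[i] - A[i, i + 1:] @ c[i + 1:]) / A[i, i]
    return c

print(mono_to_cheb([0.0, 0.0, 1.0]))  # x^2 = (T_0 + T_2)/2 -> [0.5, 0, 0.5]
```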

And since the matrix is always triangular, even forming the explicit inverse is not too bad.

Answer by uranix

Let $P(x)$ be the polynomial you want expressed as $\sum c_k T_k(x)$. The Chebyshev polynomials are orthogonal with respect to the following inner product: $$ \langle T_k, T_l\rangle \equiv \sum_{j=1}^n T_k(x_j) T_l(x_j) = \begin{cases} n, & k = l = 0\\ \frac{n}{2}, & 0 < k = l < n\\ 0, & k \neq l,\, k,l < n \end{cases} $$ Here $x_k, k = 1,\dots,n$ are the roots of the polynomial $T_n(x)$, which have a simple form (due to $T_n(x) = \cos n \arccos x$) $$ x_k = \cos \left(\frac{2k-1}{2n}\pi\right), \quad k = 1, \dots, n $$ Using this orthogonality you can express $c_k$ as $$ c_k = \frac{\langle T_k, P\rangle}{\langle T_k, T_k\rangle} $$
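A minimal Python/NumPy sketch of this approach (the function name `cheb_coeffs` is my own; it assumes $P$ has degree $< n$, as the discrete orthogonality requires):

```python
import numpy as np

def cheb_coeffs(P, n):
    """Chebyshev coefficients c_0..c_{n-1} of a callable polynomial P of
    degree < n, via the discrete inner product at the roots of T_n."""
    k = np.arange(1, n + 1)
    x = np.cos((2 * k - 1) * np.pi / (2 * n))   # roots of T_n
    Px = P(x)
    c = np.zeros(n)
    for j in range(n):
        Tj = np.cos(j * np.arccos(x))           # T_j at the nodes
        norm = n if j == 0 else n / 2           # <T_j, T_j> from the formula
        c[j] = (Tj @ Px) / norm
    return c

print(cheb_coeffs(lambda x: x**2, 4))  # x^2 -> approx [0.5, 0, 0.5, 0]
```

Note that no triangular matrix appears at all here: each coefficient comes from a single inner product at the Chebyshev nodes.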

Edit. Also, if you care about numerical stability (especially when $n$ is big), you should not compute $T_n(x)$ by direct summation of $$ T_n(x) = \sum_{k=0}^n a_k x^k $$ but rather use the explicit form $$ T_n(x) = \cos n \arccos x $$ or the recurrence relation $$ T_n(x) = 2xT_{n-1}(x) - T_{n-2}(x). $$ Even computing $$ T_n(x) = 2^{n-1}\prod_{k=1}^n (x - x_k) $$ is very sensitive to the ordering of the factors (noticeable when $n \gtrsim 100$). The optimal ordering was given in Lebedev, V. I., and S. A. Finogenov. "Solution of the parameter ordering problem in Chebyshev iterative methods." USSR Computational Mathematics and Mathematical Physics 13.1 (1974): 21-41.
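The recommended recurrence evaluation can be sketched as follows (checked here against the closed form $\cos(n \arccos x)$, which is valid on $[-1,1]$):

```python
import math

def cheb_T(n, x):
    """Evaluate T_n(x) with the three-term recurrence
    T_n = 2x*T_{n-1} - T_{n-2}, rather than summing a_k x^k."""
    if n == 0:
        return 1.0
    t_prev, t = 1.0, x
    for _ in range(n - 1):
        t_prev, t = t, 2 * x * t - t_prev
    return t

x = 0.7
print(cheb_T(50, x), math.cos(50 * math.acos(x)))  # the two agree closely
```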

Answer by the asker (Nicolas)

Thank you for your answers.

To try these methods, I've taken some of my problems (e.g. partial sums of Taylor series), converted them to the Chebyshev basis and then converted them back. For my purposes, I've found none of the given answers to be more effective than the Matlab inverse.

  • Jack D'Aurizio's answer, while interesting, provides little numerical stability.
  • KittyL's answer yields more or less the same results as the Matlab inverse, confirming that Matlab efficiently inverts triangular matrices.
  • uranix's answer, against my own expectations, isn't stable either, not even for low-degree polynomials (degree < 20).