Generalized Chebyshev Polynomials and trigonometric identities


The usual Chebyshev polynomials of the first kind are defined by the recursive relation: $$T_0(x)=1,\\ T_1(x)=x,\\ T_{n+1}(x)=2xT_n(x)-T_{n-1}(x)$$ One can use the same recursive relation to define the Chebyshev Polynomials of the second kind, by slightly changing the initial condition: $$U_0(x)=1,\\ U_1(x)=2x,\\ U_{n+1}(x)=2xU_n(x)-U_{n-1}(x)$$
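The recurrence is easy to run mechanically. As a quick sketch (not part of the original question), here is one way to do it in Python, representing a polynomial as a list of coefficients where index $k$ holds the coefficient of $x^k$; the helper names are mine:

```python
def cheb_rec(p0, p1, n):
    """n-th term of P_{k+1} = 2x*P_k - P_{k-1} from initial polynomials p0, p1.

    Polynomials are coefficient lists: index k holds the coefficient of x^k.
    """
    def times_2x(p):                      # multiply by 2x: shift degree up, double
        return [0] + [2 * c for c in p]

    def minus(p, q):                      # coefficient-wise p - q, zero-padded
        m = max(len(p), len(q))
        p, q = p + [0] * (m - len(p)), q + [0] * (m - len(q))
        return [a - b for a, b in zip(p, q)]

    prev, cur = p0, p1
    for _ in range(n - 1):
        prev, cur = cur, minus(times_2x(cur), prev)
    return p0 if n == 0 else cur

# Same recurrence, different initial conditions:
print(cheb_rec([1], [0, 1], 3))   # T_3(x) = 4x^3 - 3x  ->  [0, -3, 0, 4]
print(cheb_rec([1], [0, 2], 3))   # U_3(x) = 8x^3 - 4x  ->  [0, -4, 0, 8]
```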

An alternative (but equivalent) way to define the Chebyshev Polynomials is through the trigonometric identities: $$T_n(\cos(\theta))=\cos(n\theta)\\ U_n(\cos(\theta))\sin(\theta)=\sin((n+1)\theta)$$
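These identities are easy to sanity-check numerically. The sketch below (my own illustration, with made-up helper names) evaluates the recurrence pointwise at $x = \cos(\theta)$ and compares against $\cos(n\theta)$ and $\sin((n+1)\theta)$:

```python
import math

def cheb(n, x, c0, c1):
    # n-th term of c_{k+1} = 2x*c_k - c_{k-1}, evaluated at the number x
    prev, cur = c0, c1
    for _ in range(n - 1):
        prev, cur = cur, 2 * x * cur - prev
    return c0 if n == 0 else cur

theta = 0.7
x = math.cos(theta)
n = 5
t = cheb(n, x, 1.0, x)        # T_n(cos θ), from initial values 1, x
u = cheb(n, x, 1.0, 2 * x)    # U_n(cos θ), from initial values 1, 2x
# T_n(cos θ) = cos(nθ) and U_n(cos θ) sin θ = sin((n+1)θ):
print(abs(t - math.cos(n * theta)))                            # ~0
print(abs(u * math.sin(theta) - math.sin((n + 1) * theta)))    # ~0
```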

I am interested in considering "generalized" Chebyshev Polynomials, which are defined through the same recursive relation: $$P_{n+1}(x)=2xP_n(x)-P_{n-1}(x)$$ where this time the initial condition can be more general ($P_0$ would be some constant and $P_1$ would be some polynomial of degree one). Given a generalized Chebyshev Polynomial, I am interested to know whether it can also be defined through some nice trigonometric identity, similar to $T_n$ and $U_n$. My question is:

  1. Does anyone have an idea on how to approach such a problem? I can just plug in $\cos(\theta)$ as $x$ and try to see what I can get, but perhaps there is a smarter (less ad hoc) way, which does not require me to fix a specific initial condition.

  2. Has this been done somewhere? Even for specific cases (other than the two I stated). It seems likely that this problem has been visited in the past, so it'd be good to know if there are some relevant existing results.

Thanks in advance.

On BEST ANSWER

In general, if you have a recurrence relation of the form $$ a_0, \dots, a_{n-1} \text{ arbitrary}, \quad a_m = \sum_{i=1}^{n} c_i a_{m-i} \ \ (m \ge n), \quad c_i \in R $$ (where $R$ is some ring, commutative and unital, say), the set of sequences $(a_m)$ satisfying that recurrence relation is a free $R$-module of rank $n$; that is simply because there is a unique solution for each choice of the $n$ initial parameters.

I mention this because $T_n$ and $U_n$ satisfy the same recurrence relation; here the relevant ring is the ring $\mathcal R$ of (let's say) analytic functions in one variable $x$ over the real numbers (polynomials and trigonometric functions are analytic). By analytic I mean admitting a convergent Taylor expansion everywhere.

For initial parameters $a_0, a_1 \in \mathcal R$, let $C(a_0,a_1)_n$ denote the corresponding solution to your recurrence relation, so that $C(a_0,a_1)_n$ is characterized by $$ C(a_0,a_1)_0 = a_0, \quad C(a_0,a_1)_1 = a_1, \quad C(a_0,a_1)_n = 2x C(a_0,a_1)_{n-1} - C(a_0,a_1)_{n-2}. $$ Notice that for $a_0,a_1,b_0,b_1 \in \mathcal R$, we have $$ C(a_0,a_1)_n + C(b_0,b_1)_n = C(a_0 + b_0, a_1 + b_1)_n $$ and for $\lambda \in \mathcal R$, $$ C(\lambda a_0, \lambda a_1)_n = \lambda C(a_0,a_1)_n $$ by $\mathcal R$-linearity of the condition imposed on $C$ (this is what I meant by $\mathcal R$-linearity; I do not mean $\mathbb R$-linearity!).
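At any fixed numerical value of $x$, this $\mathcal R$-linearity reduces to ordinary linearity of the solution in the initial values, which can be sanity-checked directly. A small sketch of my own (helper names are mine):

```python
def C(a0, a1, n, x):
    # n-th term of the recurrence c_{k+1} = 2x*c_k - c_{k-1} at a fixed number x,
    # from numeric initial values a0 (term 0) and a1 (term 1)
    prev, cur = a0, a1
    for _ in range(n - 1):
        prev, cur = cur, 2 * x * cur - prev
    return a0 if n == 0 else cur

x, n = 0.3, 7
# additivity and scaling in the initial conditions, evaluated at one sample point
print(C(1 + 2, 4 + 5, n, x) - (C(1, 4, n, x) + C(2, 5, n, x)))   # 0 up to rounding
print(C(3 * 1, 3 * 4, n, x) - 3 * C(1, 4, n, x))                 # 0 up to rounding
```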

We have $$ C(1,x)_n = T_n, \qquad C(1,2x)_n = U_n. $$ If $f,g \in \mathcal R$ are arbitrary, the $\mathcal R$-linearity shows that $$ C(f,g)_n = f C(1,0)_n + g C(0,1)_n, $$ so it suffices to compute only those two basis solutions.

We see that as vectors in $\mathcal R^{\oplus 2}$, $$ 2(1,x) - (1,2x) = (1,0) $$ so $$ C(1,0)_n = 2 T_n - U_n. $$ Now let us have a look at $C(0,1)_n$. We have $$ a_0 = 0 \\ a_1 = 1 \\ a_2 = 2x(1) - 0 = 2x $$ and the recurrence relation continues afterwards exactly like $U_n$, since it has the same initial conditions, just shifted by one index. That means $$ \forall n \ge 1, \quad C(0,1)_n = U_{n-1}, \quad C(0,1)_0 = 0. $$ Therefore, if $f, g \in \mathcal R$ are arbitrary analytic functions, you will have, for $n \ge 1$, $$ C(f,g)_n = f C(1,0)_n + g C(0,1)_n = f(2 T_n - U_n) + g U_{n-1}. $$ For instance, if $f=1$ and $g=x$, you have $$ T_n = C(1,x)_n = (2 T_n - U_n) + x U_{n-1}. $$ (For $n=0$, we instead have $T_0 = C(1,0)_0 + x \, C(0,1)_0 = 1 + 0 = 1$; the formula with $U_{n-1}$ only applies for $n \ge 1$, since $C(0,1)$ is defined separately at $n=0$.)
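Both basis formulas, $C(1,0)_n = 2T_n - U_n$ and $C(0,1)_n = U_{n-1}$, can be checked numerically with a small pointwise evaluator (my own sketch; the names are mine):

```python
def C(a0, a1, n, x):
    # n-th term of c_{k+1} = 2x*c_k - c_{k-1} at a fixed number x,
    # from numeric initial values a0 (term 0) and a1 (term 1)
    prev, cur = a0, a1
    for _ in range(n - 1):
        prev, cur = cur, 2 * x * cur - prev
    return a0 if n == 0 else cur

x = 0.4
for n in range(1, 10):
    T_n = C(1.0, x, n, x)            # T_n = C(1, x)_n
    U_n = C(1.0, 2 * x, n, x)        # U_n = C(1, 2x)_n
    U_prev = C(1.0, 2 * x, n - 1, x)
    # C(1,0)_n = 2 T_n - U_n  and  C(0,1)_n = U_{n-1}  (for n >= 1)
    print(abs(C(1.0, 0.0, n, x) - (2 * T_n - U_n)) < 1e-9,
          abs(C(0.0, 1.0, n, x) - U_prev) < 1e-9)
```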

Why is that true? Let's put the U's on one side and the T's on the other: $$ U_n - x U_{n-1} = T_n $$ This is true for $n=1$: $$ U_1 - x U_0 = 2x - x(1) = x = T_1 $$ and for $n=2$: $$ U_2 - x U_1 = (4x^2-1) - x(2x) = 2x^2 - 1 = T_2 $$ and, just in case you are worried about the appearance of $n-3$ in what follows, it is also true for $n=3$: $$ U_3 - x U_2 = \\ (2x(4x^2-1) - 2x) - x(4x^2 - 1) \\ = 4x^3 - 3x \\ = 2x(2x^2-1) - x \\ = T_3 $$

and if we substitute in the recurrence relation: $$ U_n - x U_{n-1} \\ = (2x U_{n-1} - U_{n-2}) - x(2x U_{n-2} - U_{n-3}) \\ = 2x(U_{n-1} - x U_{n-2}) - (U_{n-2} - x U_{n-3}) \\ = 2x T_{n-1} - T_{n-2} \\ = T_n. $$ So no sorcery was involved after all. Note that by re-arranging the previous equation $$ T_n = (2 T_n - U_n) + x U_{n-1}, $$ you get $$ U_n \\ = (2T_n - T_n) + x U_{n-1} \\ = T_n + x U_{n-1} \\ = ((2 T_n - U_n) + x U_{n-1}) + x U_{n-1} \\ = (2 T_n - U_n) + 2x U_{n-1} \\ = (1) C(1,0)_n + 2x C(0,1)_n. $$ I guess in some sense $T_n$ and $U_n$ were considered fundamental to this system, since $U_n = C(0,1)_{n+1}$ and $C(0,1)_0 = 0$; people just dropped the "zero" part of $C(0,1)$.
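The identity $T_n = U_n - x U_{n-1}$ can even be checked exactly (in integer arithmetic, not just numerically) by working with coefficient lists. A sketch of my own, with made-up names:

```python
def cheb_rec(p0, p1, n):
    # n-th term of P_{k+1} = 2x*P_k - P_{k-1}; polynomials as coefficient lists,
    # where index k holds the coefficient of x^k
    def minus(p, q):
        m = max(len(p), len(q))
        p, q = p + [0] * (m - len(p)), q + [0] * (m - len(q))
        return [a - b for a, b in zip(p, q)]
    prev, cur = p0, p1
    for _ in range(n - 1):
        prev, cur = cur, minus([0] + [2 * c for c in cur], prev)
    return p0 if n == 0 else cur

def poly_eq(p, q):                 # equality up to trailing zero coefficients
    m = max(len(p), len(q))
    return p + [0] * (m - len(p)) == q + [0] * (m - len(q))

# exact check of T_n = U_n - x*U_{n-1} for n = 1..8
checks = []
for n in range(1, 9):
    T = cheb_rec([1], [0, 1], n)
    U = cheb_rec([1], [0, 2], n)
    xUprev = [0] + cheb_rec([1], [0, 2], n - 1)       # multiply U_{n-1} by x
    m = max(len(U), len(xUprev))
    diff = [a - b for a, b in zip(U + [0] * (m - len(U)),
                                  xUprev + [0] * (m - len(xUprev)))]
    checks.append(poly_eq(T, diff))
print(checks)   # all True
```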

That's what I have to say. I guess there could be more fundamental stuff to discuss about this, but I'm out for now!

Hope that helps,