According to what rigorous definition of "polynomial" is $\sum_{k=0}^n a_k D^k$ a polynomial, where $D$ is the derivative operator?


Let $Df$ be the derivative of $f\in C^{\infty}$. \begin{align} D^k\colon C^{\infty}&\to C^{\infty}\\ f&\mapsto f^{(k)}=\begin{cases}f&\text{ if }k=0\\ Df^{(k-1)}&\text{ if }1\leq k \end{cases} \end{align} I was told that if $(a_0,\ldots,a_n)\in\mathbf{C}^{n+1}$, $\sum_{k=0}^n a_k D^k$ is a polynomial and can thus be factorized. That is, there is a $(b_1,\ldots,b_n)\in\mathbf{C}^n$ such that \begin{equation} \sum_{k=0}^n a_k D^k = a_n \prod_{i=1}^{n}(D^1 - b_iD^0). \end{equation}
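As a concrete sanity check of the claimed factorization (a sketch, using SymPy and the arbitrarily chosen test function $f(x)=e^{3x}$; any smooth $f$ would do), one can verify that $D^2-3D+2D^0$ and $(D-1\cdot D^0)(D-2\cdot D^0)$ agree, i.e. $b_1=1$, $b_2=2$:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(3 * x)  # hypothetical smooth test function; any f in C^infinity works

D = lambda g: sp.diff(g, x)  # the derivative operator

# (D^2 - 3D + 2D^0) f, i.e. a_2 = 1, a_1 = -3, a_0 = 2
lhs = D(D(f)) - 3 * D(f) + 2 * f
# factored form (D - 1*D^0)(D - 2*D^0) f, i.e. b_1 = 1, b_2 = 2
rhs = D(D(f) - 2 * f) - (D(f) - 2 * f)

assert sp.simplify(lhs - rhs) == 0
```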

According to what rigorous definition of "polynomial" is that true and where can I find more information?

Unfortunately, I didn't find the Wikipedia article helpful and I'm not sure if the definition as sequences in some ring is what I'm looking for, as I have no background in abstract algebra.


3 Answers

BEST ANSWER

As you probably know from school, polynomial functions can be factored, and that result can be used to show that any polynomial (with complex coefficients) splits into linear factors.

To make this precise, let's introduce some notation: for all $n\in\mathbf{N}_0$, define \begin{align} \tau_n\colon\mathbf{C}&\to\mathbf{C}\\ x&\mapsto x^n=\begin{cases}1&\text{if }n=0\\ x^{n-1}\cdot x&\text{else} \end{cases} \end{align} Then \begin{equation} V:=\left\{\sum_{n=0}^{\infty}a_n\tau_n:a_n=0\text{ for almost all }n\right\} \end{equation} is the set of all polynomial functions (it's a vector space and $\{\tau_n:n\in\mathbf{N}_0\}$ is a basis). Using the fundamental theorem of algebra and the division algorithm (Theorem 17.6 in [1] or Theorem 3.46 in [2]), one can show that any polynomial function can be factorized: there is $(b_1,\ldots,b_n)\in\mathbf{C}^n$ such that \begin{equation} \sum_{k=0}^n a_k\tau_k = a_n \prod_{i=1}^{n}(\tau_1 - b_i\tau_0). \end{equation}
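Numerically, the existence of the $b_i$ can be illustrated with NumPy (a sketch, using the example polynomial $p(x)=x^2-3x+2$, chosen here for illustration): `np.roots` finds the $b_i$, and $a_n\prod_i(x-b_i)$ reproduces $p$ at sample points.

```python
import numpy as np

# coefficients of p(x) = x^2 - 3x + 2, highest degree first (numpy's convention)
a = [1.0, -3.0, 2.0]
roots = np.roots(a)  # the b_i guaranteed by the fundamental theorem of algebra

# spot-check that a_n * prod_i (x - b_i) reproduces p(x)
for x in (0.0, 0.5, 2.5):
    p = np.polyval(a, x)
    q = a[0] * np.prod([x - b for b in roots])
    assert abs(p - q) < 1e-9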

To prove \begin{equation} \sum_{k=0}^n a_k X^k = a_n \prod_{i=1}^{n}(X^1 - b_iX^0), \end{equation} we can introduce the linear map \begin{align} \phi\colon V&\to\mathbf{C}[X]\\ \sum_{k=0}^{\infty}c_k\tau_k&\mapsto\sum_{k=0}^{\infty}c_kX^k \end{align} You can easily prove that for all $A,B\in V$, $\phi(A\cdot B)=\phi(A)\cdot\phi(B)$ and therefore \begin{align} \sum_{k=0}^n a_k X^k=\phi\left(\sum_{k=0}^n a_k\tau_k\right)=\phi\left(a_n \prod_{i=1}^{n}(\tau_1-b_i\tau_0)\right)\\=a_n\prod_{i=1}^{n}\phi\left(\tau_1 - b_i\tau_0\right)=a_n \prod_{i=1}^{n}(X^1 - b_iX^0) \end{align}

One further comment: Let $R$ be a ring. \begin{equation} R[X]:=\left\{\sum_{n=0}^{\infty}a_nX^n:a_n=0\text{ for almost all }n\right\} \end{equation} If nothing is known about $X$, we have to define \begin{equation}\tag{1} \sum_{n=0}^{\infty}a_nX^n+\sum_{n=0}^{\infty}b_nX^n=\sum_{n=0}^{\infty}(a_n+b_n)X^n \end{equation} and \begin{equation}\tag{2} \sum_{n=0}^{\infty}a_nX^n\cdot\sum_{n=0}^{\infty}b_nX^n=\sum_{n=0}^{\infty}\left(\sum_{k+l=n}a_kb_l\right)X^n \end{equation} to make $R[X]$ a ring. But if $A$ is an algebra and $X\in A$, you can define $X^0:=1\in A$ and $X^{n+1}:=X^n\cdot X$ and prove $(1)$ and $(2)$.
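Rules $(1)$ and $(2)$ can be sketched directly on coefficient lists (plain Python, with the hypothetical convention that `a[n]` holds the coefficient of $X^n$):

```python
def poly_add(a, b):
    """Coefficient-wise sum, rule (1); a[n] is the coefficient of X^n."""
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def poly_mul(a, b):
    """Cauchy product, rule (2): c_n = sum over k + l = n of a_k * b_l."""
    c = [0] * (len(a) + len(b) - 1)
    for k, ak in enumerate(a):
        for l, bl in enumerate(b):
            c[k + l] += ak * bl
    return c

# (1 + X) + (-1 + X) = 2X  and  (1 + X)(-1 + X) = -1 + X^2
assert poly_add([1, 1], [-1, 1]) == [0, 2]
assert poly_mul([1, 1], [-1, 1]) == [-1, 0, 1]
```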

In the case of my question, $R=\mathbf{C}$, $A=L\left(C^{\infty},C^{\infty}\right)$ is the set of linear maps from $C^{\infty}$ to $C^{\infty}$, $X=D$ and $\sum_{k=0}^na_kD^k\in\mathbf{C}[D]$.

[1] Thomas W. Judson. Abstract Algebra. 2019.

[2] Joseph J. Rotman. A first course in abstract algebra. 3rd edition.

Answer

Let $P = \sum_k a_k X^k \in \mathbb{C}[X]$. The sum $P(D) := \sum_k a_k D^k$ is an endomorphism of the vector space $C^{\infty}$ (see the theory of endomorphism polynomials).

When $E$ is a vector space over a field $\mathbb{K}$, $R, Q \in \mathbb{K}[X]$ are polynomials, and $u$ is an endomorphism of $E$, one has $R(u) \circ Q(u) = (RQ)(u)$.
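This identity is easy to check on a concrete finite-dimensional example (a sketch; the matrix $u$ and the polynomials $R = X-1$, $Q = X-2$ are arbitrary choices for illustration):

```python
import numpy as np

# a hypothetical endomorphism u of K^2, represented as a matrix
u = np.array([[1.0, 2.0], [3.0, 4.0]])
I = np.eye(2)

R_u = u - 1 * I                # R(u) with R = X - 1
Q_u = u - 2 * I                # Q(u) with Q = X - 2
RQ_u = u @ u - 3 * u + 2 * I   # (RQ)(u) with RQ = X^2 - 3X + 2

assert np.allclose(R_u @ Q_u, RQ_u)
```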

Over $\mathbb{C}$, the fundamental theorem of algebra gives us the existence of coefficients $b_1, \dots, b_n$ such that $P = a_n \prod (X - b_k)$.

Thus: $P(D) = a_n \prod_{k=1}^{n} \left( (X - b_k)(D) \right) = a_n \prod_{k=1}^{n} \left( D - b_k\,\mathrm{Id}_{C^{\infty}} \right)$.

Answer

The polynomial ring $\mathbb C[X]$ satisfies the following universal property: If $\iota:\mathbb C\longrightarrow S$ is a commutative ring extension of $\mathbb C$ (so $S$ is commutative and $\iota$ is an injective ring homomorphism) and $s\in S$, then there is exactly one homomorphism $e:\mathbb C[X]\longrightarrow S$ such that $e(X)=s$ and $e(z)=\iota(z)$ for all $z\in\mathbb C$. We call $e$ an evaluation homomorphism, since it evaluates each polynomial at $s$ by plugging in $s$ for $X$. You can find this property on Wikipedia, or in any decent book on abstract algebra; look for the universal property of polynomial rings.

Now here the ring $S$ is the ring of differential operators of the form $\sum_{k=0}^n a_k\mathrm D^k$, and we have $\iota:z\mapsto z\mathrm D^0$. Also, we choose $s=\mathrm D$. We notice that the operator $\sum_{k=0}^n a_k\mathrm D^k$ is exactly the image of the polynomial $\sum_{k=0}^n a_k X^k$ under the evaluation homomorphism. And since the polynomial can be factored, and homomorphisms are multiplicative, its image can be factored in exactly the same way.
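The evaluation homomorphism at $s=\mathrm D$ can be sketched in code (using SymPy; the helper name `evaluate_at_D` and the test function $\sin x$ are illustrative choices, not anything from the answer itself). Factoring $X^2-3X+2=(X-1)(X-2)$ and evaluating both sides at $\mathrm D$ gives the same operator:

```python
import sympy as sp

x = sp.symbols('x')
D = lambda g: sp.diff(g, x)  # the derivative operator

def evaluate_at_D(coeffs, f):
    """Apply the image of sum_k coeffs[k] * X^k under the evaluation
    homomorphism e(X) = D to the function f; coeffs[k] is a_k."""
    result = sp.Integer(0)
    g = f  # g runs through D^k f as k increases
    for a in coeffs:
        result += a * g
        g = D(g)
    return sp.simplify(result)

f = sp.sin(x)  # hypothetical test function
# image of X^2 - 3X + 2 applied to f ...
lhs = evaluate_at_D([2, -3, 1], f)
# ... equals the composition of the images of the factors X - 2 and X - 1
rhs = evaluate_at_D([-1, 1], evaluate_at_D([-2, 1], f))
assert sp.simplify(lhs - rhs) == 0
```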