Let $Df$ be the derivative of $f\in C^{\infty}$. \begin{align} D^k\colon C^{\infty}&\to C^{\infty}\\ f&\mapsto f^{(k)}=\begin{cases}f&\text{ if }k=0\\ Df^{(k-1)}&\text{ if }1\leq k \end{cases} \end{align} I was told that if $(a_0,\ldots,a_n)\in\mathbf{C}^{n+1}$, $\sum_{k=0}^n a_k D^k$ is a polynomial and can thus be factorized. That is, there is a $(b_1,\ldots,b_n)\in\mathbf{C}^n$ such that \begin{equation} \sum_{k=0}^n a_k D^k = a_n \prod_{i=1}^{n}(D^1 - b_iD^0). \end{equation}
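To make the claim concrete, here is a small check in sympy for one case I picked myself ($n=2$, $a_2=1$, $a_1=-3$, $a_0=2$, so the claimed roots are $b_1=1$, $b_2=2$):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)

# Left side: (a0*D^0 + a1*D^1 + a2*D^2) f = f'' - 3 f' + 2 f
lhs = sp.diff(f, x, 2) - 3*sp.diff(f, x) + 2*f

# Right side: apply (D - 2*D^0) first, then (D - 1*D^0)
g = sp.diff(f, x) - 2*f
rhs = sp.diff(g, x) - 1*g

# The two operators agree on an arbitrary smooth f
assert sp.simplify(lhs - rhs) == 0
```

So at least in this example the operator identity holds, but I don't see what general definition of "polynomial" is behind it.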
According to what rigorous definition of "polynomial" is that true and where can I find more information?
Unfortunately, I didn't find the Wikipedia article helpful and I'm not sure if the definition as sequences in some ring is what I'm looking for, as I have no background in abstract algebra.
As you probably know from school, polynomial functions can be factored, and this can be made precise: any polynomial function with complex coefficients splits into a product of linear factors.
To make this precise, let's introduce some notation: For all $n\in\mathbf{N}_0$, define \begin{align} \tau_n\colon\mathbf{C}&\to\mathbf{C}\\ x&\mapsto x^n=\begin{cases}1&\text{if }n=0\\ x^{n-1}\cdot x&\text{if }n\geq 1 \end{cases} \end{align} Then \begin{equation} V:=\left\{\sum_{n=0}^{\infty}a_n\tau_n:a_n=0\text{ for almost all }n\right\} \end{equation} is the set of all polynomial functions (it's a vector space and $\{\tau_n:n\in\mathbf{N}_0\}$ is a basis). Using the fundamental theorem of algebra and the division algorithm (Theorem 17.6 in [1] or Theorem 3.46 in [2]), one can show that any polynomial function can be factored; that is, if $a_n\neq 0$, there is $(b_1,\ldots,b_n)\in\mathbf{C}^n$ such that \begin{equation} \sum_{k=0}^n a_k\tau_k = a_n \prod_{i=1}^{n}(\tau_1 - b_i\tau_0). \end{equation}
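As a quick sanity check, the factorization can be computed with sympy (the polynomial $2x^2-6x+4$, with $a_n=2$ and roots $b_1=1$, $b_2=2$, is my own example):

```python
import sympy as sp

x = sp.symbols('x')

# Example polynomial: 2*x**2 - 6*x + 4 = 2*(x - 1)*(x - 2)
p = 2*x**2 - 6*x + 4
factored = sp.factor(p)
print(factored)  # the leading coefficient 2 times the linear factors (x - 1) and (x - 2)

# Expanding the factorization recovers the original polynomial
assert sp.expand(factored) == p
```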
To prove \begin{equation} \sum_{k=0}^n a_k X^k = a_n \prod_{i=1}^{n}(X^1 - b_iX^0), \end{equation} we can introduce the linear map \begin{align} \phi\colon V&\to\mathbf{C}[X]\\ \sum_{k=0}^{\infty}c_k\tau_k&\mapsto\sum_{k=0}^{\infty}c_kX^k \end{align} You can easily check that for all $A,B\in V$, $\phi(A\cdot B)=\phi(A)\cdot\phi(B)$, and therefore \begin{align} \sum_{k=0}^n a_k X^k=\phi\left(\sum_{k=0}^n a_k\tau_k\right)&=\phi\left(a_n \prod_{i=1}^{n}(\tau_1-b_i\tau_0)\right)\\&=a_n\prod_{i=1}^{n}\phi\left(\tau_1 - b_i\tau_0\right)=a_n \prod_{i=1}^{n}(X^1 - b_iX^0). \end{align}
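The multiplicativity $\phi(A\cdot B)=\phi(A)\cdot\phi(B)$ can also be checked numerically, as a minimal sketch (the helper names and sample coefficients are my own): multiplying two polynomial functions pointwise agrees with the function built from the convolved coefficient lists.

```python
# Coefficient list (index = degree) -> polynomial function in V
def as_function(coeffs):
    return lambda x: sum(c * x**n for n, c in enumerate(coeffs))

# Cauchy product of coefficient lists (the multiplication in C[X])
def convolve(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for k, ak in enumerate(a):
        for l, bl in enumerate(b):
            out[k + l] += ak * bl
    return out

a, b = [1, 2], [3, 0, 1]          # my examples: 1 + 2x and 3 + x^2
f, g = as_function(a), as_function(b)
h = as_function(convolve(a, b))   # function built from the product's coefficients

# Pointwise product of the functions equals the function of the convolved coefficients
for t in [0.0, 1.5, -2.0]:
    assert abs(f(t) * g(t) - h(t)) < 1e-9
```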
One further comment: Let $R$ be a ring and define \begin{equation} R[X]:=\left\{\sum_{n=0}^{\infty}a_nX^n:a_n=0\text{ for almost all }n\right\}. \end{equation} If $X$ is just a formal symbol, we have to define \begin{equation}\tag{1} \sum_{n=0}^{\infty}a_nX^n+\sum_{n=0}^{\infty}b_nX^n=\sum_{n=0}^{\infty}(a_n+b_n)X^n \end{equation} and \begin{equation}\tag{2} \sum_{n=0}^{\infty}a_nX^n\cdot\sum_{n=0}^{\infty}b_nX^n=\sum_{n=0}^{\infty}\left(\sum_{k+l=n}a_kb_l\right)X^n \end{equation} to make $R[X]$ a ring. But if $A$ is an algebra and $X\in A$, you can define $X^0:=1\in A$ and $X^{n+1}:=X^n\cdot X$, and then $(1)$ and $(2)$ can be proved instead of defined.
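To make the formal definitions $(1)$ and $(2)$ concrete, here is a sketch over the ring $R=\mathbf{Z}/5\mathbf{Z}$ (the choice of $R$ and all names are mine), with coefficient lists standing in for the formal sums:

```python
P = 5  # work over R = Z/5Z

# (1): coefficient-wise addition
def poly_add(a, b):
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [(x + y) % P for x, y in zip(a, b)]

# (2): Cauchy product sum_{k+l=n} a_k * b_l
def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for k, ak in enumerate(a):
        for l, bl in enumerate(b):
            out[k + l] = (out[k + l] + ak * bl) % P
    return out

# (X + 1)(X + 4) = X^2 + 5X + 4 = X^2 + 4 in (Z/5Z)[X]
print(poly_mul([1, 1], [4, 1]))  # [4, 0, 1]
```

The same two functions work over any (commutative) coefficient ring; only the reduction mod `P` is specific to this choice of $R$.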
[1] Thomas W. Judson. Abstract Algebra: Theory and Applications. 2019.
[2] Joseph J. Rotman. A First Course in Abstract Algebra. 3rd edition.