How many functions does it take to solve polynomial equations?


It is well known that polynomial equations of degree $\ge 5$ cannot in general be solved using only addition, subtraction, multiplication, division and radicals (the Abel–Ruffini theorem). So what, who cares? Let's just add extra functions until we can solve them! For example, it turns out that polynomial equations of degree 5 can be solved by adding the Bring radical to our arsenal.
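To make the Bring radical concrete, here is a minimal numerical sketch (my own illustration, not part of the question): take $\operatorname{BR}(a)$ to be the unique real root of $x^5 + x + a = 0$, which exists and is unique because $x \mapsto x^5 + x$ is strictly increasing. Simple bisection suffices:

```python
def bring_radical(a, tol=1e-12):
    """Unique real root of x^5 + x + a = 0, found by bisection."""
    f = lambda x: x**5 + x + a
    lo, hi = -1.0, 1.0
    # Expand the bracket until f(lo) <= 0 <= f(hi); f is strictly increasing.
    while f(lo) > 0:
        lo *= 2
    while f(hi) < 0:
        hi *= 2
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# x^5 + x + 2 = 0 has the exact real root x = -1:
r = bring_radical(2.0)
print(r)  # ~ -1.0
```

Once a general quintic has been reduced to the Bring–Jerrard form $x^5 + x + a = 0$ (a Tschirnhaus transformation, which itself requires only radicals), a root is literally $\operatorname{BR}(a)$.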

Let $k_n$ be the minimum number of univariate, continuous functions that need to be added to +,-,*,/ in order to solve polynomial equations of degree $n$.

We know that $k_0=0$, $k_1 = 0$, $k_2=1$, $k_3=2$, $k_4=2$, $k_5=4$ (not counting $\sqrt[4]{\cdot}$ separately, since $\sqrt[4]{\cdot} = \sqrt{\sqrt{\cdot}}$).
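As a sanity check on the smallest nontrivial value (my own illustration): $k_2 = 1$ because a single added univariate function, $\sqrt{\cdot}$, together with $+,-,\times,/$ solves every monic quadratic via the usual formula:

```python
import math

def solve_quadratic(b, c):
    """Roots of x^2 + b*x + c = 0, using only field operations
    and one added univariate function: sqrt."""
    d = math.sqrt(b * b - 4 * c)  # the single extra function
    return (-b + d) / 2, (-b - d) / 2

r1, r2 = solve_quadratic(-3, 2)  # x^2 - 3x + 2 = (x - 1)(x - 2)
print(r1, r2)  # 2.0 1.0
```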

Q: Is there anything of substance known about the sequence $k_n$ and the associated functions?

I suppose that, by Sprecher's variant of the Kolmogorov–Arnold representation theorem, we have the upper bound $k_n\le 2n$ (given a monic polynomial of degree $n$ with coefficients $a_0,\ldots,a_{n-1}$, define $\lambda_m(a_0,\ldots,a_{n-1})$ to be its $m$-th root and apply the theorem).
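For reference, the classical Kolmogorov–Arnold statement (Sprecher's variant sharpens the inner functions to scaled translates of a single $\phi$) is that every continuous $f\colon[0,1]^n\to\mathbb{R}$ can be written as

```latex
f(x_1,\ldots,x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left(\sum_{p=1}^{n} \phi_{q,p}(x_p)\right)
```

with continuous univariate $\Phi_q$ and $\phi_{q,p}$, so each root function $\lambda_m$ is expressible from finitely many univariate continuous functions plus addition.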

So, let's consider a family ${\mathcal F = \{F_n\mid n\in \mathbb N\}}$ of sets of continuous functions ${F_n\subset C(\mathbb R)}$. Call $\mathcal F$ a minimal family iff:

  1. $F_n$ allows us to solve all polynomial equations of degree $n$.
  2. $|F_n| = k_n$ for all $n$.

Q: Do there exist minimal families such that $F_0\subseteq F_1\subseteq F_2 \subseteq F_3 \subseteq \ldots$?