Fields with differential identities, and Amitsur-like results


Let $(K,\partial)$ be an algebraically closed field of characteristic $0$ with $\partial:K\rightarrow K$ a derivation with algebraically closed field of constants $C$.

Question. Is it possible for $K$ to satisfy a non-trivial one-variable differential identity (involving $+$, $\times$, and $\partial$)?

Note that the answer is no for linear identities, i.e. those of the form $$\sum_{i=0}^n a_i\partial^i(x)=0,$$ since the zero set of such an identity has finite dimension over $C$.
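As a quick sanity check of this linear case (my own sketch in SymPy, not part of the post, taking the identity $\partial^2(x)-x=0$ over functions of $t$ with $\partial=d/dt$): the zero set is exactly the $2$-dimensional space spanned by $e^t$ and $e^{-t}$ over the constants, while a function outside that space fails the identity.

```python
# Sketch (my own illustration): the zero set of the linear differential
# identity d^2(x) - x = 0 is a 2-dimensional space over the constants,
# spanned by exp(t) and exp(-t).
import sympy as sp

t, C1, C2 = sp.symbols('t C1 C2')

# General element of the 2-dimensional solution space over the constants.
x = C1 * sp.exp(t) + C2 * sp.exp(-t)

# The identity vanishes on the whole space ...
residual = sp.simplify(sp.diff(x, t, 2) - x)

# ... but not on a function outside it, e.g. t itself.
counterexample = sp.simplify(sp.diff(t, t, 2) - t)
```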

Attempts. (1) The question seems close to results like "rings with a non-trivial polynomial identity have finite dimension over their centre". One attempted proof is to consider the skew law $x*y:=\partial(x)y$ and apply Amitsur's result to the noncommutative ring $(K,+,*)$, but $*$ is neither associative nor alternative...

(2) If $K$ satisfies a non-trivial univariate differential identity, then it also satisfies a non-trivial multivariate differential identity that is linear in every variable.

There are 2 answers below.

Answer 1.

Let $(R,\partial)$ be a differential domain of characteristic $0$ (i.e. a domain equipped with a derivation $\partial:R\rightarrow R$ satisfying $\partial(a+b)=\partial a+\partial b$ and $\partial(ab)=(\partial a)b+a\partial b$). Let $\bar x=(x_1,\dots,x_n)$. Let $R\{\bar x\}$ be the set of differential polynomials in the variables $\bar x$, that is, the set of polynomials in the variables $\{\partial^j(x_i):j\in\mathbf N,\ i=1,\dots,n\}$, $$R\{\bar x\}=R\left[ \partial^j(x_i):j\in\mathbf N,\ i=1,\dots,n\right].$$ We call the order of a differential polynomial the maximum number of factors involving the same $x_i$ that appear multiplied together in one of its monomials; for instance, $x_1x_2\partial^3 x_2 +x_1x_2x_3$ has order $2$. By definition, a differential polynomial is zero if all the coefficients in $R$ of its monomials are zero.

Let $R\{\bar x,1\}$ be the set $$R\{\bar x,1\}=\left\{\displaystyle\sum_{0\leqslant i_1,\dots,i_n\leqslant m} r_{i_1,\dots,i_n}\partial^{i_1}(x_1)\cdots\partial^{i_n}(x_n):r_{\bar i}\in R,\ m\in\mathbb N\right\}$$ of those differential polynomials that are $n$-linear, i.e. of order $1$. In particular, one has $$R\{x,1\}=\left\{\displaystyle\sum_{i=0}^m r_{i}\partial^{i}(x):\bar r\in R^{m+1},\ m\in\mathbb N\right\},$$ which, together with $+$ and composition $\circ$, is a domain. We call the degree of $\delta\in R\{x,1\}$ the maximum $n$ such that $\partial^n$ appears in $\delta$ with nonzero coefficient. For a differential polynomial $\delta$, we write $\delta_R$ for the induced map on $R$.

Fact. If $R$ is a (possibly skew) field, then $(R\{x,1\},+,\circ)$ is left Euclidean.
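A concrete toy model of $(K\{x,1\},+,\circ)$, assuming $K=\mathbb Q(t)$ with $\partial=d/dt$ (my own sketch, not the poster's code; the names `comp` and `divmod_right` are mine). An element $\sum_i r_i\partial^i$ is stored as its coefficient list; `comp` is composition and `divmod_right` performs Euclidean division with the divisor composed on the right, which is the form used in the Lemma's proof:

```python
# Toy model of (K{x,1}, +, o) for K = Q(t), with d = d/dt.
# A differential polynomial sum_i r_i d^i is the list [r_0, r_1, ...].
import sympy as sp

t = sp.symbols('t')

def der(f, k=1):
    """k-th derivative of f in Q(t)."""
    return sp.diff(f, t, k)

def comp(A, B):
    """Composition A o B, via d^i(b*g) = sum_k C(i,k) d^k(b) d^(i-k)(g)."""
    res = [sp.Integer(0)] * (len(A) + len(B) - 1)
    for i, a in enumerate(A):
        for j, b in enumerate(B):
            for k in range(i + 1):
                res[i + j - k] += sp.binomial(i, k) * a * der(b, k)
    return [sp.simplify(c) for c in res]

def apply_to(A, f):
    """The induced map A_K evaluated at f."""
    return sp.simplify(sum(a * der(f, i) for i, a in enumerate(A)))

def divmod_right(A, D):
    """Euclidean division A = Q o D + R with deg R < deg D."""
    A, Q = list(A), [sp.Integer(0)] * max(len(A) - len(D) + 1, 1)
    while True:
        while A and sp.simplify(A[-1]) == 0:   # strip zero leading coeffs
            A.pop()
        if len(A) < len(D):
            return Q, A
        shift = len(A) - len(D)
        q = sp.simplify(A[-1] / D[-1])
        Q[shift] += q
        P = comp([sp.Integer(0)] * shift + [q], D)
        A = [sp.simplify(x - p) for x, p in zip(A, P)]

# As in the Lemma's proof: a = t^2 is a root of D = d - (da/a) id ...
a = t**2
D = [-der(a) / a, sp.Integer(1)]
# ... so delta = gamma o D also has a as a root, and division recovers gamma.
gamma = [sp.Integer(1), sp.Integer(1)]          # the operator d + 1
delta = comp(gamma, D)
Q, R = divmod_right(delta, D)
```

Dividing `delta` by `D` returns quotient `gamma` and zero remainder, and `apply_to(delta, a)` vanishes, matching the factorisation used in the Lemma.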

Corollary. If $R$ is a left-Ore domain, then $(R\{x,1\},+,\circ)$ is a left-Ore domain too.

Proof. If $R$ is left-Ore, then its ring of left fractions $R^{-1}R$ is a skew field, hence $(R^{-1}R)\{x,1\}$ is left Euclidean, hence left-Ore, so $R\{x,1\}$ is left-Ore.

Recall (?) that in a left-Ore domain $R$ (in particular, in a commutative domain), there is a well-behaved notion of dimension for $R$-modules (defined as the maximal cardinality of an $R$-independent family), for which the rank-nullity theorem holds. In particular, if $f,g:M\rightarrow M$ are endomorphisms of an $R$-module $M$, then $\dim\ker(f\circ g)\leqslant \dim\ker f+\dim\ker g$.
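The kernel inequality can be checked on a small example (my own, not from the answer), with $f,g$ realised as matrices over $\mathbb Q$ and $\circ$ as matrix multiplication:

```python
# Sanity check (my own example) of dim ker(f o g) <= dim ker f + dim ker g.
import sympy as sp

f = sp.Matrix([[1, 0], [0, 0]])   # projection, 1-dimensional kernel
g = sp.Matrix([[0, 0], [0, 1]])   # projection, 1-dimensional kernel
fg = f * g                        # here the zero map: kernel is all of Q^2

dim_ker = lambda M: len(M.nullspace())
```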

Lemma. Let $(R,+,\times,\circ,\partial)$ be such that $\times$ and $\circ$ are two multiplicative laws, $(R,+,\times,\partial)$ is a differential domain with subdomain of constants $C=\{x\in R:\partial x=0\}$, and $(R,+,\circ)$ is a left-Ore domain. Let $\delta\in R\{x,1\}\setminus\{0\}$ be of degree $n$. Then the zero set of $\delta_R$ in $R$ is a $C$-module of dimension at most $n$.

Proof. By induction on $n$. If $n=0$, then $\delta=r\in R\setminus\{0\}$, which has no roots. Assume $\deg \delta=n+1$ and that $\delta$ has at least one nonzero root $a\in R$ (otherwise there is nothing to prove). Applying Euclidean division in $(\operatorname{Frac}(R)\{x,1\},+,\circ)$, one has $\delta=\gamma\circ(\partial-\partial(a)a^{-1}\,\mathrm{id})$ for some $\gamma\in \operatorname{Frac}(R)\{x,1\}$ of degree $n$. Then $r\gamma=s$ for some nonzero $r\in R$ and $s\in R\{x,1\}$ of degree $n$, and one concludes by induction, since $\dim\ker \delta_R$ cannot exceed $\dim\ker s_R+\dim\ker(\partial-\partial(a)a^{-1}\,\mathrm{id})$.

Corollary. Let $K$ be a $\partial$-field with subfield of constants $C$ such that $[K:C]$ is infinite, and let $\delta\in K\{x_1,\dots,x_n,1\}\setminus \{0\}$ be an $n$-linear differential polynomial. Then $\delta$ does not vanish identically on $K^n$.

Proof. By induction on $n$. For $n=1$, this is the previous Lemma. For the induction step, one can view $\delta$ as a linear differential polynomial in $x_n$ with coefficients in $K\{x_1,\dots,x_{n-1},1\}$, which is a $\partial$-domain with subfield of constants $C$ (this is where characteristic $0$ is used, for, as @Marc Paul noticed, $\partial(x^p)=0$ in characteristic $p$). By the Lemma, the $x_n$-roots of $\delta$ form a $C$-vector space of finite dimension. In particular, since $[K:C]$ is infinite, there is $k\in K$ such that $\delta(x_1,\dots,x_{n-1},k)$ is not the zero differential polynomial, so $\delta(\bar a,k)\neq 0$ for some $\bar a\in K^{n-1}$ by the induction hypothesis.

Corollary. Let $K$ be a $\partial$-field with subfield of constants $C$ such that $[K:C]$ is infinite, and let $\delta\in K\{x\}\setminus\{0\}$ be any nonzero differential polynomial. Then $\delta$ does not vanish identically on $K$.

Proof. If $\delta$ vanishes on $K$, then it is not linear, by the previous Corollary, hence $\delta(x+y)-\delta(x)-\delta(y)$ is nonzero and vanishes on $K^2$. Iterating this polarization process, we end up with a nonzero $n$-linear differential polynomial that vanishes on $K^n$, contradicting the previous Corollary.
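The polarization step can be made concrete (my own illustration in SymPy, taking $\delta(x)=x\,\partial(x)$, which has order $2$): $\delta(x+y)-\delta(x)-\delta(y)=x\,\partial(y)+y\,\partial(x)$ is bilinear and nonzero.

```python
# Polarization (my own example): delta(x) = x * dx is not linear, but
# delta(x+y) - delta(x) - delta(y) = x*dy + y*dx is bilinear and nonzero.
import sympy as sp

t = sp.symbols('t')
d = lambda f: sp.diff(f, t)
delta = lambda f: f * d(f)
bilin = lambda f, g: sp.simplify(delta(f + g) - delta(f) - delta(g))

f, g = t, t**2
value = bilin(f, g)          # equals f*dg + g*df, i.e. d(f*g)
```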

Answer 2.

In Theorem 1 of https://arxiv.org/abs/2209.13662 , we prove that, over any field of characteristic zero, if a commutative associative algebra with a derivation satisfies a nontrivial differential identity, then it satisfies an identity of the form $\partial(x_1)\cdots\partial(x_m)=0$ for some $m$. Thus, if your algebra satisfying a differential identity is a domain (e.g. a field), the only possibility is that $\partial$ is identically zero.
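For contrast, a minimal non-domain example (my own, not taken from the paper) where a nonzero derivation does satisfy such an identity: $R=\mathbb Q[\varepsilon]/(\varepsilon^2)$ with $\partial(a+b\varepsilon)=b\varepsilon$. Every product of two derivatives lands in $(\varepsilon^2)=0$, so $\partial(x)\partial(y)=0$ holds identically.

```python
# R = Q[e]/(e^2), elements a + b*e stored as pairs (a, b), with the
# derivation d(a + b*e) = b*e.  Products of derivatives are always zero.
def mul(x, y):        # (a + b e)(c + d e) with e^2 = 0
    return (x[0] * y[0], x[0] * y[1] + x[1] * y[0])

def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def der(x):           # d(a + b e) = b e
    return (0, x[1])

# Leibniz rule on a sample pair, and the identity d(x) d(y) = 0:
x, y = (2, 3), (5, -1)
lhs = der(mul(x, y))                          # d(xy)
rhs = add(mul(der(x), y), mul(x, der(y)))     # d(x)y + x d(y)
prod_of_ders = mul(der(x), der(y))            # always (0, 0)
```

Since $R$ has zero divisors, this does not contradict the theorem's conclusion for domains.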