What is minimal polynomial over a skew field?


I am reading "Polynomial extensions of skew fields" by J. Treur (see here.) What does he mean by "minimal polynomial $p$ of $\theta$ over skew field $K$"? Can we define the minimal polynomial if $K$ is not commutative? If yes, what are its properties? Is it still true that if $g$ is a polynomial over $K$ such that $g(\theta)=0$, then $p$ divides $g$?

Also, what is a left or right basis $B$ for $V$? My guess is that it is a set that is left (resp. right) linearly independent, i.e., $\sum_{j=0}^n a_jb_j=0$ with $b_j\in B$ (resp. the analogous sum with coefficients on the right) only if $a_j=0$ for all $j$, and such that every element $x\in V$ can be written as $$x=\sum_{j=0}^n a_jb_j$$ (resp. with coefficients on the right).

Best answer:

I think it is necessary (though not sufficient) to be acquainted with linear algebra over skew fields (also known as division algebras) in order to understand what this paper is saying. If $K$ is a skew field, much of basic linear algebra can be applied to define vector spaces over$~K$, provided one pays more attention than usual to order of operands, and distinguishes left $K$-vector spaces and right vector-$K$ spaces (putting $K$ to the right of the word vector is my invention; don't expect to find this in the literature).

In a left $K$-vector space $V$ one has, apart from the usual addition of vectors, a scalar multiplication $K\times V\to V$ which, with scalars written to the left of vectors, satisfies an "associative" law $\lambda(\mu v)=(\lambda\mu)v$, where the Greek letters are in $K$ and the Roman ones in $V$. An additive map $f:V\to W$ between left $K$-vector spaces is $K$-linear if $f(\lambda v)=\lambda f(v)$ for all $\lambda\in K$, $v\in V$; no sweat so far.

What is a bit troublesome is that scalar multiplication itself does not give a $K$-linear map $V\to V$ (unless the multiplication is by a scalar from the centre of$~K$). This is true even for $K=K^1$, or more generally $K^n$, viewed as left $K$-vector spaces in the natural way (scalar multiplication being left-multiplication on each entry). By contrast, right-multiplication on all entries does define a $K$-linear map (since multiplication in $K$ is associative even if it is not commutative), and this means that $K$-linear maps $K^n\to K^m$ can only be given by right-multiplication by matrices, if one wishes to preserve the usual rules of matrix multiplication without operand-swapping (and one's sanity). Also one is obliged to represent elements of $K^n$ as row vectors to make this work, as a small example should convince you: for the $K$-linear map $f:K^2\to K^3$ with $f((1,0))=(a,b,c)$ and $f((0,1))=(d,e,f)$ one has from the definition of $K$-linearity that $f((x,y))=(xa+yd,xb+ye,xc+yf)$, which corresponds to the vector-matrix multiplication $$ \pmatrix{x&y}\pmatrix{a&b&c\cr d&e&f}=\pmatrix{xa+yd&xb+ye&xc+yf}.$$

For right vector-$K$ spaces one has a scalar multiplication $V\times K\to V$ satisfying $(v\lambda)\mu=v(\lambda\mu)$, linear-$K$ maps satisfy $f(v\lambda)=f(v)\lambda$, the set $K^n$ is viewed as column vectors, and left-multiplication by an $m\times n$ matrix gives a linear-$K$ map $K^n\to K^m$ (order of $m,n$ switched), as usual.
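This left/right asymmetry can be checked concretely in the quaternions $\Bbb H$, the standard example of a skew field. The sketch below (the `Quat` class and variable names are my own illustration, not from the paper) shows that left multiplication by a non-central scalar fails $K$-linearity, while right multiplication satisfies it by associativity alone:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quat:
    """A quaternion a + b*i + c*j + d*k; H is the standard example of a skew field."""
    a: float
    b: float
    c: float
    d: float

    def __mul__(s, o):  # Hamilton product (non-commutative)
        return Quat(s.a*o.a - s.b*o.b - s.c*o.c - s.d*o.d,
                    s.a*o.b + s.b*o.a + s.c*o.d - s.d*o.c,
                    s.a*o.c - s.b*o.d + s.c*o.a + s.d*o.b,
                    s.a*o.d + s.b*o.c - s.c*o.b + s.d*o.a)

one = Quat(1, 0, 0, 0)
i, j, k = Quat(0, 1, 0, 0), Quat(0, 0, 1, 0), Quat(0, 0, 0, 1)

# View H itself as a left H-vector space (the case K^1).
# Left multiplication by the non-central scalar rho = i is NOT H-linear:
rho, lam, v = i, j, one
left_mult = lambda w: rho * w
assert left_mult(lam * v) != lam * left_mult(v)    # ij = k, but ji = -k

# Right multiplication by rho IS H-linear, by associativity alone:
right_mult = lambda w: w * rho
assert right_mult(lam * v) == lam * right_mult(v)
```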

It is important to note that given a left $K$-vector space structure on $V$ one cannot in general derive a right vector-$K$ space structure on $V$ from it, and that although $K^n$ has both a natural left $K$-vector space structure and a natural right vector-$K$ space structure, the two are different in an essential way.

When, as in the paper, one considers an extension $L/K$ of skew fields, then $L$ has both the structure of a left $K$-vector space and that of a right vector-$K$ space, but again the two differ. Therefore, to employ the language of linear algebra (bases, linear relations) one has to specify which of the two structures one is referring to; I assume (or hope) that the prefixes left- and right- systematically match the structure being referred to. Then $\theta\in L$ generates $L$ as a left-polynomial extension of $K$ if for some $n$ the first $n$ powers $1,\theta,\dots,\theta^{n-1}$ form a basis of $L$ as a left $K$-vector space. Expressing $\theta^n$ on this basis, $\theta^n=\lambda_0+\lambda_1\theta+\cdots+\lambda_{n-1}\theta^{n-1}$, one gets the (monic) left-minimal polynomial $X^n-\lambda_{n-1}X^{n-1}-\cdots-\lambda_1X-\lambda_0$ of $\theta$ over $K$.

Note that the polynomial ring $K[X]$ can be defined as usual (no operand-swapping), which makes $X$ and its powers central elements, and with coefficients written to the left of powers of $X$, it has a natural left $K$-vector space structure. However, substitution of a non-central element $a\in L$ for $X$ defines a map $K[X]\to L$ that is $K$-linear (by definition it is the $K$-linear map sending the basis elements $X^n$ of $K[X]$ to $a^n\in L$), which I will denote $P\mapsto P[a]$; but this map does not respect multiplication ($(PQ)[a]$ and $P[a]Q[a]$ differ in general), and therefore is not a morphism of rings (as it maps the central element $X$ to the non-central $a$, it cannot be one).
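One can witness this failure of multiplicativity numerically. In the sketch below (my own illustration, taking $K=L=\Bbb H$ and representing polynomials as lists of left coefficients), substituting the non-central element $a=i$ already gives $(PQ)[a]\neq P[a]Q[a]$ for $P=X$ and the constant polynomial $Q=j$:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quat:
    a: float
    b: float
    c: float
    d: float

    def __add__(s, o):
        return Quat(s.a+o.a, s.b+o.b, s.c+o.c, s.d+o.d)

    def __mul__(s, o):  # Hamilton product (non-commutative)
        return Quat(s.a*o.a - s.b*o.b - s.c*o.c - s.d*o.d,
                    s.a*o.b + s.b*o.a + s.c*o.d - s.d*o.c,
                    s.a*o.c - s.b*o.d + s.c*o.a + s.d*o.b,
                    s.a*o.d + s.b*o.c - s.c*o.b + s.d*o.a)

zero, one = Quat(0, 0, 0, 0), Quat(1, 0, 0, 0)
i, j = Quat(0, 1, 0, 0), Quat(0, 0, 1, 0)

def polymul(P, Q):
    """Product in K[X]; P[m] is the coefficient of X^m.  X is central,
    so coefficients multiply in the fixed order p*q, never q*p."""
    out = [zero] * (len(P) + len(Q) - 1)
    for m, p in enumerate(P):
        for n, q in enumerate(Q):
            out[m+n] = out[m+n] + p * q
    return out

def subst(P, a):
    """P[a] = sum_m P[m] * a**m -- left K-linear in P, but not multiplicative."""
    acc, power = zero, one
    for coeff in P:
        acc = acc + coeff * power
        power = power * a
    return acc

P = [zero, one]   # P = X
Q = [j]           # Q = j (constant polynomial)
a = i             # a non-central element

# PQ = jX because X is central, so (PQ)[a] = j*i = -k,
# while P[a]*Q[a] = i*j = +k: substitution at a non-central
# element is not a ring morphism.
assert subst(polymul(P, Q), a) != subst(P, a) * subst(Q, a)
```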

Now to answer your questions. Suppose $P\in K[X]$ is any annihilating polynomial for$~\theta$: one has $P[\theta]=0$ in $L$. Then right-multiplying that relation by $\theta^k$ gives that $(X^kP)[\theta]=0$ for any $k\in\Bbb N$ (remember that $X$ is central in $K[X]$), and taking (left) $K$-linear combinations one finds that $(QP)[\theta]=0$ for all $Q\in K[X]$ (note that this works only for left-multiples of $P$). Also in $K[X]$ one can do Euclidean right-division by (for instance) the minimal polynomial$~\mu$: for any $P\in K[X]$ there are $Q,R$ in $K[X]$ with $\deg(R)<\deg(\mu)$ such that $P=Q\mu+R$. Then since by the above $(Q\mu)[\theta]=0$, one concludes that $P[\theta]=0$ implies $R[\theta]=0$, which by degree and minimality of $\mu$ implies $R=0$, so $P=Q\mu$. So indeed, any annihilating polynomial of $\theta$ is a left-multiple of its minimal polynomial, or equivalently $\mu$ right-divides $P$. But these statements need not hold with left and right interchanged.
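The Euclidean right-division used here works exactly as in the commutative case, provided the quotient is kept on the left of $\mu$ and $\mu$ is monic. A sketch with quaternion coefficients (the function names and test polynomials are my own illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quat:
    a: float
    b: float
    c: float
    d: float

    def __add__(s, o):
        return Quat(s.a+o.a, s.b+o.b, s.c+o.c, s.d+o.d)

    def __sub__(s, o):
        return Quat(s.a-o.a, s.b-o.b, s.c-o.c, s.d-o.d)

    def __mul__(s, o):  # Hamilton product (non-commutative)
        return Quat(s.a*o.a - s.b*o.b - s.c*o.c - s.d*o.d,
                    s.a*o.b + s.b*o.a + s.c*o.d - s.d*o.c,
                    s.a*o.c - s.b*o.d + s.c*o.a + s.d*o.b,
                    s.a*o.d + s.b*o.c - s.c*o.b + s.d*o.a)

zero, one = Quat(0, 0, 0, 0), Quat(1, 0, 0, 0)
j, k = Quat(0, 0, 1, 0), Quat(0, 0, 0, 1)

def polymul(P, Q):
    """Product in K[X] with X central; coefficients multiply in the order p*q."""
    out = [zero] * (len(P) + len(Q) - 1)
    for m, p in enumerate(P):
        for n, q in enumerate(Q):
            out[m+n] = out[m+n] + p * q
    return out

def right_divmod(P, mu):
    """Right-division P = Q*mu + R with deg R < deg mu; requires mu monic."""
    R = list(P)
    Q = [zero] * max(len(P) - len(mu) + 1, 1)
    for d in range(len(P) - len(mu), -1, -1):
        q = R[d + len(mu) - 1]      # coefficient to cancel; no inverse needed, mu is monic
        Q[d] = q
        for t, c in enumerate(mu):
            R[d + t] = R[d + t] - q * c
    return Q, R[:len(mu) - 1]

P  = [k, j, zero, one]            # P  = X^3 + jX + k
mu = [one, zero, one]             # mu = X^2 + 1 (monic)
Qpart, R = right_divmod(P, mu)

# Verify P = Q*mu + R coefficient by coefficient:
recon = polymul(Qpart, mu)
recon = [recon[t] + (R[t] if t < len(R) else zero) for t in range(len(recon))]
assert recon == P
```

Monicity is what makes this work without choosing left or right inverses: cancelling the leading coefficient only uses $q\cdot 1=q$.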

Your final paragraph seems correct.