If a subring $S$ of $R=M_2(\Bbb Q)$ is an integral domain and contains the center of $R$ then $S$ must be a field


Consider the ring $R=M_2(\Bbb Q)$ of $2\times 2$ matrices with entries in $\Bbb Q$. Suppose $S$ is a subring of $R$ containing the center of $R$, and suppose $S$ is an integral domain. I am trying to show that $S$ must be a field.

It is well-known that the center $Z$ of $R$ consists of scalar matrices, so $Z \cong \Bbb Q$. Thus if I can show that I must have $S=Z$, then I will be done, but I can't see whether this is true. Any hints? Thanks in advance.


There are 3 answers below.

Best answer:

If $S=Z(M_2(\Bbb Q))=\Bbb Q$ there is nothing to prove.

So assume that there exists $x\in S\setminus\Bbb Q$. Since $x$ is a $2\times2$ matrix, the Cayley–Hamilton theorem gives a relation $$ x^2+ax+b=0,\qquad a,b\in\Bbb Q, $$ where $a=-\operatorname{tr}(x)$ and $b=\det(x)$. Note that we must have $b=\det(x)\neq0$, or else $x(x+a)=0$, making $x$ a zero-divisor while $S$ is a domain by hypothesis. Thus $$ \frac1x=-\frac ab-\frac1bx\in S, $$ i.e. $S$ contains the quadratic field $F=\Bbb Q(x)$.
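As a sanity check, both displayed identities can be verified for a concrete non-scalar matrix. This is a sketch in exact rational arithmetic; the matrix $x$ below is an arbitrary illustrative choice, not part of the argument.

```python
from fractions import Fraction

def mul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(m, n):
    return [[m[i][j] + n[i][j] for j in range(2)] for i in range(2)]

def scal(c, m):
    return [[c * m[i][j] for j in range(2)] for i in range(2)]

I = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]
x = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]  # illustrative non-scalar x

a = -(x[0][0] + x[1][1])                   # a = -tr(x)
b = x[0][0] * x[1][1] - x[0][1] * x[1][0]  # b = det(x)

# Cayley-Hamilton: x^2 + a x + b I = 0
ch = add(add(mul(x, x), scal(a, x)), scal(b, I))
assert ch == [[0, 0], [0, 0]]

# Hence 1/x = -(a/b) I - (1/b) x, provided b = det(x) != 0
inv = add(scal(-a / b, I), scal(Fraction(-1) / b, x))
assert mul(x, inv) == I and mul(inv, x) == I
```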

It is a well-known fact that given a quadratic field $F\subset M_2(\Bbb Q)$ there exists an element $u\in M_2(\Bbb Q)$ with $u^2\in\Bbb Q$ such that $$ M_2(\Bbb Q)=F\oplus Fu\quad\text{and}\quad u\lambda=\bar\lambda u, $$ where the bar denotes the non-trivial automorphism of $F$ (this is a special case of a much more general fact, the Skolem–Noether theorem). Then one sees that for all $\lambda$, $\mu\in F$ one has $$ \det(\lambda+\mu u)=\lambda\bar\lambda-\mu\bar\mu u^2. $$ Since $M_2(\Bbb Q)$ contains nonzero matrices of determinant $0$, there exist nonzero $x$, $y\in F$ such that $$ u^2=\frac{y\bar y}{x\bar x}. $$ Now suppose, for contradiction, that $S$ contains an element $z=\lambda+\mu u$ with $\mu\neq0$. If $\det(z)=0$ we get a contradiction as above, since then $z$ is a zero-divisor in $S$. But if $\det(z)\neq0$, let $$ z^\prime=(y\mu-\lambda x)+xz\in S. $$ It is straightforward to check that $z^\prime\neq0$ while $\det(z^\prime)=0$, and this is again a contradiction.
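The decomposition is easy to see in a concrete case. The sketch below is an illustration, not the general Skolem–Noether argument: it embeds $F=\Bbb Q(i)$ in $M_2(\Bbb Q)$ as the matrices $\begin{pmatrix}p&-q\\q&p\end{pmatrix}$ and takes $u=\operatorname{diag}(1,-1)$, so that $u^2=1\in\Bbb Q$, then checks the conjugation rule and the determinant formula on sample elements.

```python
from fractions import Fraction

def mul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def emb(p, q):   # the element p + q*i of F = Q(i), embedded in M_2(Q)
    return [[Fraction(p), Fraction(-q)], [Fraction(q), Fraction(p)]]

u = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(-1)]]   # u^2 = 1

lam, lam_bar = emb(2, 3), emb(2, -3)   # lambda = 2 + 3i and its conjugate

# u * lambda = conjugate(lambda) * u
assert mul(u, lam) == mul(lam_bar, u)

# det(lambda + mu u) = N(lambda) - N(mu) u^2, with N(p + qi) = p^2 + q^2 = det
mu = emb(1, 1)
mu_u = mul(mu, u)
z = [[lam[i][j] + mu_u[i][j] for j in range(2)] for i in range(2)]
assert det(z) == det(lam) - det(mu)    # here u^2 = 1
```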

Therefore $S=F$, which is a field.

Answer:

Hints.

  1. Use the Cayley–Hamilton theorem and the fact that $S$ is an integral domain to show that the only singular matrix in $S$ is the zero matrix.
  2. If $A\in S$ is nonsingular, use the Cayley–Hamilton theorem to express $A^{-1}$ as a polynomial in $A$. Hence show that $A^{-1}$ is also a member of $S$.

The fact that every scalar matrix is a member of $S$ is essential in proving both statements above. Once statements 1 and 2 are proved, $S$ is an integral domain in which every nonzero member has an inverse. Hence $S$ is a field.
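Hint 1 can be checked on a concrete singular matrix: when $\det(A)=0$, Cayley–Hamilton collapses to the factorization $A\,(A-\operatorname{tr}(A)I)=0$. The matrix $A$ below is an arbitrary illustrative choice.

```python
from fractions import Fraction

def mul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# An illustrative nonzero singular matrix A (det(A) = 0).
A = [[Fraction(1), Fraction(2)], [Fraction(2), Fraction(4)]]
tr = A[0][0] + A[1][1]

# Cayley-Hamilton with det(A) = 0 reduces to A (A - tr(A) I) = 0.
B = [[A[i][j] - (tr if i == j else 0) for j in range(2)] for i in range(2)]

# A and B are both nonzero, yet A B = 0: any subring containing A and the
# scalar matrices contains a zero divisor, so it cannot be an integral domain.
assert mul(A, B) == [[0, 0], [0, 0]]
```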

Answer:

Basically, your assumption means that $S$ is a $\mathbb{Q}$-subalgebra of $R$ which is an integral domain. Since $R$ is finite dimensional over $\mathbb{Q}$, so is $S$.

But we have the following theorem:

Thm. Let $K$ be a field, and let $A$ be a finite-dimensional commutative (unital, associative) $K$-algebra. If $A$ is an integral domain, then $A$ is a field.

Proof. Let $a\in A\setminus\{0\}$, and consider $\ell_a: x\in A\mapsto ax\in A$. Since $A$ is a $K$-algebra, this map is a $K$-linear endomorphism of the $K$-vector space $A$. Since $a\neq 0$ and $A$ is an integral domain, $\ell_a$ is injective. Since $A$ is finite dimensional, $\ell_a$ is also surjective. In particular, there exists $a'\in A$ such that $a'a=1$, hence $a$ is invertible and $A$ is a field.
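For a concrete instance of the proof, take $A=\Bbb Q(\sqrt2)$ (an illustrative choice): in the basis $(1,\sqrt2)$, left multiplication by $a=p+q\sqrt2$ has matrix $\begin{pmatrix}p&2q\\q&p\end{pmatrix}$, whose determinant $p^2-2q^2$ is nonzero exactly when $a\neq0$, and solving $\ell_a(x)=1$ recovers $a^{-1}$.

```python
from fractions import Fraction

# Left multiplication by a = p + q*sqrt(2) on A = Q(sqrt(2)), in the basis
# (1, sqrt(2)), has matrix [[p, 2q], [q, p]] with determinant p^2 - 2 q^2.
# Solving l_a(x) = 1 amounts to inverting that matrix.
def inverse_in_Q_sqrt2(p, q):
    p, q = Fraction(p), Fraction(q)
    d = p * p - 2 * q * q      # det of the matrix of l_a; nonzero iff a != 0
    assert d != 0, "a = 0 is not invertible"
    return (p / d, -q / d)     # coordinates of a^{-1} = (p - q*sqrt(2)) / d

# Example: (1 + sqrt(2))^{-1} = -1 + sqrt(2)
assert inverse_in_Q_sqrt2(1, 1) == (Fraction(-1), Fraction(1))
```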