Commutator of congruences of a ring


I am trying to prove the following fact that should be mostly trivial to see:

If $\alpha, \beta$ are congruences of a ring $R$, corresponding to ideals $I$ and $J$ respectively, then the commutator $[\alpha,\beta]$ is the congruence corresponding to the ideal $IJ + JI$.

I can't see why $\alpha$ centralizes $\beta$ over the congruence corresponding to this ideal and much less why it's a minimal such congruence.

My attempt so far: let $t$ be a term over $R$, i.e. a polynomial in some number of variables. If $x,y \in R$ are such that $x \alpha y$ and $w,v \in R^l$ are such that $w_i \beta v_i$ for all $i\le l$, then we need to show that $t(x,v) + (IJ+JI) = t(x,w) + (IJ+JI) \iff t(y,v) + (IJ+JI) = t(y,w) + (IJ+JI)$.

I know that $x + I = y + I$, and similarly for $v$ and $w$. My intuition is that because $t$ is a polynomial, i.e. a sum of monomials, I can argue monomial by monomial based on the variables each one starts with: if a monomial starts with $x$ or $y$, then we can remove that factor modulo $I$, but I don't know how to formalize this. In general, commutator theory is confusing me, so any help would be great.


The statement that $\alpha$ centralizes $\beta$ modulo $\gamma$ for $\alpha, \beta,\gamma\in \textrm{Con}(R)$ means that

$ t(a,c)\equiv_{\gamma}t(a,d)\Rightarrow t(b,c)\equiv_{\gamma}t(b,d) $

whenever $t(x,y)$ is an $(m+n)$-ary polynomial operation of $R$, the tuples $a, b\in R^m$ satisfy $a\equiv_{\alpha}b$, and the tuples $c, d\in R^n$ satisfy $c\equiv_{\beta}d$. To verify that this implication holds for some ring, one may replace the polynomial $t(x,y)$ with the translate $s(x,y)=t(x+a,y+c)-t(a,c)$ and replace the tuples $a, b, c,d$ with the translates $0=a-a, p = b-a, 0 = c-c, q = d-c$. Then the centralization implication becomes

$ s(0,0)\equiv_{\gamma}s(0,q)\Rightarrow s(p,0)\equiv_{\gamma}s(p,q) $

and moreover we have that $s(0,0)=t(0+a,0+c)-t(a,c)=0$. No generality is lost by this translation, since it is possible to reverse the procedure. (That is, a ring satisfies the first universally quantified implication iff it satisfies the second.) What is gained is that it becomes easier to discuss the centralization implication in terms of ideals if we use the translated form. (When $I$ is the ideal associated to $\alpha$, $a\equiv_{\alpha} b$ holds iff $0\equiv_{\alpha} b-a=:p$ holds iff $p\in I^m$.) Therefore, it makes sense to use the phrase $I$ centralizes $J$ modulo $K$ by using the translated form.
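As a quick sanity check of the translation step (a Python sketch with a hypothetical polynomial $t$ and arbitrary base points, purely for illustration), the translate $s$ always satisfies $s(0,0)=0$, and the procedure is reversible:

```python
# Sanity check of the translation s(x, y) = t(x + a, y + c) - t(a, c)
# in the ring Z, with a hypothetical polynomial t(x, y) = x*y + 3*x + y + 5.

def t(x, y):
    return x * y + 3 * x + y + 5

a, c = 7, -2  # arbitrary base points

def s(x, y):
    return t(x + a, y + c) - t(a, c)

print(s(0, 0))  # always 0, by construction
# Reversing the translation recovers t:
print(all(t(x, y) == s(x - a, y - c) + t(a, c)
          for x in range(-3, 4) for y in range(-3, 4)))
```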

Given this translation, let me restate the condition that $\alpha$ centralizes $\beta$ modulo $\gamma$ in terms of the associated ideals $I, J, K$. Say that $I$ centralizes $J$ modulo $K$ iff $ s(0,0)\equiv_{K}s(0,q)\Rightarrow s(p,0)\equiv_{K}s(p,q) $ whenever $s(x,y)$ is an $(m+n)$-ary polynomial operation satisfying $s(0,0)=0$, $p\in I^m$, $q\in J^n$. Using the assumption that $s(0,0)=0$ we may express this even more ring-theoretically as: $s(0,q)\in K$ implies that $s(p,q)-s(p,0)\in K$.

Given this translation, let the discussion begin. Suppose that $I$ centralizes $J$ modulo $K$. Let $s(x,y)=x\cdot y$. Observe that $s(0,0)=0\cdot 0=0$ for this polynomial, as required. Choose any $p\in I, q\in J$ and apply the centralization implication: we have $s(0,q)=0\in K$ so we must also have $s(p,q)-s(p,0)=s(p,q)=pq\in K$. Similarly, using the polynomial $s'(x,y)=y\cdot x$ instead, we get that $qp\in K$. In conclusion, we have that if $I$ centralizes $J$ modulo $K$, then $K$ must contain all products $pq, qp$ with $p\in I$ and $q\in J$, and therefore we must have $K\supseteq IJ+JI$.
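To make the forward direction concrete, here is a small Python check in the ring $\mathbb{Z}$ with the sample ideals $I = 2\mathbb{Z}$ and $J = 3\mathbb{Z}$ (a hypothetical choice, where $IJ + JI = 6\mathbb{Z}$): the products $pq$ forced into $K$ all lie in $6\mathbb{Z}$, and $6$ itself is among them.

```python
# In Z with I = 2Z and J = 3Z, the polynomial s(x, y) = x*y forces
# pq into K for all p in I, q in J, so K must contain IJ + JI = 6Z.

I = [2 * k for k in range(-5, 6)]   # finite sample of the ideal 2Z
J = [3 * k for k in range(-5, 6)]   # finite sample of the ideal 3Z

products = [p * q for p in I for q in J]
print(all(pq % 6 == 0 for pq in products))  # every pq lies in 6Z
print(6 in products)                        # and 6 itself is forced in
```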

Conversely, if $K\supseteq IJ+JI$, then we will see that $I$ centralizes $J$ modulo $K$. Let $s(x,y)$ be an arbitrary $(m+n)$-ary polynomial satisfying $s(0,0)=0$. The last condition means that the constant term of $s$ is $0$. Collect the monomials of $s$ that have $x$'s but no $y$'s into $u(x)$, the monomials of $s$ that have $y$'s but no $x$'s into $v(y)$, and the monomials of $s$ that have both $x$'s and $y$'s into $w(x,y)$. We may write $s(x,y)=u(x)+v(y)+w(x,y)$ where each of $u, v, w$ has zero constant term. Observe that $u(0)=v(0)=0=w(0,y)=w(x,0)$ for any $x, y\in R$. Now choose any $p\in I^m$ and $q\in J^n$. If the premise of the centralization implication holds, $s(0,q)\in K$, then we must have $v(q)=s(0,q)\in K$. The conclusion of the centralization implication is that $s(p,q)-s(p,0)\in K$. We must prove that this holds if the premise holds (i.e. if $v(q)\in K$). Observe that $s(p,q)-s(p,0)$ $=(u(p)+v(q)+w(p,q))-(u(p)+v(0)+w(p,0))$ $=v(q)+w(p,q)$. Thus, we must prove that $v(q)\in K$ implies $v(q)+w(p,q)\in K$. This is equivalent to proving that $v(q)\in K$ implies $w(p,q)\in K$.
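The decomposition $s = u + v + w$ and the resulting identity $s(p,q) - s(p,0) = v(q) + w(p,q)$ can be checked numerically. Here is a Python sketch with a hypothetical sample polynomial $s(x,y) = x^2 + 2y + xy$, chosen only for illustration:

```python
# Split a sample polynomial s(x, y) = x^2 + 2y + x*y (zero constant term)
# into u(x) (monomials in x alone), v(y) (monomials in y alone), and
# w(x, y) (mixed monomials), then verify s(p, q) - s(p, 0) = v(q) + w(p, q).

def u(x):
    return x * x          # monomials containing only x's

def v(y):
    return 2 * y          # monomials containing only y's

def w(x, y):
    return x * y          # monomials containing both x's and y's

def s(x, y):
    return u(x) + v(y) + w(x, y)

ok = all(s(p, q) - s(p, 0) == v(q) + w(p, q)
         for p in range(-4, 5) for q in range(-4, 5))
print(ok)  # the identity holds at every sample point, since v(0) = w(p, 0) = 0
```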

In fact, $w(p,q)\in K$ whether or not $v(q)\in K$. If $w_i(x,y)$ is a single monomial of $w(x,y)$, then both $x$'s and $y$'s appear in $w_i(x,y)$. If the leftmost variable to appear in $w_i(x,y)$ is an $x$, then $w_i(p,q)\in IJ$. If the leftmost variable in $w_i(x,y)$ is a $y$, then $w_i(p,q)\in JI$. The sum $w(p,q)$ of all these values belongs to $IJ+JI\subseteq K$, so $w(p,q)\in K$.
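The monomial argument survives noncommutativity. As a hedged Python check in the matrix ring $M_2(\mathbb{Z})$, take the sample ideals $I = M_2(2\mathbb{Z})$ and $J = M_2(3\mathbb{Z})$ (a hypothetical choice, with $IJ + JI \subseteq M_2(6\mathbb{Z})$); a mixed monomial such as $w_i(x,y) = xyx$ then evaluates into $M_2(6\mathbb{Z})$:

```python
# In M_2(Z) with I = M_2(2Z) and J = M_2(3Z), a mixed monomial like
# x*y*x evaluated at p in I, q in J has every entry divisible by 6,
# i.e. it lies in IJ + JI (here contained in M_2(6Z)).

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

p = [[2, 4], [-2, 6]]   # an element of I = M_2(2Z)
q = [[3, -3], [9, 0]]   # an element of J = M_2(3Z)

m = matmul(matmul(p, q), p)   # the monomial w_i(p, q) = p*q*p
print(all(m[i][j] % 6 == 0 for i in range(2) for j in range(2)))
```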

I close by remarking that this argument does not use the associativity of $R$, but it does use the fact that multiplication distributes over addition. Thus, the same result holds for not-necessarily-associative rings and $k$-algebras.