The invertibility of matrix in Banach algebra


Let $\mathcal{R}$ be a Banach algebra with identity element and let $a = (a_{jk})_{j,k=1}^n \in \mathcal{R}^{n \times n}$ be a matrix whose entries $a_{jk} \in \mathcal{R}$ commute pairwise. Does the invertibility of $a$ imply that $\det a \in \mathcal{R}$ is invertible?

The proof is as below:

Now suppose $a$ is invertible in $\mathcal{R}^{n\times n}$, and let $a^{-1} = c = (c_{jk})_{j,k=1}^n \in \mathcal{R}^{n \times n}$. It suffices to prove that the elements $c_{jk}$ commute pairwise and that they commute with all the $a_{jk}$, for then the identity $e = \det (a^{-1} a) = \det a^{-1} \det a$ holds (multiplicativity of $\det$ requires the entries to commute), so $\det a$ is invertible in $\mathcal{R}$.

Let $\zeta$ denote the set of all commutative subalgebras of the algebra $\mathcal{R}$ containing all entries $a_{jk}$ of the matrix $a$. The set $\zeta$ is partially ordered by inclusion. Thus, by Zorn's lemma, $\zeta$ contains at least one maximal element $\mathcal{U}$. The commutative subalgebra $\mathcal{U} \subseteq \mathcal{R}$ then possesses the following property: if $a \in \mathcal{R}$ and $ax = xa$ for all $x \in \mathcal{U}$, then $a \in \mathcal{U}$. Since $a_{jk} \in \mathcal{U}$, for every $x \in \mathcal{U}$ the equalities $a^{-1} x = a^{-1} x a a^{-1} = a^{-1} a x a^{-1} = x a^{-1}$ hold (here $x$ is identified with the diagonal matrix $\operatorname{diag}(x,\dots,x)$, which commutes with $a$ entrywise), and hence $\mathbf{xc_{jk} = c_{jk} x}$ $(j,k = 1, \dots, n)$ for every $x \in \mathcal{U}$. This implies that $c_{jk} \in \mathcal{U}$.

Note: My doubt is about the bold equation: why does $c_{jk}$ satisfy it? Without the invertibility of $\det a \in \mathcal{R}$, how can we see that $c_{jk}$ can be represented as a finite sum of products of the elements $a_{jk}$, $j,k = 1,\dots,n$, and their inverses, as in linear algebra?

Note: This is Step 2 of [Mikhlin–Prössdorf, *Singular Integral Operators*, p. 114, Lemma 1.1].


There are 2 answers below.

**Best answer**

The equalities $xc_{jk}=c_{jk}x$ follow immediately from the previous sentence in which it is shown that $a^{-1}x=xa^{-1}$. The $(j,k)$ entry of $a^{-1}x$ is $c_{jk}x$ and the $(j,k)$ entry of $xa^{-1}$ is $xc_{jk}$.

(Incidentally, the use of Zorn's lemma in the proof is rather ridiculous overkill. You can just directly use this argument with $x=a_{jk}$ to show the entries of $a$ commute with the entries of $a^{-1}$, and then use it again with $x=c_{jk}$ to show the entries of $a^{-1}$ commute with each other.)
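As a concrete sanity check of this argument (not part of the original answer), one can take $\mathcal{R}$ to be the algebra of real $2\times 2$ matrices and pick pairwise commuting entries, e.g. polynomials in a single fixed matrix $m$; the matrix $m$ and the coefficients below are made up for illustration:

```python
import numpy as np

# R = algebra of real 2x2 matrices; the entries a_jk are polynomials in a
# fixed matrix m, so they commute pairwise (a hypothetical concrete example).
m = np.array([[1.0, 2.0], [3.0, 4.0]])
I = np.eye(2)

a11 = 2 * I + m
a12 = I - m
a21 = 3 * I + 2 * m
a22 = I + 0.5 * m

# Assemble the matrix a in R^{2x2} as a 4x4 real block matrix and invert it.
a = np.block([[a11, a12], [a21, a22]])
c = np.linalg.inv(a)

# Extract the blocks c_jk of a^{-1}.
c11, c12 = c[:2, :2], c[:2, 2:]
c21, c22 = c[2:, :2], c[2:, 2:]

# The entries of a^{-1} commute with the entries of a, and with each other.
blocks_a = [a11, a12, a21, a22]
blocks_c = [c11, c12, c21, c22]
for x in blocks_a + blocks_c:
    for y in blocks_c:
        assert np.allclose(x @ y, y @ x)

# det a = a11 a22 - a12 a21 (well defined since the entries commute);
# its inverse in R is det(a^{-1}) = c11 c22 - c12 c21.
det_a = a11 @ a22 - a12 @ a21
det_c = c11 @ c22 - c12 @ c21
assert np.allclose(det_a @ det_c, I)
```

The final assertion is exactly the identity $e = \det(a^{-1})\det(a)$ from the lemma, verified numerically for this example.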

**Another answer**

This is indeed much more subtle than I originally, naively, thought. In the current statement of the problem, it is confusing that $a$ denotes both the matrix in $\mathcal R^{n\times n}$ and a generic element of $\mathcal R$ in different places. Anyway, here is how I solved it.

Claim: If $x\in\mathcal R$ commutes with all entries $a_{ij}$ of $a$, then it also commutes with all entries $c_{ij}$ of $a^{-1}$. In particular, all entries of $a$ commute with the $c_{ij}$'s.

Proof. Let $X=\operatorname{diag}(x, \dots, x)\in\mathcal R^{n\times n}$. Note that $Xa^{-1}a = X = a^{-1}aX = a^{-1} X a$, so $Xa^{-1}a = a^{-1} Xa$; multiplying by $a^{-1}$ on the right gives $Xa^{-1} = a^{-1}X$. Now it is straightforward to check that $xc_{ij} = c_{ij}x$. (We have avoided the scalar multiplication notation for matrices, as the scalar might not commute with the entries.)

In particular, since each $c_{ij}$ commutes with all the $a_{ij}$'s, it must commute with all of the entries of $a^{-1}$; that is, the $c_{ij}$'s commute with each other (this is the part I struggled with). Therefore the $a_{ij}$'s and $c_{ij}$'s form a pairwise commutative family of elements. Now we may use $\det(ab)=\det(a)\det(b)$ in a commutative ring to deduce that $\det(a)$ is invertible in $\mathcal R$.

The subtlety here, in my opinion, is the following. For example, we could have a matrix $a\in \mathbb Z^{n\times n}$ which is not invertible over $\mathbb Z$ but becomes invertible over $\mathbb Q$, simply because $\mathbb Q$ contains $\det(a)^{-1}$. This is well understood in the context of commutative ring extensions. However, suppose we have a unital ring extension $R\subset S$ where $R$ is commutative and $S$ is not. Then the invertibility of $a\in R^{n\times n}$ in $S^{n\times n}$ does not a priori imply that $S$ contains $\det(a)^{-1}$ and that $a^{-1} = \det(a)^{-1}\operatorname{adj}(a)$, where $\operatorname{adj}(a)$ is the adjugate matrix of $a$. The above argument shows that this is in fact the case.
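The $\mathbb Z$-versus-$\mathbb Q$ phenomenon mentioned above can be sketched in a few lines; the specific $2\times 2$ matrix is a made-up example with $\det a = 2$:

```python
from fractions import Fraction

# A 2x2 integer matrix with det = 2: not invertible over Z, but
# invertible over Q, precisely because Q contains det(a)^{-1} = 1/2.
a = [[1, 1], [1, 3]]
det_a = a[0][0] * a[1][1] - a[0][1] * a[1][0]       # = 2
adj_a = [[a[1][1], -a[0][1]], [-a[1][0], a[0][0]]]  # adjugate matrix

# a^{-1} = det(a)^{-1} * adj(a), computed over Q.
inv_a = [[Fraction(adj_a[i][j], det_a) for j in range(2)] for i in range(2)]

# Verify that a * inv_a is the identity over Q.
prod = [[sum(a[i][k] * inv_a[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
assert prod == [[1, 0], [0, 1]]
```

The entries of $a^{-1}$ lie in $\mathbb Q$ but not in $\mathbb Z$, which is exactly why invertibility of the matrix over a larger ring does not automatically stay within the smaller one.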

Incidentally, this turns out to be purely algebraic, as expected: it has little to do with the norm or any analysis.