In the paper A Formula for the Determinant of a Sum of Matrices by Reutenauer and Schützenberger, the authors show a way of computing the coefficients of the characteristic polynomial of $X_1 + X_2 + \ldots + X_k$ in terms of the coefficients of the characteristic polynomials of the individual matrices $X_1, X_2, \ldots, X_k$. (Here all matrices are square and have the same size.)
The proof involves the following identity that I am trying to understand: $$ \det \left( 1 - X_1 - \ldots - X_k \right) = \prod_l \det \left( 1 - l \right), $$ where the product runs over all Lyndon words $l$ in the alphabet $\{X_1, \ldots, X_k\}$. In particular, the right-hand side is an infinite product. The proof involves the algebra of power series. (See here a related question about that proof.)
I am trying to understand how this is even possible. On the left-hand side of the equation we have a single polynomial, which has finitely many roots. On the right-hand side we have a product of infinitely many polynomials, and thus (in general) infinitely many roots.
I have tried understanding this formula using a limit of the right-hand side. Define $p_n = \prod_{l \in L_n} \det \left( 1 - l \right)$, where $L_n$ is the set of Lyndon words of length at most $n$. Then, we can write $$ \det \left( 1 - X_1 - \ldots - X_k \right) = \lim_{n \to \infty} p_n. $$ However, I don't think this interpretation is correct, because of the following. Take an arbitrary Lyndon word $l_0$ and let $z$ be a root of $\det (1 - l_0)$. Then $p_n(z) = 0$ for all $n$ sufficiently large. This would imply that each root of each polynomial in the product is a root of $\det \left( 1 - X_1 - \ldots - X_k \right)$.
The only way that I can make sense of all of the above is that one of the following must be true:
- Each of the roots of $\det \left( 1 - X_1 - \ldots - X_k \right)$ appears infinitely many times as a root of the polynomials $\det\left( 1 - l \right)$. This seems highly non-trivial and I cannot find a proof of it.
- I am using the wrong definition of limit. Perhaps I am using the wrong topology on the ring of power series.
Any help in making sense of this formula is appreciated. Thanks!
I think that intuition is not a good guide when dealing with infinite products, even infinite products of polynomials. Just because $\alpha$ is a root of $p_0(X)$ this does not mean that $\alpha$ is a zero of $\prod_{n=0}^\infty p_n(X)$.
So (following your comment) let us look at $$ \det \left(1-t(X_1+\dots+X_n)\right)=\prod_{\ell} \det(1- t^{w(\ell)}\ell)\tag{*} $$ where the product runs over all Lyndon words $\ell$, and $w(\ell)$ is the total degree (length) of $\ell$.
Let us look at what your argument is saying in the simplest non-trivial case: $n=2$, the underlying space has dimension $1$, and $X_1=X_2=1$. That is, we are looking at $$ 1-2t=\prod_{\ell}(1-t^{w(\ell)}).\tag{**} $$
This really is true: let's verify the first few terms. The Lyndon words on $X,Y$ are $X,Y,XY,X^2Y,XY^2,\dots$, so the RHS of $(**)$, keeping only the factors that contribute up to degree $3$, is $$ (1-t)(1-t)(1-t^2)(1-t^3)(1-t^3), $$ which expands to $$ 1-2t+O(t^4). $$
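If it helps to see this numerically, here is a quick sketch (my own code, not from the paper) that carries the same check further: it generates all Lyndon words of length at most $N$ with Duval's algorithm and multiplies out the corresponding factors $1-t^{w(\ell)}$, truncating everything mod $t^{N+1}$. Words of length greater than $N$ contribute $1 + O(t^{N+1})$, so they can be dropped.

```python
def lyndon_words(k, n):
    """Duval's algorithm: all Lyndon words over {0, ..., k-1} of length <= n."""
    w = [-1]
    while w:
        w[-1] += 1
        m = len(w)
        yield tuple(w)
        while len(w) < n:            # extend w periodically to length n
            w.append(w[len(w) - m])
        while w and w[-1] == k - 1:  # strip trailing maximal letters
            w.pop()

def mul_trunc(a, b, N):
    """Multiply two polynomials (coefficient lists), truncated mod t^N."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

# Scalar case X = Y = 1: each factor det(1 - t^{w(l)} l) is just 1 - t^{len(l)}.
N = 10
rhs = [1] + [0] * N
for word in lyndon_words(2, N):
    factor = [0] * (N + 1)
    factor[0] = 1
    factor[len(word)] -= 1           # the factor 1 - t^{len(word)}
    rhs = mul_trunc(rhs, factor, N + 1)

print(rhs)  # [1, -2, 0, 0, ..., 0]: agrees with 1 - 2t mod t^{N+1}
```

All the higher coefficients cancel, exactly as the $t$-adic reading of $(**)$ predicts.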
However, it is surely the case that $t=1$ is a root of every "factor" on the RHS, whereas the only root of the LHS is $t=\frac{1}{2}$.
Your related question (which is answered by the uniqueness of the standard factorisation of a monomial into Lyndon words) allows us, by taking inverses, to prove the analogue of $(*)$ without the determinants. Using the fact that $\det$ is multiplicative, and that it is continuous in the $t$-adic topology, we then get a proof of $(*)$.
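For what it's worth, $(*)$ can also be checked numerically for genuine matrices. The sketch below (again my own code; the two $2\times 2$ integer matrices are an arbitrary choice) forms $\det(1 - t^{w(\ell)}\ell)$ for every Lyndon word $\ell$ of length at most $N$, using $\det(1 - sA) = 1 - \operatorname{tr}(A)\,s + \det(A)\,s^2$ for a $2\times 2$ matrix $A$, and compares the truncated product with $\det(1 - t(X_1 + X_2))$.

```python
def lyndon_words(k, n):
    """Duval's algorithm: all Lyndon words over {0, ..., k-1} of length <= n."""
    w = [-1]
    while w:
        w[-1] += 1
        m = len(w)
        yield tuple(w)
        while len(w) < n:
            w.append(w[len(w) - m])
        while w and w[-1] == k - 1:
            w.pop()

def mul_trunc(a, b, N):
    """Multiply two polynomials (coefficient lists), truncated mod t^N."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

def mat_mul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][r] * B[r][j] for r in range(2)) for j in range(2)]
            for i in range(2)]

# Two arbitrary 2x2 integer matrices (my choice, purely for the check).
X = [[[1, 1], [0, 1]],
     [[1, 0], [1, 1]]]

N = 8  # verify the identity mod t^{N+1}
rhs = [1] + [0] * N
for word in lyndon_words(2, N):
    M = [[1, 0], [0, 1]]
    for letter in word:              # the matrix named by the word
        M = mat_mul(M, X[letter])
    # det(1 - t^L M) = 1 - tr(M) t^L + det(M) t^{2L} for 2x2 M
    L = len(word)
    factor = [0] * (N + 1)
    factor[0] = 1
    factor[L] -= M[0][0] + M[1][1]
    if 2 * L <= N:
        factor[2 * L] += M[0][0] * M[1][1] - M[0][1] * M[1][0]
    rhs = mul_trunc(rhs, factor, N + 1)

# Left-hand side: det(1 - t(X_1 + X_2)) = 1 - tr(S) t + det(S) t^2
S = [[X[0][i][j] + X[1][i][j] for j in range(2)] for i in range(2)]
lhs = [1, -(S[0][0] + S[1][1]),
       S[0][0] * S[1][1] - S[0][1] * S[1][0]] + [0] * (N - 2)

print(lhs == rhs)  # True: both sides agree mod t^{N+1}
```

Raising $N$ only adds factors that are $\equiv 1$ to higher and higher order, which is exactly what $t$-adic convergence of the infinite product means.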