If $A$ is a real symmetric matrix with full rank and only simple roots, what can be said about $|B\otimes I+I\otimes B|$, for $B = \sum_k a_k A^k$?


I'm considering a situation in which $A, B > 0$ are real $(p\times p)$ symmetric matrices where $A$ has $p$ distinct eigenvalues $\{\lambda_i\}_{i=1}^p$, and $AB=BA$. I'm interested in the level of constraint placed on the matrix $B$ given these assumptions.

For example, according to the Wikipedia article on commuting matrices, since $A$ has distinct eigenvalues its minimal polynomial coincides with its characteristic polynomial, so any $B$ that commutes with $A$ is a polynomial in $A$: $$B = \sum_{k=0}^{p-1} \alpha_k A^k.$$
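As a numerical sanity check (not part of the question itself), one can build a commuting pair sharing an eigenbasis and recover the coefficients $\alpha_k$ by solving a Vandermonde system; the specific eigenvalues below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4

# Build A, B > 0 sharing an eigenbasis, so that AB = BA.
Q, _ = np.linalg.qr(rng.standard_normal((p, p)))
lam = np.array([1.0, 2.0, 3.5, 5.0])      # distinct eigenvalues of A
delta = np.array([0.5, 1.5, 2.0, 4.0])    # eigenvalues of B
A = Q @ np.diag(lam) @ Q.T
B = Q @ np.diag(delta) @ Q.T

# Solve the Vandermonde system sum_k alpha_k lam_i^k = delta_i.
V = np.vander(lam, p, increasing=True)
alpha = np.linalg.solve(V, delta)

# Verify B = sum_k alpha_k A^k.
B_poly = sum(a * np.linalg.matrix_power(A, k) for k, a in enumerate(alpha))
assert np.allclose(B, B_poly)
assert np.allclose(A @ B, B @ A)
```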

While I would be glad to hear more generally about the constraints placed on the matrix $B$, my particular question concerns the determinant (denoted by $|\cdot|$) of the Kronecker sum (denoted by $\oplus$):

\begin{equation} \begin{aligned} |B\oplus B| = |B\otimes I + I \otimes B| &\propto \prod_{i \leq j} (\delta_i + \delta_j)\\ &= \prod_{i\leq j} \left(\sum_k \alpha_k (\lambda_i^k + \lambda_j^k)\right), \end{aligned} \end{equation} where $\propto$ denotes proportionality with respect to the arguments of interest, $\otimes$ denotes the Kronecker product, and $\{\delta_i\}_{i=1}^p$ are the eigenvalues of $B$. (The eigenvalues of $B\oplus B$ are $\delta_i + \delta_j$ over all ordered pairs $(i,j)$, so the determinant is exactly $\prod_{i,j=1}^p (\delta_i + \delta_j)$; the restriction to $i \leq j$ keeps one copy of each distinct factor.)
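A quick numerical check of the eigenvalue formula for the Kronecker sum, with arbitrary eigenvalues for $B$:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 3
Q, _ = np.linalg.qr(rng.standard_normal((p, p)))
delta = np.array([0.7, 1.3, 2.9])     # eigenvalues of B
B = Q @ np.diag(delta) @ Q.T
I = np.eye(p)

K = np.kron(B, I) + np.kron(I, B)     # Kronecker sum B (+) B

# det(B (+) B) is the product of delta_i + delta_j over all ordered pairs.
det_exact = np.prod([di + dj for di in delta for dj in delta])
assert np.isclose(np.linalg.det(K), det_exact)
```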

The question is: for $\vec{\alpha} = (\alpha_0,\ldots,\alpha_{p-1}) \in \mathbb{R}^p$ and $\vec{\lambda} = (\lambda_1,\ldots,\lambda_{p}) \in \mathbb{R}^p$, do there exist functions $h(\vec{\alpha})$ and $g(\vec{\lambda})$ such that either

$$|B\oplus B| = h(\vec{\alpha}) + g(\vec{\lambda})$$

or

$$|B\oplus B| = h(\vec{\alpha})\, g(\vec{\lambda})?$$

I am also interested in decoupling $\vec{\lambda}$ from any form of "additional information" given by the polynomial. For example, if $B$ is expressed in a factored form:

$$B = \prod_{l=1}^{L} (A - c_lI)^{n_l},$$

so that the eigenvalues of $B$ are $\delta_i = \prod_{l=1}^L (\lambda_i - c_l)^{n_l}$, giving

$$|B\oplus B| \propto \prod_{i\leq j}\left( \prod_{l=1}^L (\lambda_i - c_l)^{n_l} + \prod_{l=1}^L (\lambda_j - c_l)^{n_l} \right)$$

do such functions exist for $\vec{c}$ and $\vec{\lambda}$, respectively?
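The eigenvalue mapping used in the factored form can also be verified numerically; the shifts $c_l$ and multiplicities $n_l$ below are arbitrary choices (picked so that $B > 0$):

```python
import numpy as np
from functools import reduce

rng = np.random.default_rng(2)
p = 3
Q, _ = np.linalg.qr(rng.standard_normal((p, p)))
lam = np.array([1.0, 2.0, 4.0])     # distinct eigenvalues of A
A = Q @ np.diag(lam) @ Q.T
I = np.eye(p)

c = np.array([-1.0, -3.0])          # shifts c_l (chosen so B > 0)
n = np.array([1, 2])                # multiplicities n_l

# B = prod_l (A - c_l I)^{n_l}; the factors commute, so B is symmetric,
# and its eigenvalues are prod_l (lam_i - c_l)^{n_l}.
B = reduce(np.matmul,
           [np.linalg.matrix_power(A - cl * I, nl) for cl, nl in zip(c, n)])
delta = np.prod((lam[:, None] - c[None, :]) ** n, axis=1)
assert np.allclose(np.sort(np.linalg.eigvalsh(B)), np.sort(delta))
```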

Thanks for your consideration.

Update:

I have worked further on the problem. Using the shorthand $p_{kij}$ for the power-sum symmetric polynomial $p_k(\lambda_i,\lambda_j) = \lambda_i^k + \lambda_j^k$, the determinant above can be rewritten and expanded:

\begin{equation} \begin{aligned} \prod_{i\leq j}\left(\sum_{k} \alpha_k(\lambda_i^k + \lambda_j^k) \right) &= \prod_{i\leq j}\left(\sum_{k} \alpha_kp_{kij}\right). \end{aligned} \end{equation}

I note that if all $p_{kij} = 1$, the expansion has the same monomials as the complete homogeneous symmetric polynomial $h_N(\alpha_0,\ldots,\alpha_{p-1})$ of degree $N = \binom{p+1}{2}$, the number of factors in the product. Even when the $p_{kij}$ differ, the monomials in the $\alpha_k$ remain the same; only their coefficients change. I therefore consider a vector $h(\vec{\alpha})$ whose elements are the monomials of $h_N(\alpha_0,\ldots,\alpha_{p-1})$. Then

\begin{equation} \begin{aligned} \prod_{i\leq j}\left(\sum_{k} \alpha_kp_{kij}\right) &= h(\vec{\alpha})^\top g(\vec{\lambda}), \end{aligned} \end{equation}

where $g(\vec{\lambda})$ has elements built from the corresponding products of the $p_{kij}$. The geometric form of the dot product still suggests dependence via

$$\vec{h}^\top\vec{g} = \|\vec{h}\|\|\vec{g}\|\cos(\theta),$$

so I am unsure one can eliminate the dependence without an extreme assumption such as orthogonality. I have re-tagged this question to include symmetric polynomials in case I can pick the brain of anyone from that area.
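For $p = 2$ the separation $\prod_{i\leq j}(\sum_k \alpha_k p_{kij}) = h(\vec{\alpha})^\top g(\vec{\lambda})$ can be checked by hand: the three factors (over pairs $(1,1), (1,2), (2,2)$) are each linear in $\alpha_0, \alpha_1$, so $h(\vec{\alpha})$ collects the four degree-3 monomials and $g(\vec{\lambda})$ the matching elementary-symmetric combinations of the $p_{1ij}$ (a worked expansion for this small case, not a general construction):

```python
import numpy as np

lam = np.array([1.5, 4.0])       # lambda_1, lambda_2
alpha = np.array([0.7, 2.0])     # alpha_0, alpha_1

# The three factors, indexed by pairs (i, j) with i <= j.
# Note p_{0ij} = 2 and p_{1ij} = lam_i + lam_j.
pairs = [(0, 0), (0, 1), (1, 1)]
q = [lam[i] + lam[j] for i, j in pairs]
prod = np.prod([2 * alpha[0] + alpha[1] * qm for qm in q])

# Separated form: h(alpha) lists the degree-3 monomials in alpha;
# g(lam) holds the matching elementary-symmetric combinations of the q's.
h = np.array([alpha[0]**3, alpha[0]**2 * alpha[1],
              alpha[0] * alpha[1]**2, alpha[1]**3])
e1 = q[0] + q[1] + q[2]
e2 = q[0]*q[1] + q[0]*q[2] + q[1]*q[2]
e3 = q[0]*q[1]*q[2]
g = np.array([8.0, 4 * e1, 2 * e2, e3])
assert np.isclose(prod, h @ g)
```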