A question about equivalence of matrices with prescribed off-diagonal elements


I am reading the paper "Matrices with prescribed off-diagonal elements" and have a question.

Let $A=(a_{ij})_{n\times n}\in M_n(\mathbb{K})$, where $\mathbb{K}$ is an algebraically closed field.

Suppose that the $n^2-n$ off-diagonal elements are prescribed. Let $x_i = a_{ii}$ for $i = 1,\cdots,n$, and denote the $n$ elementary symmetric polynomials of $x_1,\cdots,x_n$ by $\sigma_1,\cdots,\sigma_n$.

Let $\lambda_1,\lambda_2,\cdots,\lambda_n$ be prescribed eigenvalues of $A$. The author says the existence of such a matrix $A$ is equivalent to the solvability of the $n$ polynomial equations $$\sigma_i(x_1,\cdots,x_n)+R_i(x_1,\cdots,x_n)=\sigma_i(\lambda_1,\cdots,\lambda_n),\qquad i=1,\cdots,n,\tag{$*$}$$

where the degree of each $R_i(x_1,\cdots,x_n)$ is smaller than $i$ (in fact, for $i \ge 2$ the degree of each $R_i$ is at most $i-2$).

I understand that we want to find a matrix $A$ for which the off-diagonal elements have already been determined, and we want to determine the $n$ diagonal elements of $A$ such that the eigenvalues of $A$ are $\lambda_1,\cdots,\lambda_n$.

But I want to know how to understand the equivalence between the existence of such a matrix $A$ and the solvability of the $n$ polynomial equations $(*)$.

Any help would be appreciated. Thanks!



On BEST ANSWER

This can be seen by computing the characteristic polynomial $p(x)$ of $A$ in two different ways. On the one hand, since the eigenvalues of $A$ are the roots of $p(x)$, we know that $p(x) = \prod_k (x - \lambda_k)$. Hence the coefficient of $x^{n-i}$ in $p(x)$ is $(-1)^i \sigma_i(\lambda_1,\ldots,\lambda_n)$. On the other hand, it is well-known that the coefficient of $x^{n-i}$ in $p(x)$ is $(-1)^i E_{i}$, where $E_i$ denotes the sum of all determinants of $i \times i$ principal submatrices of $A$ (i.e. the sum of all $i \times i$ principal minors). Let's compute $E_i$ explicitly.
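This coefficient identity can be checked numerically. The following sketch (using sympy, with an arbitrary $3\times 3$ integer matrix chosen purely for illustration) compares the coefficient of $x^{n-i}$ in the characteristic polynomial with $(-1)^i E_i$, where $E_i$ is computed directly as the sum of all $i \times i$ principal minors:

```python
import itertools
import sympy as sp

# Sanity check: for a 3x3 matrix, the coefficient of x^(n-i) in the
# characteristic polynomial det(x*I - A) equals (-1)^i * E_i, where
# E_i is the sum of all i x i principal minors of A.
x = sp.symbols('x')
n = 3
A = sp.Matrix([[2, 1, 0], [3, -1, 4], [1, 5, 2]])  # arbitrary example

p = A.charpoly(x)  # monic characteristic polynomial det(x*I - A)

for i in range(1, n + 1):
    # Sum the determinants of all principal i x i submatrices.
    E_i = sum(A[list(rows), list(rows)].det()
              for rows in itertools.combinations(range(n), i))
    coeff = p.as_expr().coeff(x, n - i)
    assert coeff == (-1) ** i * E_i
print("coefficient identity verified for n =", n)
```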

Consider an arbitrary principal $i \times i$ submatrix of $A$, say the rows and columns are indexed by $j_1 < \cdots < j_i$. The determinant of this submatrix is $$\sum_{\sigma \in S_i} \text{sgn}(\sigma) \prod_r a_{j_r,j_{\sigma(r)}} = \prod_{r} x_{j_r} + \sum_{\sigma \neq \text{id}} \text{sgn}(\sigma) \prod_r a_{j_r,j_{\sigma(r)}}.$$ Summing over all $\binom{n}{i}$ possible choices of $j_1,\ldots,j_i$ then gives $$E_i = \sigma_i(x_1,\ldots,x_n) + R_i(x_1,\ldots,x_n)$$ where, explicitly, $R_i = \sum_{j_1 < \cdots < j_i} \sum_{\sigma \neq \text{id}} \text{sgn}(\sigma) \prod_{r} a_{j_r,j_{\sigma(r)}}$. If $\sigma \in S_i$ is not the identity, then $\sigma$ has at most $i-2$ fixed points, so at most $i-2$ of the terms $a_{j_r,j_{\sigma(r)}}$ are equal to some $x_k$, so $R_i$ has degree at most $i-2$ when $i \ge 2$.
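The degree bound on $R_i$ can also be verified symbolically. A minimal sketch for $n = 3$, treating the diagonal entries $x_1, x_2, x_3$ and the off-diagonal entries as symbols, computes $R_i = E_i - \sigma_i(x_1,\ldots,x_n)$ and checks that its total degree in the $x$'s is at most $i-2$:

```python
import itertools
import sympy as sp

# Check deg R_i <= i - 2 (in the diagonal variables) for n = 3, i >= 2,
# with fully symbolic diagonal and off-diagonal entries.
n = 3
xs = sp.symbols(f'x1:{n+1}')
a = {(j, k): sp.Symbol(f'a{j}{k}')
     for j in range(n) for k in range(n) if j != k}
A = sp.Matrix(n, n, lambda j, k: xs[j] if j == k else a[(j, k)])

for i in range(2, n + 1):
    E_i = sum(A[list(rows), list(rows)].det()
              for rows in itertools.combinations(range(n), i))
    sigma_i = sum(sp.prod(c) for c in itertools.combinations(xs, i))
    R_i = sp.expand(E_i - sigma_i)
    # Total degree of R_i viewed as a polynomial in x1, x2, x3 only.
    deg = sp.Poly(R_i, *xs).total_degree()
    assert deg <= i - 2
    print(f"deg R_{i} in the x's = {deg} <= {i - 2}")
```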

To conclude, we equate the coefficients of the two expressions for $p(x)$, which gives $E_i = \sigma_i(\lambda_1,\ldots,\lambda_n)$ for each $i$. Hence a matrix $A$ with the prescribed off-diagonal entries and eigenvalues $\lambda_1,\ldots,\lambda_n$ exists if and only if the system $$\sigma_i(x_1,\ldots,x_n) + R_i(x_1,\ldots,x_n) = \sigma_i(\lambda_1,\ldots,\lambda_n) \qquad (i=1,\ldots,n)$$ has a solution $(x_1,\ldots,x_n)$, i.e. if and only if $(*)$ is solvable.
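Here is a worked instance of the equivalence for $n = 2$, with arbitrary illustrative choices of off-diagonal entries and eigenvalues. For $n = 2$ the system $(*)$ reads $x_1 + x_2 = \lambda_1 + \lambda_2$ and $x_1 x_2 - a_{12}a_{21} = \lambda_1\lambda_2$ (so $R_1 = 0$ and $R_2 = -a_{12}a_{21}$); solving it and plugging the solutions back into $A$ recovers the prescribed eigenvalues:

```python
import sympy as sp

# Worked n = 2 instance of (*): prescribe off-diagonal entries and
# eigenvalues (arbitrary illustrative values), solve for the diagonal.
x1, x2 = sp.symbols('x1 x2')
a12, a21 = 3, 2   # prescribed off-diagonal entries
l1, l2 = 1, 6     # prescribed eigenvalues

eqs = [
    sp.Eq(x1 + x2, l1 + l2),              # sigma_1 equation (R_1 = 0)
    sp.Eq(x1 * x2 - a12 * a21, l1 * l2),  # sigma_2 equation (R_2 = -a12*a21)
]
for sol in sp.solve(eqs, [x1, x2], dict=True):
    A = sp.Matrix([[sol[x1], a12], [a21, sol[x2]]])
    assert set(A.eigenvals()) == {l1, l2}
    print(sol, "-> eigenvalues", sorted(A.eigenvals()))
```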