Minimal Number of Polynomials to Determine a $2 \times 3$ Matrix up to Row Exchanges


Suppose I have $2 \times 3$ matrix of real variables $$ X=\begin{bmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \end{bmatrix}.$$

I am interested in finite sequences of polynomials $p_1(X), p_2(X),\dots, p_N(X)$ (in the entries of $X$) that determine $X$ precisely up to row exchanges. In particular, I am looking for such sequences with the fewest terms $N_\text{min}$.

My attempt:

Reasonable candidates are the elementary multisymmetric functions, which arise via expanding $$(1+x_1 x+y_1 y+z_1 z)(1+x_2 x+y_2 y+z_2 z) =\sum_\alpha e_{\alpha}(X) x^{\alpha_1} y^{\alpha_2} z^{\alpha_3}. $$ Specifically, we get $$ \begin{align} &1\\ &+(x_1+x_2) x+(y_1+y_2)y+(z_1+z_2)z \\ &+(x_1 x_2)x^2+(x_1 y_2+x_2 y_1) x y +(x_1 z_2+x_2 z_1) x z+(y_1 y_2) y^2+(y_1 z_2+y_2 z_1)yz+(z_1z_2)z^2, \end{align}$$ with $N=9$ elementary multisymmetric functions $$\begin{align} e_{(1,0,0)}&=x_1+x_2, \\ e_{(0,1,0)}&=y_1+y_2, \\ e_{(0,0,1)}&=z_1+z_2, \\ e_{(2,0,0)}&=x_1 x_2, \\ e_{(1,1,0)}&=x_1 y_2 + x_2 y_1, \\ e_{(1,0,1)}&= x_1 z_2+x_2 z_1, \\ e_{(0,2,0)}&=y_1 y_2, \\ e_{(0,1,1)}&=y_1 z_2+y_2 z_1, \\ e_{(0,0,2)}&= z_1 z_2. \end{align} $$
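As a sanity check, the expansion above can be carried out mechanically by multiplying out the two factors over exponent tuples. The sketch below (the helper name `multisym_coeffs` is my own) confirms there are exactly nine nonconstant coefficients and that they are invariant under exchanging the rows:

```python
from itertools import product

def multisym_coeffs(row1, row2):
    """Expand (1 + x1*x + y1*y + z1*z)(1 + x2*x + y2*y + z2*z) and return
    the coefficients keyed by the exponent tuple (a1, a2, a3) of x^a1 y^a2 z^a3."""
    # Each factor: the constant term 1 plus one linear term per column.
    terms1 = [((0, 0, 0), 1.0)] + [(tuple(int(i == j) for j in range(3)), row1[i]) for i in range(3)]
    terms2 = [((0, 0, 0), 1.0)] + [(tuple(int(i == j) for j in range(3)), row2[i]) for i in range(3)]
    coeffs = {}
    for (a, ca), (b, cb) in product(terms1, terms2):
        key = tuple(ai + bi for ai, bi in zip(a, b))
        coeffs[key] = coeffs.get(key, 0.0) + ca * cb
    del coeffs[(0, 0, 0)]  # drop the trivial constant coefficient 1
    return coeffs

r1, r2 = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
e = multisym_coeffs(r1, r2)
assert len(e) == 9                                # nine e_alpha's, as listed above
assert e[(1, 0, 0)] == r1[0] + r2[0]              # x1 + x2
assert e[(1, 1, 0)] == r1[0]*r2[1] + r2[0]*r1[1]  # x1*y2 + x2*y1
assert multisym_coeffs(r2, r1) == e               # invariant under row exchange
```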

I am hoping that $N=6$ polynomials suffice (i.e. the number of variables). Here is my reasoning:

In the $2 \times 2$ case, with

$$X=\begin{bmatrix} x_1 & y_1 \\ x_2 & y_2 \end{bmatrix}, $$ the elementary multisymmetric functions are

$$ \begin{align} e_{(1,0)}&=x_1+x_2 ,\\ e_{(0,1)}&= y_1+y_2 ,\\ e_{(2,0)}&=x_1 x_2, \\ e_{(1,1)}&= x_1 y_2 +x_2 y_1, \\ e_{(0,2)}&= y_1 y_2, \end{align} $$ with $N=5$. However, expanding the complex polynomial $$(Z-(x_1+y_1 i))(Z-(x_2+y_2 i)), $$ gives $N=4$ polynomials instead: $$\begin{align} &x_1+x_2 \\ &y_1+y_2 \\ &x_1 x_2-y_1 y_2 \\ &x_1 y_2+x_2 y_1 \end{align} $$ corresponding to the real and imaginary parts of the complex elementary symmetric polynomials $$e_1(x_1+y_1 i,x_2+y_2 i), \\ e_2(x_1+y_1i,x_2+y_2i).$$
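The $2 \times 2$ claim can be verified numerically: the four real values are the real and imaginary parts of $e_1 = z_1 + z_2$ and $e_2 = z_1 z_2$ for $z_k = x_k + y_k i$, so the rows are recovered (up to order) as the roots of $Z^2 - e_1 Z + e_2$. A sketch via the quadratic formula (the function name `recover_rows` is my own; rounding absorbs floating-point error):

```python
import cmath

def recover_rows(s_x, s_y, p_re, p_im):
    """Recover {(x1, y1), (x2, y2)} from the four values
    x1+x2, y1+y2, x1*x2 - y1*y2, x1*y2 + x2*y1."""
    e1 = complex(s_x, s_y)    # z1 + z2
    e2 = complex(p_re, p_im)  # z1 * z2
    disc = cmath.sqrt(e1 * e1 - 4 * e2)
    z1, z2 = (e1 + disc) / 2, (e1 - disc) / 2  # roots of Z^2 - e1*Z + e2
    return {(round(z.real, 9), round(z.imag, 9)) for z in (z1, z2)}

x1, y1, x2, y2 = 1.0, 2.0, -3.0, 0.5
rows = recover_rows(x1 + x2, y1 + y2, x1*x2 - y1*y2, x1*y2 + x2*y1)
assert rows == {(1.0, 2.0), (-3.0, 0.5)}
```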

I know that the $n=2$ case is special, in the sense that there is no 3-dimensional analogue of the field $\mathbf{C}$ over $\mathbf{R}$. Nevertheless, I could not rule out approaches more efficient than the elementary multisymmetric functions.

Remark: This question is a subproblem of my question on MathOverflow, dealing with the $m \times n$ case.

Thank you for any help.

Answer:
The best known result for the $2 \times 3$ case is that $N_\text{min} = 8$, which can be achieved by using the Plücker coordinates of the matrix $X$.

It is also known that $N_\text{min} \leq 9$, with the elementary multisymmetric polynomials being a possible choice of polynomials.
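To see concretely that the nine multisymmetric values pin down $X$ up to row order, here is a reconstruction sketch (my own helper names; it assumes the generic case $x_1 \neq x_2$, so the first column's values give two distinct roots and the remaining columns follow by solving linear systems):

```python
import math

def reconstruct(e):
    """Recover the rows of X (up to exchange) from the elementary multisymmetric
    values, keyed by exponent tuples as in the question. Assumes x1 != x2."""
    s, p = e[(1, 0, 0)], e[(2, 0, 0)]
    disc = math.sqrt(s * s - 4 * p)
    x1, x2 = (s + disc) / 2, (s - disc) / 2  # roots of t^2 - s*t + p
    # y1 + y2 = e_(0,1,0) and x1*y2 + x2*y1 = e_(1,1,0): a solvable 2x2 linear system.
    y1 = (e[(1, 1, 0)] - x1 * e[(0, 1, 0)]) / (x2 - x1)
    y2 = e[(0, 1, 0)] - y1
    # Same linear step for the z column.
    z1 = (e[(1, 0, 1)] - x1 * e[(0, 0, 1)]) / (x2 - x1)
    z2 = e[(0, 0, 1)] - z1
    return {(x1, y1, z1), (x2, y2, z2)}

r1, r2 = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0)
e = {
    (1, 0, 0): r1[0] + r2[0], (0, 1, 0): r1[1] + r2[1], (0, 0, 1): r1[2] + r2[2],
    (2, 0, 0): r1[0] * r2[0],
    (1, 1, 0): r1[0]*r2[1] + r2[0]*r1[1],
    (1, 0, 1): r1[0]*r2[2] + r2[0]*r1[2],
}
assert reconstruct(e) == {r1, r2}
```

Note that this generic reconstruction uses only six of the nine values; by itself that does not settle $N_\text{min}$, since it only works off the locus $x_1 = x_2$.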

As for your approach using complex numbers: it is indeed possible to do better than the elementary multisymmetric polynomials in the $2 \times 2$ case. However, the same trick does not help in the $2 \times 3$ case, as the Plücker coordinates already give a set of 8 polynomials that are algebraically independent over $\mathbf{C}$. (In fact, it is known that the Plücker coordinates are algebraically independent over $\mathbf{R}$ as well.)

Thus, the complex-number approach cannot produce a determining set with fewer than 8 polynomials.