Uniqueness of second- and fourth-order moment tensors of vectors


Let $\{v_1, v_2, v_3\}$ and $\{w_1,w_2,w_3\}$ be two sets of vectors in $\mathbb{R}^2$; we can assume the $v$s are pairwise linearly independent, and likewise for the $w$s.

I'd like to show that if $$\sum_i v_i^{\otimes 4} = \sum_i w_i^{\otimes 4}$$ and $$\sum_i v_i^{\otimes 2} = \sum_i w_i^{\otimes 2},$$ then $v_i = \pm w_{\sigma(i)}$ for some permutation $\sigma$, i.e., the two sets contain the same vectors, up to possibly flipping some signs.

Numerically, this statement appears to be true (and I have concrete counterexamples showing that both conditions are necessary; equality of the rank-four tensors alone is not enough). How can I prove it?
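For reference, here is the kind of numerical sanity check I mean (a sketch, not a proof): build the $w$'s as a signed permutation of the $v$'s and confirm that both moment tensors agree.

```python
import numpy as np

# Build w as a signed permutation of v and check that both the
# second- and fourth-order moment tensors coincide.
rng = np.random.default_rng(0)
v = rng.standard_normal((3, 2))        # three vectors in R^2
w = np.array([-v[2], v[0], -v[1]])     # permutation with some signs flipped

def moments(u):
    """Return (sum_i u_i^{(x)2}, sum_i u_i^{(x)4}) for the rows u_i of u."""
    m2 = np.einsum('ij,ik->jk', u, u)
    m4 = np.einsum('ij,ik,il,im->jklm', u, u, u, u)
    return m2, m4

v2, v4 = moments(v)
w2, w4 = moments(w)
print(np.allclose(v2, w2), np.allclose(v4, w4))  # True True
```

Perturbing any entry of `w` breaks both equalities, which is what the uniqueness claim predicts.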

EDIT: Per the comments, some clarification on the notation: let $e_k$ denote the $k$th Euclidean basis vector in $\mathbb{R}^2$, i.e. $e_1 = \begin{bmatrix}1 & 0\end{bmatrix}^T.$ Then the first condition is equivalent to:

$$\sum_i (v_i \cdot e_j)(v_i \cdot e_k)(v_i \cdot e_\ell)(v_i \cdot e_m) = \sum_i (w_i \cdot e_j)(w_i \cdot e_k)(w_i \cdot e_\ell)(w_i \cdot e_m)\quad \forall j,k,\ell,m \in \{1,2\}.$$

The second condition is equivalent to:

$$\sum_i v_i v_i^T = \sum_i w_i w_i^T$$

or, put another way,

$$\sum_i (v_i \cdot e_j)(v_i \cdot e_k) = \sum_i (w_i \cdot e_j)(w_i \cdot e_k)\quad \forall j,k\in\{1,2\}.$$


ADDITIONAL EDIT: It might be useful to look at the analogous problem in one dimension lower: for two pairs of numbers $\{v_1,v_2\}$ and $\{w_1,w_2\}$ in $\mathbb{R}$, it is true that $$v_1^4 + v_2^4 = w_1^4 + w_2^4,\qquad v_1^2 + v_2^2 = w_1^2 + w_2^2$$ together imply that $$\{v_1,v_2\} = \{\pm w_1, \pm w_2\}.$$

The proof is easy but not especially enlightening: eliminate one of the variables and solve the resulting quartic for the other. But if an elegant proof can be found in this 1D setting, perhaps it could be adapted to the original problem of two triples of vectors in $\mathbb{R}^2$?
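One slightly more structural route in 1D, sketched with sympy: setting $a = v_1^2$, $b = v_2^2$, the hypotheses fix $a + b = s_2$ and (by Newton's identity) $ab = (s_2^2 - s_4)/2$, so the pair $\{a, b\}$ is the root set of a quadratic determined by the two power sums alone.

```python
import sympy as sp

s2, s4, t = sp.symbols('s2 s4 t')

# {a, b} = {v1^2, v2^2} are the roots of
#   t^2 - (a + b) t + ab = t^2 - s2*t + (s2^2 - s4)/2
quad = t**2 - s2*t + (s2**2 - s4)/2
a, b = sp.solve(quad, t)

# Sanity check on a concrete instance: v = (1, 2) gives s2 = 5, s4 = 17,
# and the recovered squares are {1, 4}, i.e. v = {±1, ±2}.
vals = {s2: 5, s4: 17}
print(sorted([a.subs(vals), b.subs(vals)]))  # [1, 4]
```

Since the squares are pinned down, each $v_i$ is determined up to sign, which is exactly the 1D statement.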


This is a plausibility argument, not a proof.

Both your tensors are symmetric, i.e., permuting the indices does not change the value.

The symmetric order-2 tensor on $\mathbb{R}^2$ has 3 independent components:

  1. j=k=1
  2. j=k=2
  3. j=1, k=2

The symmetric order-4 tensor on $\mathbb{R}^2$ has 5 independent components:

  1. j=k=l=m=1
  2. j=k=l=m=2
  3. j=k=l=1, m=2
  4. j=k=1, l=m=2
  5. j=1, k=l=m=2

Permutations of the indices are implied.
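The tallies above can be double-checked by a small enumeration: the independent components correspond to multisets of indices drawn from $\{1, 2\}$, so an order-$k$ symmetric tensor on $\mathbb{R}^2$ has $k + 1$ of them.

```python
from itertools import combinations_with_replacement

# Independent components of a symmetric order-k tensor on R^2 are the
# multisets of size k over the index set {1, 2}.
for order in (2, 4):
    comps = list(combinations_with_replacement((1, 2), order))
    print(order, len(comps), comps)
```

This reproduces the counts 3 and 5 used in the argument.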

Solving for the $w$'s (6 variables) in terms of the order-2 and order-4 tensor components gives 8 equations, so the system is overdetermined. Overdetermined systems can still have multiple solutions, and here any permutation of the $v$'s and any sign flips preserve both tensors, so all of these must appear among the solutions for the $w$'s. In particular, any one component of $w$ can take 6 distinct values.
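The "6 values per component" count can be confirmed by brute force: enumerating all signed permutations of the $v$'s (each of which preserves both tensors), the first component of $w_1$ generically ranges over exactly the 6 values $\pm(v_i \cdot e_1)$.

```python
import itertools
import numpy as np

# Enumerate all 3! * 2^3 = 48 signed permutations of v and collect the
# distinct values taken by the first component of w_1.
rng = np.random.default_rng(1)
v = rng.standard_normal((3, 2))

values = set()
for perm in itertools.permutations(range(3)):
    for signs in itertools.product((1, -1), repeat=3):
        w = np.array([s * v[p] for s, p in zip(signs, perm)])
        values.add(round(w[0, 0], 12))

print(len(values))  # 6
```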

For a full proof, perhaps one could use elimination to write a polynomial equation for a single component, then divide out the known roots and examine the remainder (or show that the polynomial has degree 6). I couldn't carry this out.

But in any case, having 8 surfaces in 6 variables intersect in an additional set of points seems unlikely.