A tensor over $\mathbb R$ that is not a sum of two decomposable tensors


Let $W$ be a vector space over $\mathbb R$ with basis $\{a,b\}$. Consider the tensor $$a \otimes a \otimes a - b \otimes b \otimes a + a \otimes b \otimes b + b \otimes a \otimes b.$$

a) Show that this tensor cannot be presented as a sum of two decomposable tensors.

b) Show that such a presentation is possible in the complexification of $W$.

My only attempt was to write the equation $$(\alpha a + \beta b)\otimes (\gamma a + \delta b) \otimes (\epsilon a + \zeta b) + (\alpha' a + \beta' b)\otimes (\gamma' a + \delta' b) \otimes (\epsilon' a + \zeta' b) =$$ $$= a \otimes a \otimes a - b \otimes b \otimes a + a \otimes b \otimes b + b \otimes a \otimes b,$$ and from the equality of coefficients deduce a system of 8 equations in 12 variables. I think I could brute-force it somehow, but I was wondering whether there are other approaches.
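Before attacking the 8-equation system, it can help to see the tensor concretely. The following is a minimal numpy sketch (the coordinates are an assumption of the sketch: $a = e_0$, $b = e_1$) that builds the $2\times 2\times 2$ array and reads off the two "slices" obtained by grouping on the first factor — the matrices the answers below work with:

```python
import numpy as np

# Coordinate representation of the basis (an assumption of this sketch;
# the problem itself is basis-independent).
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

def tri(x, y, z):
    """Decomposable tensor x (x) y (x) z as a 2x2x2 array."""
    return np.einsum('i,j,k->ijk', x, y, z)

# The tensor from the problem statement.
T = tri(a, a, a) - tri(b, b, a) + tri(a, b, b) + tri(b, a, b)

# Grouping on the first factor, T = a (x) S_a + b (x) S_b, where the
# slices S_a, S_b are 2x2 matrices.
S_a, S_b = T[0], T[1]
print(S_a)  # a(x)a + b(x)b, i.e. the identity matrix
print(S_b)  # -b(x)a + a(x)b, i.e. [[0, 1], [-1, 0]]
```

The identity matrix and the $90^\circ$ rotation matrix appearing as slices are exactly what makes the real case rigid and the complex case flexible.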

Best answer

I'll start by stating some standard facts about tensor products.

1. Preliminaries

Theorem 1 (Zero-sum)
Let $U$,$V$ be two vector spaces on $F$, if for $n$ linearly independent vectors $x_1,x_2,...,x_n \in U$, there are $n$ vectors $v_1,...,v_n \in V$ such that: $$x_1 \otimes v_1 +x_2 \otimes v_2+...+x_n \otimes v_n= 0 $$ then $v_1=v_2=...=v_n=0$ $\square$

Theorem 2 (Span)
Let $U$, $V$ be vector spaces over $F$, and let $W$ be a subspace of $V$.
Let $t$ be a tensor in $U \otimes W$. If for $n$ linearly independent vectors $x_1, x_2, \dots, x_n \in U$ there are vectors $v_1, \dots, v_n \in V$ such that $$t = x_1 \otimes v_1 + x_2 \otimes v_2 + \dots + x_n \otimes v_n,$$ then $\operatorname{span}(v_1, \dots, v_n) \subset W$. $\square$
Remark 3: We will mainly use Theorem 2, whose proof is based on Theorem 1. Moreover, we only need the special case $\dim W = 2$.

2. Main answer

Assume there are vectors $u_1, u_2, u_3, v_1, v_2, v_3 \in W$ such that $$ a \otimes a \otimes a - b \otimes b \otimes a + a \otimes b \otimes b + b \otimes a \otimes b = u_1 \otimes u_2 \otimes u_3 + v_1 \otimes v_2 \otimes v_3 =: t.$$ On one hand, $$ t \in W \otimes \underbrace{\operatorname{span}(u_2 \otimes u_3,\ v_2 \otimes v_3)}_{=:U}.$$ On the other hand, $$ t = a \otimes (a \otimes a + b \otimes b) + b \otimes (-b \otimes a + a \otimes b). $$ By Theorem 2, both $a \otimes a + b \otimes b$ and $-b \otimes a + a \otimes b$ lie in $U$.
By Theorem 1, neither of these two tensors is zero, and they are clearly linearly independent, whereas $\dim(U) \le 2$.
Therefore $\dim(U) = 2$ and $U = \operatorname{span}(a \otimes a + b \otimes b,\ -b \otimes a + a \otimes b)$.

In particular, there are scalars $p, q \in F$ (where $F$ is the field over which $W$ is defined, here either $\mathbb{R}$ or $\mathbb{C}$) such that $$p(a \otimes a + b \otimes b) + q(-b \otimes a + a \otimes b) = u_2 \otimes u_3.$$

Equivalently, $$ a \otimes (pa + qb) + b \otimes (pb - qa) = u_2 \otimes u_3.$$ Again by Theorem 2 (or by any direct argument), since $a, b$ are linearly independent and $\dim(\operatorname{span}(u_3)) \le 1$, the vectors $pa + qb$ and $pb - qa$ must be linearly dependent.
As $a, b$ are independent, these two vectors are linearly dependent iff $p^2 + q^2 = 0$ (the determinant of their coordinate matrix). Over $\mathbb{R}$ this forces $p = q = 0$, so $u_2 \otimes u_3 = 0$; by the same argument $v_2 \otimes v_3 = 0$, contradicting $\dim(U) = 2$.

Hence the conclusion for $\mathbb{R}$.

Over $\mathbb{C}$, by contrast, $p^2 + q^2 = 0$ has non-zero solutions (take $q = ip$), and a short direct calculation turns this into an explicit presentation as a sum of two decomposable tensors.
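The "short direct calculation" can be checked symbolically: with $q = ip$ (so $p^2 + q^2 = 0$) and $p = 1$, the corresponding combination of the two tensors spanning $U$ becomes a rank-one matrix, i.e. a decomposable element of $W \otimes W$. A minimal sympy sketch:

```python
import sympy as sp

I2 = sp.eye(2)                    # matrix of a(x)a + b(x)b
J = sp.Matrix([[0, 1], [-1, 0]])  # matrix of -b(x)a + a(x)b

# Over C we may take q = i*p (so p^2 + q^2 = 0); with p = 1 the combination
M = I2 + sp.I * J
print(M)         # Matrix([[1, I], [-I, 1]])
print(M.rank())  # 1: the combination is decomposable over C
```

With $q = -ip$ one gets the conjugate rank-one matrix, and together the two give the rank-2 decomposition of $t$ over $\mathbb{C}$.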

Remark: Many of the arguments above can be shortened or omitted if one is familiar with tensor products; I have just tried to present the point as clearly as possible.

Second answer

Here is a solution that reduces the problem to linear algebra. (There are probably other solutions.)

Reduction to linear algebra

Let $\{\varphi,\chi\}$ be a basis of $W^*$ with $\varphi(a) = \chi(b) = 1$ and $\varphi(b) = \chi(a) = 0$. Suppose that $$ a \mathbin{\otimes} a \mathbin{\otimes} a - b \mathbin{\otimes} b \mathbin{\otimes} a + a \mathbin{\otimes} b \mathbin{\otimes} b + b \mathbin{\otimes} a \mathbin{\otimes} b = u \mathbin{\otimes} v \mathbin{\otimes} w + x \mathbin{\otimes} y \mathbin{\otimes} z. $$ Applying $\varphi$ and $\chi$ in the first coordinate, we find \begin{align*} a \mathbin{\otimes} a + b \mathbin{\otimes} b &= \varphi(u) v \mathbin{\otimes} w + \varphi(x) y \mathbin{\otimes} z;\\[1ex] -b \mathbin{\otimes} a + a \mathbin{\otimes} b &= \chi(u) v \mathbin{\otimes} w + \chi(x) y \mathbin{\otimes} z.\tag*{$(1)$} \end{align*} Therefore $C := v \mathbin{\otimes} w$ and $D := y \mathbin{\otimes} z$ are rank one matrices in $W \mathbin{\otimes} W$ with the property that both $a \mathbin{\otimes} a + b \mathbin{\otimes} b$ and $-b \mathbin{\otimes} a + a \mathbin{\otimes} b$ are linear combinations of $C$ and $D$. We will show that this is impossible over $\mathbb{R}$ and possible over $\mathbb{C}$. In the latter case, the obtained matrices $C$ and $D$ can easily be used to solve the original problem.

Solution

The preceding remarks show that the problem is equivalent to the following: find $2 \times 2$ matrices $C = (c_{ij})$ and $D = (d_{ij})$, each of rank one, and scalars $\lambda_1,\lambda_2,\mu_1,\mu_2$ such that $$ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \lambda_1 C + \lambda_2 D \qquad \text{and} \qquad \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} = \mu_1 C + \mu_2 D. \tag*{$(2)$} $$ We prove that this is impossible over $\mathbb{R}$. Since $C$ and $D$ have rank one, we must have $\lambda_1,\lambda_2,\mu_1,\mu_2 \neq 0$, and we may choose some non-zero $p \in \ker(C)$. But then it follows from $(2)$ that \begin{align*} p &= \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}p = (\lambda_1 C + \lambda_2 D)p = \lambda_2 Dp;\\[1ex] p' &:= \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}p = (\mu_1 C + \mu_2 D)p = \mu_2 Dp; \end{align*} where $p'$ is $p$ rotated by $-90^\circ$. But then $Dp$ is simultaneously parallel ($Dp = \frac{1}{\lambda_2} p \neq 0$) and perpendicular ($Dp = \frac{1}{\mu_2} p' \neq 0$) to $p$, with $Dp \neq 0$. This is impossible.

Finally, over $\mathbb{C}$ one may take $$ C = \begin{pmatrix} 1 & i \\ -i & 1 \end{pmatrix} \qquad \text{and} \qquad D = \begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix}. $$ In order to turn this into a solution of the original problem, choose $u,x \in W$ such that $\varphi(u),\varphi(x),\chi(u),\chi(x)$ are the desired scalars in equation $(1)$.
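Carrying out that last step explicitly (this particular choice of $u, v, w, x, y, z$ is ours, obtained by factoring $C = v \otimes w$, $D = y \otimes z$ and solving $(1)$ for $u, x$; other scalings work equally well) gives a decomposition one can verify numerically:

```python
import numpy as np

i = 1j
# Rank-one factorisations C = v(x)w, D = y(x)z of the matrices above, and
# u, x chosen so that phi(u) = phi(x) = 1/2, chi(u) = -i/2, chi(x) = i/2,
# solving equation (1).
u = np.array([0.5, -0.5 * i]); v = np.array([1, -i]); w = np.array([1, i])
x = np.array([0.5,  0.5 * i]); y = np.array([1,  i]); z = np.array([1, -i])

def tri(p, q, r):
    """Decomposable tensor p (x) q (x) r as a 2x2x2 array."""
    return np.einsum('i,j,k->ijk', p, q, r)

# The target tensor in coordinates (a = e0, b = e1).
a = np.array([1, 0]); b = np.array([0, 1])
T = tri(a, a, a) - tri(b, b, a) + tri(a, b, b) + tri(b, a, b)

# Two decomposable tensors suffice over C.
assert np.allclose(tri(u, v, w) + tri(x, y, z), T)
print("rank-2 decomposition over C verified")
```

Note that the two summands are complex conjugates of each other, as one expects for a real tensor whose complex rank is smaller than its real rank.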

Closing remarks

It can be shown that the approach taken here always works to determine the rank of a $3$-tensor:

Proposition ([Lan12, Thm 3.1.1.1]). Let $T \in U \mathbin{\otimes} V \mathbin{\otimes} W$. Then $\operatorname{rank}(T)$ equals the minimal number of rank one matrices whose span contains $T(U^*) \subseteq V \mathbin{\otimes} W$.

Here we interpret $T$ as a linear map $U^* \to V \mathbin{\otimes} W$ in the natural way: if $T = \sum_{i=1}^r u_i \mathbin{\otimes} v_i \mathbin{\otimes} w_i$ and $\psi \in U^*$, then $T(\psi) = \sum_{i=1}^r \psi(u_i) v_i \mathbin{\otimes} w_i$. The proposition is not hard to prove, and generalizes the "reduction to linear algebra" paragraph above to arbitrary $3$-tensors.
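For the tensor of this problem the proposition can be made concrete: $T(U^*)$ is spanned by the two slices $T(\varphi) = I$ and $T(\chi) = J$, and a general real element $\lambda I + \mu J$ has determinant $\lambda^2 + \mu^2$, so over $\mathbb{R}$ the span contains no non-zero rank-one matrices at all and three rank-one matrices are needed, while over $\mathbb{C}$ the matrices $C, D$ above suffice. A sympy check of the determinant computation:

```python
import sympy as sp

lam, mu = sp.symbols('lam mu', real=True)
I2 = sp.eye(2)                    # slice T(phi) = a(x)a + b(x)b
J = sp.Matrix([[0, 1], [-1, 0]])  # slice T(chi) = -b(x)a + a(x)b

# A general real element of T(U*) in coordinates.
M = lam * I2 + mu * J
print(sp.det(M))  # lam**2 + mu**2 -- zero only for lam = mu = 0 over R
```

This is the slice-based restatement of the real impossibility argument above: rank$(T) = 3$ over $\mathbb{R}$ and rank$(T) = 2$ over $\mathbb{C}$.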

References.

[Lan12] J. M. Landsberg, Tensors: Geometry and Applications, Graduate Studies in Mathematics 128, American Mathematical Society, 2012.