Finding a basis for symmetric $k$-tensors on $V$


We say a function is $k$-linear if it takes $k$ vectors as input and is linear in each of them separately. For example, the determinant of an $n \times n$ matrix, viewed as a function of its $n$ columns, is $n$-linear.

A $k$-tensor on a vector space $V$ is a function $T:V \times V\times V\times \dots\times V\to \mathbb R$ ($k$ vectors taken as input) such that $T$ is $k$-linear, i.e. linear in each of its $k$ arguments.

A symmetric tensor is a tensor that is invariant under every permutation of its vector arguments. Meaning:
$T(v_1,v_2,\dots,v_k)=T(v_{\sigma(1)},v_{\sigma(2)},\dots,v_{\sigma(k)})$ for each permutation $\sigma$ of the symbols $\{1,2,\dots,k\}$.

We call $Sym^k(V)$ the vector space of all symmetric $k$-tensors on vector space $V$.

If $T$ is an $m$-tensor and $S$ is an $n$-tensor, then $T \otimes S$ is the $(m+n)$-tensor defined, for each $(v_1,\dots,v_m,v_{m+1},\dots,v_{m+n})$, by $(T \otimes S)(v_1,\dots,v_m,v_{m+1},\dots,v_{m+n})=T(v_1,\dots,v_m)\,S(v_{m+1},\dots,v_{m+n})$.
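The definitions above can be sketched directly in code, modeling a $k$-tensor as a plain function of $k$ vectors. This is a minimal illustration, not part of the question; the helper `tensor_product` and the covectors `f`, `g` are made up for the example.

```python
# A k-tensor is modeled as a function taking k vectors (tuples of floats).
def tensor_product(T, m, S, n):
    """The (m+n)-tensor (T⊗S)(v_1..v_m, v_{m+1}..v_{m+n}) = T(v_1..v_m) * S(v_{m+1}..v_{m+n})."""
    def TS(*vs):
        assert len(vs) == m + n
        return T(*vs[:m]) * S(*vs[m:])
    return TS

# Example: two 1-tensors (covectors) on R^2.
f = lambda v: v[0] + 2 * v[1]      # f = e^1 + 2 e^2
g = lambda v: 3 * v[0]             # g = 3 e^1

fg = tensor_product(f, 1, g, 1)    # a 2-tensor on R^2
print(fg((1, 1), (1, 0)))          # f((1,1)) * g((1,0)) = 3 * 3 = 9
```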

Now we want to find a basis for this vector space.

I know that the basis should consist of sums of tensor products of elements of the basis of the dual space $V^*$ of $V$, but I can't see how. The complete question is written below:

Let $V$ be an $n$-dimensional vector space. Compute the dimension of $Sym^k(V)$. (It can also be found here, page 33.)

I know that every $k$-tensor can be written as a linear combination of the tensors $e^{i_1}\otimes e^{i_2}\otimes\dots\otimes e^{i_k}$ with $1 \le i_1,\dots,i_k\le n$, but I don't know which combinations of them form a basis for just the symmetric $k$-tensors (not all $k$-tensors).

Note: for example, I know that if $k=3$, one member of the basis of $Sym^3(V)$ is $e^1\otimes e^2 \otimes e^1+e^2\otimes e^1 \otimes e^1+e^1\otimes e^1 \otimes e^2$, but I don't know why! I want an explanation. I have the answer: the dimension of $Sym^k(V)$ is ${n+k-1 \choose k}$. My problem is that I don't see why these sums arise, nor why the coefficients of the terms within each sum must be equal.
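The claimed dimension can at least be sanity-checked by brute force: basis elements of $Sym^k(V)$ turn out to be indexed by non-decreasing tuples $1 \le i_1 \le \dots \le i_k \le n$, and counting those should give ${n+k-1 \choose k}$. A short sketch (the function name `sym_dim` is made up for this check):

```python
from itertools import combinations_with_replacement
from math import comb

def sym_dim(n, k):
    # Count non-decreasing tuples 1 <= i_1 <= ... <= i_k <= n.
    return sum(1 for _ in combinations_with_replacement(range(1, n + 1), k))

# Compare against the closed form C(n+k-1, k) for a few cases.
for n, k in [(2, 3), (3, 2), (4, 5)]:
    assert sym_dim(n, k) == comb(n + k - 1, k)

print(sym_dim(2, 3))  # 4
```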


There are 2 solutions below.

BEST ANSWER

I assume that $e_1,\dots,e_n$ is a basis of $V$ and that $e^1,\dots,e^n$ is the associated dual basis of $V^*$.

First, let's consider the case of arbitrary (not necessarily symmetric) tensors. We note that, by multilinearity, $$ T(v^{(1)}, \dots, v^{(k)}) = T\left( \sum_{i=1}^n v^{(1)}_i e_i, \dots, \sum_{i=1}^n v^{(k)}_i e_i \right) = T\left( \sum_{i_1=1}^n v^{(1)}_{i_1} e_{i_1}, \dots, \sum_{i_k=1}^n v^{(k)}_{i_k} e_{i_k} \right) = \\ \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n v^{(1)}_{i_1} \cdots v^{(k)}_{i_k} T\left(e_{i_1}, \dots, e_{i_k} \right) $$ Now, define the tensor $\tilde T$ by $$ \tilde T = \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n T\left(e_{i_1}, \dots, e_{i_k} \right) e^{i_1} \otimes \cdots \otimes e^{i_k} $$ Prove that $\tilde T(v^{(1)},\dots,v^{(k)}) = T(v^{(1)},\dots,v^{(k)})$ for any $v^{(1)},\dots,v^{(k)}$. That is, $\tilde T = T$. We've thus shown that any (not necessarily symmetric) $k$-tensor can be written as a linear combination of the tensors $e^{i_1} \otimes \cdots \otimes e^{i_k}$.
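The identity $\tilde T = T$ can be checked numerically for a small case. This sketch takes $k=2$, $n=3$, with $T(u,v) = u^\top A v$ for an arbitrary matrix $A$ (my own choice of test tensor), and compares $T$ against its expansion $\sum_{i,j} T(e_i,e_j)\, e^i \otimes e^j$:

```python
import random

n = 3
# An arbitrary 2-tensor on R^3: T(u, v) = u^T A v.
A = [[random.random() for _ in range(n)] for _ in range(n)]

def T(u, v):
    return sum(u[i] * A[i][j] * v[j] for i in range(n) for j in range(n))

# Standard basis e_1, ..., e_n of R^n.
e = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def T_tilde(u, v):
    # Coefficients T(e_i, e_j) times (e^i ⊗ e^j)(u, v) = u_i * v_j.
    return sum(T(e[i], e[j]) * u[i] * v[j] for i in range(n) for j in range(n))

u = [random.random() for _ in range(n)]
v = [random.random() for _ in range(n)]
assert abs(T(u, v) - T_tilde(u, v)) < 1e-9  # T and its expansion agree
```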

The same applies to symmetric tensors. However, if $T$ is symmetric, then $$ T\left(e_{i_1}, \dots, e_{i_k} \right) = T\left(e_{i_{\sigma(1)}}, \dots, e_{i_{\sigma(k)}} \right) $$ for any permutation $\sigma \in S_k$. Thus, we may regroup the above sum as $$ T = \tilde T = \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n T\left(e_{i_1}, \dots, e_{i_k} \right) e^{i_1} \otimes \cdots \otimes e^{i_k} = \\ \sum_{1 \leq i_1 \leq \cdots \leq i_k \leq n} \; \frac 1{\alpha(i_1,\dots,i_k)}\sum_{\sigma \in S_k} T\left(e_{i_{\sigma(1)}}, \dots, e_{i_{\sigma(k)}} \right) e^{i_{\sigma(1)}} \otimes \cdots \otimes e^{i_{\sigma(k)}} = \\ \sum_{1 \leq i_1 \leq \cdots \leq i_k \leq n} \; \frac 1{\alpha(i_1,\dots,i_k)}\sum_{\sigma \in S_k} T\left(e_{i_1}, \dots, e_{i_k} \right) e^{i_{\sigma(1)}} \otimes \cdots \otimes e^{i_{\sigma(k)}} = \\ \sum_{1 \leq i_1 \leq \cdots \leq i_k \leq n} \frac 1{\alpha(i_1,\dots,i_k)} T\left(e_{i_1}, \dots, e_{i_k} \right) \underbrace{\sum_{\sigma \in S_k} e^{i_{\sigma(1)}} \otimes \cdots \otimes e^{i_{\sigma(k)}}}_{\text{basis element of } Sym^k(V)} $$ The factor $\frac 1{\alpha(i_1,\dots,i_k)}$ compensates for the fact that, when $(i_1,\dots,i_k)$ has repeated entries, distinct permutations $\sigma$ produce the same term. Thus, we have expressed $T$ as a linear combination of the desired basis elements.


Here ${\alpha(i_1,\dots,i_k)}$ counts the number of times each tuple $(i_{\sigma(1)},\dots,i_{\sigma(k)})$ appears in the summation over $\sigma \in S_k$. Explicitly, $$ \alpha(i_1,\dots,i_k) = m_1! \cdots m_n! $$ where $m_j$ is the multiplicity of $j \in \{1,\dots,n\}$ in the tuple $(i_1,\dots,i_k)$.
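The two descriptions of $\alpha$ (counting repeats over $S_k$ versus the product of factorials of multiplicities) can be checked against each other by brute force. A small sketch, with function names made up for the comparison:

```python
from itertools import permutations
from math import factorial
from collections import Counter

def alpha_bruteforce(idx):
    # Number of permutations sigma in S_k with (i_{sigma(1)},...,i_{sigma(k)})
    # equal to the original tuple; each distinct rearrangement appears this
    # many times in the sum over S_k.
    target = tuple(idx)
    return sum(1 for p in permutations(range(len(idx)))
               if tuple(idx[j] for j in p) == target)

def alpha_formula(idx):
    # Product of m_j! over the multiplicities m_j of the entries of idx.
    prod = 1
    for m in Counter(idx).values():
        prod *= factorial(m)
    return prod

for idx in [(1, 1, 2), (1, 2, 3), (2, 2, 2), (1, 1, 2, 2)]:
    assert alpha_bruteforce(idx) == alpha_formula(idx)

print(alpha_bruteforce((1, 1, 2)))  # 2  (swapping the two 1's fixes the tuple)
```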


Let's consider $\mathbb{R}^2$ with standard basis $e_1,e_2$.

If $T$ is a symmetric tensor $T:\mathbb{R}^2\times\mathbb{R}^2\times\mathbb{R}^2\to\mathbb{R}$, then we can group the basis tensors $e_{i_1}\otimes e_{i_2}\otimes e_{i_3}$ of the tensor power $(\mathbb{R}^2)^{\otimes 3}$ according to whether or not symmetry forces $T$ to send them to the same value:

$$ \begin{array}{rrrr} e_1\otimes e_1\otimes e_1 \\ \hline e_1\otimes e_1\otimes e_2 & e_1\otimes e_2\otimes e_1 & e_2\otimes e_1\otimes e_1 \\ \hline e_1\otimes e_2\otimes e_2 & e_2\otimes e_1\otimes e_2 & e_2\otimes e_2\otimes e_1 \\ \hline e_2\otimes e_2\otimes e_2 \end{array} $$

In other words, the value of $T$ on any tensor is determined as long as we know the values $T$ takes on

  1. $e_1\otimes e_1\otimes e_1$,
  2. $e_1\otimes e_1\otimes e_2$,
  3. $e_1\otimes e_2\otimes e_2$,
  4. $e_2\otimes e_2\otimes e_2$.

These are precisely the basis elements $e_{i_1}\otimes e_{i_2}\otimes e_{i_3}$ with $i_1\le i_2\le i_3$.

Conversely, given any four values $a,b,c,d$ we can arrange for $T$ to take these values on the above basis vectors by writing out

$$ \begin{array}{lll} T & = & a(e^1\otimes e^1\otimes e^1) \\ &+ & b(e^1\otimes e^1\otimes e^2+e^1\otimes e^2\otimes e^1+e^2\otimes e^1\otimes e^1) \\ & + & c(e^1\otimes e^2\otimes e^2+e^2\otimes e^1\otimes e^2+e^2\otimes e^2\otimes e^1) \\ & + & d(e^2\otimes e^2\otimes e^2). \end{array} $$
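This $T$ can be realized concretely as a $2\times2\times2$ coefficient array $C_{ijk} = T(e_i, e_j, e_k)$ and checked for symmetry under all permutations of its arguments. A sketch with arbitrary sample values for $a,b,c,d$ (indices are 0-based, so index $1$ stands for $e_2$):

```python
from itertools import permutations, product

a, b, c, d = 1.0, 2.0, 3.0, 4.0
# Since T is constant on each row of the table above, the coefficient depends
# only on how many of the three indices equal e_2, i.e. on sum(idx).
coeff = {0: a, 1: b, 2: c, 3: d}

C = {}
for idx in product(range(2), repeat=3):
    C[idx] = coeff[sum(idx)]

# Symmetry check: permuting the arguments never changes the value.
for idx in product(range(2), repeat=3):
    for p in permutations(range(3)):
        assert C[tuple(idx[j] for j in p)] == C[idx]

print(C[(0, 0, 1)], C[(0, 1, 0)], C[(1, 0, 0)])  # all equal b = 2.0
```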

Generalize.