I'm trying to prove that the norm of the multilinear symmetric operator $A$ is $\frac{1}{m!}$ where $A$ is defined as:
$$ A(x_1,\dots, x_m) = \frac{1}{m!} \sum_{\sigma \in S_m} \xi_1(x_{\sigma(1)} ) \dots \xi_m(x_{\sigma(m)} )$$
Here $\xi_i$ are the coordinate functionals ($\xi_i(e_j)=\delta_{ij}$) and $x_i \in l^1$. We consider the following norm on $(l^1)^m$: $\| (x_1, \dots, x_m) \| = \max_j \|x_j\|$.
Taking $x_i = e_i$ for all $i$, only the identity permutation contributes to the sum, so $A(e_1, \dots, e_m) = \frac{1}{m!}$ and hence $\| A \| \ge \frac{1}{m!}$. I haven't been able to prove the reverse inequality so far.
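As a quick numerical sanity check of this lower bound (an illustration, not a proof), one can evaluate $A$ on the basis vectors, truncating $l^1$ to finitely many coordinates; the function `A` below is simply a direct translation of the definition:

```python
from itertools import permutations
from math import factorial

def A(xs):
    """A(x_1,...,x_m) from the definition; xi_i is the i-th coordinate
    functional, so xi_i(x) = x[i] in 0-based indexing."""
    m = len(xs)
    total = 0.0
    for sigma in permutations(range(m)):
        prod = 1.0
        for i in range(m):
            prod *= xs[sigma[i]][i]  # xi_{i+1}(x_{sigma(i+1)+1})
        total += prod
    return total / factorial(m)

m = 4
basis = [[1.0 if j == i else 0.0 for j in range(m)] for i in range(m)]
print(A(basis))  # only sigma = id contributes, giving 1/4! ≈ 0.0416667
```

Since each $\|e_i\| = 1$, this exhibits a point of the unit ball where $|A| = \frac{1}{m!}$.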
How should I proceed?
Aside: It's not a linear operator (for $m > 1$), but a multilinear (and symmetric) operator.
By homogeneity, we may assume that $\lVert x_i\rVert \leqslant 1$ for all $i$; it then suffices to bound the full sum in absolute value by $1$. Split the sum,
\begin{align} \sum_{\sigma \in S_m} \xi_1(x_{\sigma(1)})\cdot \dotsc\cdot \xi_m(x_{\sigma(m)}) &= \sum_{k = 1}^m \Biggl(\sum_{\substack{\sigma \in S_m \\ \sigma^{-1}(m) = k}} \xi_1(x_{\sigma(1)})\cdot \dotsc\cdot \xi_m(x_{\sigma(m)})\Biggr)\\ &= \sum_{k = 1}^m \xi_k(x_m)\cdot\Biggl(\sum_{\substack{\sigma \in S_m \\ \sigma^{-1}(m) = k}}\prod_{\substack{i = 1 \\ i \neq k}}^m \xi_i(x_{\sigma(i)})\Biggr) \end{align}
to set up an induction proof.
The base case $m = 1$ is immediate: $\lvert\xi_1(x_1)\rvert \leqslant \lVert x_1\rVert \leqslant 1$. For the induction step, fix $k$ and note that the permutations $\sigma \in S_m$ with $\sigma^{-1}(m) = k$ (i.e. $\sigma(k) = m$) restrict to bijections from $\{1,\dots,m\} \setminus \{k\}$ onto $\{1,\dots,m-1\}$, so each parenthesized sum is an instance of the $m-1$ case, formed from the distinct coordinate functionals $\xi_i$, $i \neq k$, and the vectors $x_1,\dots,x_{m-1}$. (For this to go through, the statement proved by induction should allow an arbitrary family of distinct coordinate functionals, not just $\xi_1,\dots,\xi_{m-1}$.) By the induction hypothesis each parenthesized sum is bounded in absolute value by $1$, hence
$$\Biggl\lvert \sum_{\sigma \in S_m} \xi_1(x_{\sigma(1)})\cdot \dotsc\cdot \xi_m(x_{\sigma(m)})\Biggr\rvert \leqslant \sum_{k = 1}^m \lvert\xi_k(x_m)\rvert \leqslant \lVert x_m\rVert \leqslant 1,$$
where the middle inequality holds because the $\xi_k$ are distinct coordinate functionals and $x_m \in l^1$. This yields $\lVert A\rVert \leqslant \frac{1}{m!}$.
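The bound can also be illustrated numerically (a randomized sanity check, not a proof): for random finitely supported vectors with $\lVert x_i\rVert = 1$, the permutation sum stays at most $1$ in absolute value. The helper names below are made up for this sketch:

```python
import random
from itertools import permutations

def perm_sum(xs):
    """sum over sigma in S_m of prod_i xi_i(x_{sigma(i)}), xi_i(x) = x[i]."""
    m = len(xs)
    total = 0.0
    for sigma in permutations(range(m)):
        prod = 1.0
        for i in range(m):
            prod *= xs[sigma[i]][i]
        total += prod
    return total

def random_l1_unit(dim):
    """A random vector with ||v||_1 = 1, i.e. a finitely supported
    element of the unit sphere of l^1."""
    v = [random.uniform(-1.0, 1.0) for _ in range(dim)]
    s = sum(abs(c) for c in v)
    return [c / s for c in v]

random.seed(0)
m, dim, trials = 4, 6, 2000
worst = max(abs(perm_sum([random_l1_unit(dim) for _ in range(m)]))
            for _ in range(trials))
print(worst <= 1.0)  # True: consistent with |m! * A| <= 1, i.e. ||A|| <= 1/m!
```

Together with the lower bound from the basis vectors, this is consistent with $\lVert A\rVert = \frac{1}{m!}$.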