How to deduce: $T(x_1, \dots , x_i+x_j, \dots, x_i+x_j, \dots , x_n) = 0$?


While studying alternating multilinear maps I came across this expression. I understand that it is part of showing that a map $T$ is antisymmetric (since it changes sign when two arguments are swapped) and alternating (since it vanishes when two arguments are equal); I think I understand that part.

My doubt is the following. Suppose the definition of a multilinear map you are given is:

[definition of a multilinear map — it comes from a text in Spanish, but the mathematics is clear]

There you see that, when one input is a sum of two terms, $T$ is applied separately to each addend ($u$ and $v$), while the remaining entries (those that are not sums, like $w$) stay fixed as inputs of $T$.

I would like a formal rule for the case where sums appear in more than one input of a multilinear map, as happens in:

$T(x_1, \dots , x_i+x_j, \dots, x_i+x_j, \dots , x_n) = 0$

\begin{align*} 0=T(x_1&,\ldots,x_i,\ldots,x_i,\ldots,x_n) + \\ &T(x_1,\ldots,x_i,\ldots,x_j,\ldots,x_n)+\\ &T(x_1,\ldots,x_j,\ldots,x_i,\ldots,x_n)+\\ &T(x_1,\ldots,x_j,\ldots,x_j,\ldots,x_n). \end{align*}

I suppose the idea is to apply additivity in each slot separately, one addend at a time, expanding over the other slots as well so that every combination of addends is covered.

I know I may not be expressing myself in the best way, but that is exactly why I am here: to clear up my doubts. I appreciate the help.

BEST ANSWER

You are correct that the way to get it with two inputs which are sums is to apply the linearity in each slot twice. We first have \begin{align*} T(x_1,\ldots, \color{red}{x_i+x_j},\ldots, \color{blue}{x_i+x_j},\ldots, x_n) & = T(x_1,\ldots, \color{red}{x_i},\ldots,\color{blue}{x_i+x_j},\ldots,x_n) \\ & + T(x_1,\ldots,\color{red}{x_j},\ldots, \color{blue}{x_i+x_j},\ldots, x_n).\end{align*} Here we just applied linearity in slot $i$.

Then we apply linearity in slot $j$ to the first term to get \begin{align*} T(x_1,\ldots, \color{red}{x_i},\ldots, \color{blue}{x_i+x_j},\ldots, x_n) & = T(x_1,\ldots, \color{red}{x_i},\ldots,\color{blue}{x_i},\ldots,x_n) \\ & + T(x_1,\ldots,\color{red}{x_i},\ldots, \color{blue}{x_j},\ldots, x_n)\end{align*}

and we apply linearity in slot $j$ to the second term to get

\begin{align*} T(x_1,\ldots, \color{red}{x_j},\ldots,\color{blue}{ x_i+x_j},\ldots, x_n) & = T(x_1,\ldots, \color{red}{x_j},\ldots,\color{blue}{x_i},\ldots,x_n) \\ & + T(x_1,\ldots,\color{red}{x_j},\ldots, \color{blue}{x_j},\ldots, x_n)\end{align*}
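To make the four-term expansion concrete, here is a small numerical check. The choice of $T$ is an illustrative assumption, not something from the question: take $T(u,v)=\det\begin{pmatrix}u & v\end{pmatrix}$ on $\mathbb{R}^2$, which is bilinear and alternating.

```python
import numpy as np

# Illustrative choice of T (an assumption): the 2x2 determinant of the
# matrix with columns u and v. It is bilinear and alternating.
def T(u, v):
    return u[0] * v[1] - u[1] * v[0]

xi = np.array([1.0, 2.0])
xj = np.array([3.0, 5.0])

# Left side: both slots hold the same sum x_i + x_j.
lhs = T(xi + xj, xi + xj)
# Right side: the four terms from expanding linearity in each slot.
rhs = T(xi, xi) + T(xi, xj) + T(xj, xi) + T(xj, xj)

print(lhs, rhs)  # both are 0: T(xi,xi)=T(xj,xj)=0 and T(xi,xj)=-T(xj,xi)
```

Here the expansion itself only uses bilinearity; the fact that both sides are zero is the alternating property at work.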

In addition to confirming this, it sounds like you may be looking for something which doesn't deal with the slots one at a time.

Suppose we have vectors $x_1,\ldots, x_n$ and we take $y_1,\ldots, y_n\in \text{span}\{x_1,\ldots, x_n\}$, so we can write $$y_1=\sum_j c_1^j x_j,$$ $$y_2=\sum_j c^j_2 x_j,$$ $$\vdots$$ $$y_n=\sum_j c^j_n x_j.$$ Here the $j$ in $c^j_i$ is an index, not an exponent.

Then $$T(y_1,\ldots,y_n)=\sum_{(j_i)} T(x_{j_1},\ldots, x_{j_n})\prod_{i=1}^n c_i^{j_i},$$ where the sum is taken over all tuples $(j_i)_{i=1}^n\in\{1,\ldots, n\}^n$. This could be taken to be the definition of multilinear, since it is equivalent to being linear in each slot. And in fact the way that you could prove this is by induction, applying the linearity one slot at a time.
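As a sanity check of this formula, the following sketch compares the direct evaluation $T(y_1,y_2)$ with the sum over all tuples $(j_1,j_2)\in\{1,2\}^2$ for $n=2$. The bilinear form used as $T$ and the coefficients $c_i^j$ are arbitrary illustrative choices:

```python
import itertools
import numpy as np

# Illustrative bilinear (but NOT alternating) map: T(u, v) = u^T A v.
A = np.array([[1.0, 2.0], [3.0, 4.0]])

def T(u, v):
    return u @ A @ v

# Arbitrary vectors x_1, x_2 and coefficients; C[i, j] plays the role
# of c_i^j, so y_i = sum_j c_i^j x_j (indices shifted to 0-based).
x = [np.array([1.0, 0.5]), np.array([-2.0, 3.0])]
C = np.array([[2.0, -1.0], [0.5, 4.0]])
y = [sum(C[i, j] * x[j] for j in range(2)) for i in range(2)]

direct = T(y[0], y[1])
# Sum over all n^n = 4 tuples (j_1, j_2), as in the formula.
expanded = sum(
    C[0, j1] * C[1, j2] * T(x[j1], x[j2])
    for (j1, j2) in itertools.product(range(2), repeat=2)
)
print(abs(direct - expanded) < 1e-9)  # True
```

The same loop works for any $n$ by taking `itertools.product(range(n), repeat=n)`.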

Something special happens if $T$ is alternating, because in that case every tuple $(j_i)_{i=1}^n$ with a repeated entry satisfies $$T(x_{j_1},\ldots, x_{j_n})=0.$$ As you said, an alternating multilinear function must be zero when it has a repeated entry. This is because if you swap the two equal entries, you get the same value, but you also get the opposite of the value, which can only happen if the value is $0$. In other words, $$T(x_1,\ldots,\color{red}{x_i},\ldots, \color{blue}{x_i},\ldots, x_n)=-T(x_1,\ldots,\color{blue}{x_i},\ldots, \color{red}{x_i},\ldots, x_n).$$ Here the colors represent "switching", but since we switched $x_i$ with $x_i$, the two sides are literally the same expression, and a number equal to its own negative must be $0$.

So in the alternating case, it suffices to take the sum over the tuples $(j_i)_{i=1}^n$ in which the $j_i$ are distinct. This is the same as taking the sum over all permutations $\sigma:\{1,\ldots,n\}\to \{1,\ldots,n\}$. For an alternating multilinear function $T$ and any permutation $\sigma$, $$T(x_{\sigma(1)}, \ldots,x_{\sigma(n)})=\text{sgn}(\sigma)T(x_1,\ldots, x_n),$$ where $\text{sgn}(\sigma)$ is the sign ($\pm 1$) of the permutation. In this case, we get $$T(y_1,\ldots, y_n)=T(x_1,\ldots, x_n)\sum_\sigma \text{sgn}(\sigma)\prod_{i=1}^n c^{\sigma(i)}_i,$$ and we recognize $$\sum_\sigma\text{sgn}(\sigma)\prod_{i=1}^n c^{\sigma(i)}_i = \text{det}\begin{pmatrix} c^1_1 & c^2_1 & \ldots & c^n_1 \\ c^1_2 & c^2_2 & \ldots & c^n_2 \\ \vdots & \vdots & \ddots & \vdots \\ c^1_n & c^2_n & \ldots & c^n_n\end{pmatrix},$$ so $$T(y_1,\ldots,y_n)=\sum_\sigma T(x_1,\ldots,x_n)\text{sgn}(\sigma)\prod_{i=1}^n c_i^{\sigma(i)}=\text{det}\begin{pmatrix} c^1_1 & c^2_1 & \ldots & c^n_1 \\ c^1_2 & c^2_2 & \ldots & c^n_2 \\ \vdots & \vdots & \ddots & \vdots \\ c^1_n & c^2_n & \ldots & c^n_n\end{pmatrix}T(x_1,\ldots,x_n).$$
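We can test this determinant identity numerically. Taking $T$ to be the determinant of the matrix whose rows are its arguments (the standard example of an alternating multilinear function, chosen here for illustration), the identity reduces to $\det(CX)=\det(C)\det(X)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
X = rng.standard_normal((n, n))   # rows are x_1, ..., x_n (arbitrary)
C = rng.standard_normal((n, n))   # C[i, j] plays the role of c_i^j

# Illustrative alternating multilinear T: determinant of the matrix
# whose rows are the input vectors.
def T(*vectors):
    return np.linalg.det(np.array(vectors))

Y = C @ X                         # row i of Y is y_i = sum_j c_i^j x_j

lhs = T(*Y)                       # T(y_1, ..., y_n)
rhs = np.linalg.det(C) * T(*X)    # det(C) * T(x_1, ..., x_n)
print(abs(lhs - rhs) < 1e-9)      # True
```

This is exactly why the determinant is characterized (up to scaling) as the alternating multilinear function of the rows of a matrix.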

EDIT: Adding some further elaboration about the indices $(j_i)_{i=1}^n$, and about the question in general. As above, suppose we have $y_i=\sum_j c_i^j x_j$ for $i=1,\ldots,n$. Let's take a look at the specific case $n=2$, so $y_1=c^1_1x_1+c^2_1x_2$ and $y_2=c^1_2x_1+c^2_2x_2$. Using the multilinearity, we get \begin{align*} T(y_1,y_2) & = T(c^1_1x_1+c^2_1x_2,y_2) = c^1_1T(x_1,y_2)+c^2_1T(x_2,y_2) \\ & = \color{red}{c^1_1 T(x_1,c^1_2x_1+c^2_2x_2)} + \color{blue}{c^2_1 T(x_2,c^1_2 x_1+c^2_2x_2)} \\ & = \color{red}{c^1_1c^1_2 T(x_1,x_1)+c^1_1c^2_2 T(x_1,x_2)} \\ & +\color{blue}{c^2_1c^1_2 T(x_2,x_1)+ c^2_1c^2_2T(x_2,x_2)}.\end{align*} The colors are to show which terms were split into which terms, because it's notationally tedious with all of the indices. So we get four terms, $T(x_1,x_1)$, $T(x_1,x_2)$, $T(x_2,x_1)$, and $T(x_2,x_2)$. I'll think about these as the $(1,1)$, $(1,2)$, $(2,1)$, and $(2,2)$ terms. These are the values that $(j_1,j_2)$ will range over in the $n=2$ case.

Just a side note: We chose our labeling convention so that $y_i=\sum_j c^j_i x_j$, so in the coefficient $c^j_i$, the $i$ tells us which $y$ it goes with, and the $j$ tells us which $x$ it goes with. Because of that, each term above has coefficients of the form $c_1^\cdot c_2^\cdot$ (that is, each lower index $i$ occurs on exactly one of the factors), because each $y_i$ contributes exactly one of those factors. Also, if we arrange the coefficients in the product so that their lower indices are in order, the upper indices match the indices on the $x$ terms. To get all possible terms, we must go through all possible ways to select one item from each column. Since each column has two terms, there are $2\times 2=4$ ways.

Let's write it another way: $$T(y_1,y_2) = T\begin{pmatrix} c^1_1x_1 & & c^1_2x_1 \\ + & , & +\\ c^2_1x_2 & & c^2_2x_2\end{pmatrix}.$$ Each of the four terms above is obtained by choosing one term from the left column and one term from the right. That is, $c^1_1c^1_2T(x_1,x_1)$ is what we get when we choose the top element from both columns, $c^1_1c^2_2 T(x_1,x_2)$ is the term we get when we choose the top element from the first column and the bottom element from the second, etc.

For $n=3$, we'll get $27$ terms. Say $y_i=c^1_ix_1+c^2_ix_2+c^3_ix_3$ for $i=1,2,3$. Then we'll get $$T(y_1,y_2,y_3)=T\begin{pmatrix}c^1_1x_1 & & c^1_2 x_1 & & c^1_3x_1 \\ + & & + & & + \\ c^2_1x_2 & , & c^2_2 x_2 & ,& c^2_3x_2 \\+ & & + & & + \\ c^3_1x_3 & & c^3_2 x_3 & & c^3_3x_3\end{pmatrix}.$$ There are $3\times 3 \times 3 = 27$ ways to choose one term from each column. These $27$ triples $(1,1,1)$, $(1,1,2)$, etc., are the $(j_1,j_2,j_3)$ that the sum should range over.
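The $27$ triples can be enumerated mechanically, and the resulting expansion checked against a direct evaluation. The trilinear map and the coefficients below are illustrative assumptions:

```python
import itertools
import numpy as np

# Illustrative trilinear map: T(u, v, w) = u_1 * v_2 * w_3
# (linear in each argument separately).
def T(u, v, w):
    return u[0] * v[1] * w[2]

# Arbitrary x_1, x_2, x_3 and coefficients C[i, j] = c_i^j (0-based).
x = [np.array([1.0, 2.0, 0.5]),
     np.array([-1.0, 0.0, 3.0]),
     np.array([2.0, 1.0, -2.0])]
C = np.array([[1.0, 2.0, -1.0],
              [0.5, 0.0, 3.0],
              [-2.0, 1.0, 1.0]])
y = [sum(C[i, j] * x[j] for j in range(3)) for i in range(3)]

# All 3^3 = 27 triples (j_1, j_2, j_3): one choice per column.
triples = list(itertools.product(range(3), repeat=3))
print(len(triples))  # 27

expanded = sum(C[0, j] * C[1, k] * C[2, l] * T(x[j], x[k], x[l])
               for (j, k, l) in triples)
print(abs(T(*y) - expanded) < 1e-9)  # True
```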

Each term in the expression will have the form $c_1^jc_2^kc_3^l T(x_j,x_k,x_l)$ (so again, each of the lower indices $1,2,3$ occurs exactly once among the factors, and when we arrange the lower indices in order, the upper indices match the indices on the $x$'s).

In general, we will have $n$ columns, each with $n$ terms. There are $n^n$ ways to choose one term from each column, and those are all of the expressions we get. We will have expressions $c_1^{j_1}c_2^{j_2}\ldots c_n^{j_n}T(x_{j_1},x_{j_2},\ldots, x_{j_n})$, as $j_1$ ranges over all values in $\{1,\ldots, n\}$, $j_2$ ranges over all values in $\{1,\ldots, n\}$, and so on.

Note that up to this point we did not assume anything about $T$: we did not assume it is symmetric or alternating, so we have no information about whether $T(x_1,x_2)$ and $T(x_2,x_1)$ are related, and we know nothing about $T(x_1,x_1)$, $T(x_2,x_2)$, etc. If $T$ is alternating, then we know $T(x_2,x_1)=-T(x_1,x_2)$, and we can combine these terms in the final sum. Also, in the case that $T$ is alternating, $T(x_1,x_1)=T(x_2,x_2)=0$, so those terms simply drop out in that special case.
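To see this simplification numerically, here is a small check in the $n=2$ alternating case (with the $2\times 2$ determinant standing in for $T$, an illustrative choice): the repeated-index terms vanish, so summing over permutations alone already reproduces the full $n^n$-term expansion.

```python
import itertools
import numpy as np

# Illustrative alternating bilinear map: the 2x2 determinant.
def T(u, v):
    return u[0] * v[1] - u[1] * v[0]

x = [np.array([1.0, 2.0]), np.array([3.0, 5.0])]
C = np.array([[2.0, -1.0], [0.5, 4.0]])   # C[i, j] = c_i^j (0-based)
y = [sum(C[i, j] * x[j] for j in range(2)) for i in range(2)]

# The repeated-index terms T(x_1, x_1) and T(x_2, x_2) vanish...
print(T(x[0], x[0]), T(x[1], x[1]))       # 0.0 0.0

# ...so summing over permutations only (here (0,1) and (1,0)) already
# equals the direct evaluation T(y_1, y_2).
perm_sum = sum(C[0, j] * C[1, k] * T(x[j], x[k])
               for (j, k) in itertools.permutations(range(2)))
print(abs(T(*y) - perm_sum) < 1e-9)       # True
```

In this $2\times 2$ case, `perm_sum` is exactly $(c_1^1c_2^2-c_1^2c_2^1)\,T(x_1,x_2)=\det(C)\,T(x_1,x_2)$, matching the determinant formula above.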