Changing indexes to prove $\Lambda_{ij}=-\Lambda_{ji} \implies \sum_{i,j=1 , i\neq j}^N \Lambda_{ij} =0$.


We want to prove, for an $N\times N$ matrix $\Lambda$,

$$\Lambda_{ji}=-\Lambda_{ij}\implies\sum_{i,j=1 ; i\neq j}^N \Lambda_{ij}=0$$

My approach:

$$\sum_{i,j=1 ; i\neq j}^N \Lambda_{ij}=\sum_{i=1}^N\left[\sum_{j=1}^{i-1}\Lambda_{ij}+\sum_{j=i+1}^{N}\Lambda_{ij}\right]$$ $$=\sum_{i=1}^N\sum_{j=1}^{i-1}\Lambda_{ij}+\sum_{i=1}^N\sum_{j=i+1}^{N}\Lambda_{ij}$$ $$=\sum_{i=1}^N\sum_{j=1}^{i-1}\Lambda_{ij}-\sum_{i=1}^N\sum_{j=i+1}^{N}\Lambda_{ji}$$

Here, an appropriate change of indices is needed to make the two double sums run over the same index ranges, so that they can be merged into a single sum whose terms cancel pairwise, giving zero.

But I cannot find that smart change of indexes. Can you help me?

4 Answers

Accepted answer:

Here is a calculation following the OP's approach which also avoids empty sums.

We obtain \begin{align*} \color{blue}{\sum_{{i,j=1}\atop{ i\neq j}}^N \Lambda_{ij}} &=\sum_{i=\color{blue}{2}}^N\sum_{j=1}^{i-1}\Lambda_{ij}+\sum_{i=1}^{\color{blue}{N-1}}\sum_{j=i+1}^{N}\Lambda_{ij}\tag{1}\\ &=\sum_{i=2}^N\sum_{j=1}^{i-1}\Lambda_{ij}-\sum_{i=1}^{N-1}\sum_{j=i+1}^{N}\Lambda_{ji}\tag{2}\\ &=\sum_{i=2}^N\sum_{j=1}^{i-1}\Lambda_{ij}-\sum_{j=1}^{N-1}\sum_{i=j+1}^{N}\Lambda_{ij}\tag{3}\\ &=\sum_{i=2}^N\sum_{j=1}^{i-1}\Lambda_{ij}-\sum_{1\leq j<i\leq N}\Lambda_{ij}\tag{4}\\ &=\sum_{i=2}^N\sum_{j=1}^{i-1}\Lambda_{ij}-\sum_{i=2}^N\sum_{j=1}^{i-1}\Lambda_{ij}\tag{5}\\ &\color{blue}{=0} \end{align*}

Comment:

  • In (1) we respect $1\leq i\ne j\leq N$ by setting the lower limit $i=2$ in the left sum and the upper limit $N-1$ in the right sum of the RHS.

  • In (2) we apply $\Lambda_{ij}=-\Lambda_{ji}$.

  • In (3) we exchange indices $i$ and $j$ in the right sum.

  • In (4) we use another notation for the index range, which is convenient when exchanging the summation symbols.

  • In (5) we exchange the order of summation in the right sum, writing it as an iterated sum over $i$ first, so that it matches the left sum.
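The conclusion of the derivation can be sanity-checked numerically. The sketch below (a hypothetical check, not part of the original answer) builds a random antisymmetric matrix and confirms that the off-diagonal sum vanishes:

```python
import random

# Build a random antisymmetric N x N matrix: zero diagonal,
# L[j][i] = -L[i][j] for the off-diagonal entries.
N = 5
L = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        L[i][j] = random.uniform(-1, 1)
        L[j][i] = -L[i][j]

# Sum over all off-diagonal entries; by the derivation above this is 0
# (up to floating-point rounding, since the terms cancel in pairs).
total = sum(L[i][j] for i in range(N) for j in range(N) if i != j)
print(abs(total) < 1e-12)
```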

Answer:

Perhaps: $$ \sum\limits_{j=1}^{N-1}\sum\limits_{i=j+1}^N = \sum\limits_{1 \leq j < i \leq N} = \sum\limits_{i=2}^N \sum\limits_{j = 1}^{i-1} $$
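To see that these three notations really enumerate the same index pairs, one can (as an illustrative check, not part of the original answer) compare the index sets directly:

```python
# Each set collects the pairs (i, j) with 1 <= j < i <= N,
# written in the three equivalent forms from the identity above.
N = 6
form1 = {(i, j) for j in range(1, N) for i in range(j + 1, N + 1)}           # sum_{j=1}^{N-1} sum_{i=j+1}^{N}
form2 = {(i, j) for i in range(1, N + 1) for j in range(1, N + 1) if j < i}  # sum over 1 <= j < i <= N
form3 = {(i, j) for i in range(2, N + 1) for j in range(1, i)}               # sum_{i=2}^{N} sum_{j=1}^{i-1}
print(form1 == form2 == form3)  # all three index sets coincide
```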

Answer:

Simply relabeling $(i,j)$ as $(j,i)$, we have

$$ \sum_{\substack{i,j=1\\i\neq j}}^{N} \Lambda_{ij} \stackrel{\text{relabeling}}{=} \sum_{\substack{i,j=1\\i\neq j}}^{N} \Lambda_{ji} \stackrel{\text{antisymmetry}}{=} - \sum_{\substack{i,j=1\\i\neq j}}^{N} \Lambda_{ij}. $$

Since the sum is its own negative, it must reduce to zero.
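A small numeric sketch (an illustration assumed here, not part of the original answer) makes both steps of this argument concrete: relabeling leaves the sum unchanged, while antisymmetry flips its sign, so the sum must be zero:

```python
import random

# Random antisymmetric matrix: L[j][i] = -L[i][j], zero diagonal.
N = 4
L = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        L[i][j] = random.uniform(-1, 1)
        L[j][i] = -L[i][j]

S = sum(L[i][j] for i in range(N) for j in range(N) if i != j)
S_relabeled = sum(L[j][i] for i in range(N) for j in range(N) if i != j)
print(abs(S - S_relabeled) < 1e-12)  # relabeling: same terms, same sum
print(abs(S + S_relabeled) < 1e-12)  # antisymmetry: S = -S, hence S = 0
```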

Answer:

A rather different approach: $\Lambda$ is clearly a matrix which is the negative of its own transpose. If $u$ is a vector of all $1$s, then $$ q = u^t \Lambda u$$ is the sum you're computing on the right, except that $q$ includes the sum of the $\Lambda_{ii}$ terms...but those are all zero, so $q$ is equal to the sum you want to compute.

Now look at $q^t$ (which is just $q$, because it's just a number). We have \begin{align} q + q &= q + q^t \\ &= u^t \Lambda u + u^t \Lambda^t u \\ &= u^t (\Lambda + \Lambda^t) u \\ &= u^t {\mathbf 0 } u \\ &= 0 \end{align} whence $q = 0$. No indices needed at all. Of course, the whole argument can be written out with indices as well, but once you've done the summation-swapping to prove that matrix multiply is associative, why re-do it?
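The matrix form of the argument can also be sketched numerically (a pure-Python illustration assumed here, not part of the original answer): with $u$ the all-ones vector, $q = u^t \Lambda u$ is just the sum of all entries of $\Lambda$, and it comes out zero:

```python
import random

# Random antisymmetric matrix (Lambda + Lambda^T = 0).
N = 5
L = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        L[i][j] = random.uniform(-1, 1)
        L[j][i] = -L[i][j]

# q = u^T Lambda u with u = (1, ..., 1) is just the sum of all entries,
# diagonal included -- and the diagonal entries are all zero anyway.
q = sum(L[i][j] for i in range(N) for j in range(N))
print(abs(q) < 1e-12)  # Lambda = -Lambda^T forces q = 0
```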