I'm working on the following problem from Bishop's book *Pattern Recognition and Machine Learning*:
I read the solution, and it uses a trick to show that twice the double summation over the anti-symmetric term is zero:
I do not understand the last three lines of the solution. Why does this transformation hold? $$ \sum^D_{i=1}\sum^D_{j=1}w^A_{ji}x_ix_j = \sum^D_{j=1}\sum^D_{i=1}w^A_{ji}x_jx_i $$


We have \begin{align*} \sum_{i=1}^D\sum_{j=1}^D\left(w_{ij}^A+w_{ij}^A\right)x_ix_j &=\sum_{i=1}^D\sum_{j=1}^D\left(w_{ij}^A-w_{ji}^A\right)x_ix_j\tag{1}\\ &=\sum_{i=1}^D\sum_{j=1}^Dw_{ij}^Ax_ix_j-\sum_{i=1}^D\sum_{j=1}^Dw_{ji}^Ax_ix_j\tag{2}\\ &=\sum_{i=1}^D\sum_{j=1}^Dw_{ij}^Ax_ix_j-\sum_{j=1}^D\sum_{i=1}^Dw_{ji}^Ax_jx_i\tag{3}\\ &=\sum_{i=1}^D\sum_{j=1}^Dw_{ij}^Ax_ix_j-\color{blue}{\sum_{i=1}^D\sum_{j=1}^Dw_{ij}^Ax_ix_j}\tag{4}\\ &=0 \end{align*}
Comment:
In (1) we use the antisymmetry property $w_{ij}^A=-w_{ji}^A$ with $1\leq i,j\leq D$.
In (2) we multiply out.
In (3) we exchange the order of summation in the right-hand double sum and use commutativity $x_ix_j=x_jx_i$. Since the sums are finite, this merely reorders the terms and does not change the value.
In (4) we rename the dummy summation indices in the right-hand double sum, replacing $i$ with $j$ and $j$ with $i$; the names of summation variables are arbitrary, so the value is unchanged. The last line follows immediately.
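If it helps, the cancellation can be checked numerically. The sketch below (my own illustration, not part of Bishop's solution) builds the anti-symmetric part $w^A_{ij}=\tfrac{1}{2}(w_{ij}-w_{ji})$ of a random matrix and confirms that the double sum over it vanishes, and that the index-swap identity from the question holds:

```python
# Numeric sanity check: the anti-symmetric part of a weight matrix
# contributes nothing to the quadratic form sum_i sum_j w_ij x_i x_j.
import random

D = 5
random.seed(0)

# Random matrix W and vector x.
W = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(D)]
x = [random.uniform(-1, 1) for _ in range(D)]

# Anti-symmetric part: w^A_ij = (w_ij - w_ji) / 2, so w^A_ij = -w^A_ji.
WA = [[(W[i][j] - W[j][i]) / 2 for j in range(D)] for i in range(D)]

# The double sum over the anti-symmetric term: terms (i, j) and (j, i)
# cancel pairwise, and the diagonal terms are zero.
s = sum(WA[i][j] * x[i] * x[j] for i in range(D) for j in range(D))
print(abs(s) < 1e-12)  # True

# The identity asked about: swapping the order of the sums and using
# x_i x_j = x_j x_i only reorders the summands, so the value is equal.
lhs = sum(WA[j][i] * x[i] * x[j] for i in range(D) for j in range(D))
rhs = sum(WA[j][i] * x[j] * x[i] for j in range(D) for i in range(D))
print(abs(lhs - rhs) < 1e-12)  # True
```

The comparisons use a small tolerance rather than exact equality because floating-point addition is not associative, so reordering the summands can change the result by rounding error.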