This question is from Bishop's *Pattern Recognition and Machine Learning*:
Exercise 1.14 Show that an arbitrary square matrix with elements $w_{ij}$ can be written in the form $w_{ij}=w^S_{ij}+w^A_{ij}$ where $w^S_{ij}$ and $w^A_{ij}$ are symmetric and anti-symmetric matrices, respectively, satisfying $w_{ij}^S=w_{ji}^S$ and $w_{ij}^A=-w_{ji}^A$ for all $i$ and $j$. Now consider the second order term in a higher order polynomial in $D$ dimensions, given by $$\sum^D_{i=1}\sum^D_{j=1}w_{ij}x_ix_j$$
Show that $$\sum^D_{i=1}\sum^D_{j=1}w_{ij}x_ix_j=\sum^D_{i=1}\sum^D_{j=1}w_{ij}^Sx_ix_j$$
so that the contribution from the anti-symmetric matrix vanishes.
I found a solution on the internet, but I don't completely understand one step. Here is the solution from the beginning:
First we will show that an arbitrary square matrix with elements $w_{ij}$ can be written in the form $w_{ij}=w_{ij}^S+w_{ij}^A$. If such a decomposition exists, then using the symmetry properties we can write the pair of equations:
$$w_{ij}=w_{ij}^S+w_{ij}^A$$
$$w_{ji}=w_{ij}^S-w_{ij}^A$$
so
$$w_{ij}^S=\frac{w_{ij}+w_{ji}}{2}$$
and
$$w_{ij}^A=\frac{w_{ij}-w_{ji}}{2}$$
If we sum them up we can easily see that $w_{ij}=w_{ij}^S+w_{ij}^A$ holds.
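(Note that these formulas do define a symmetric and an anti-symmetric matrix, a step the quoted solution leaves implicit: $w_{ji}^S=\frac{w_{ji}+w_{ij}}{2}=w_{ij}^S$ and $w_{ji}^A=\frac{w_{ji}-w_{ij}}{2}=-w_{ij}^A$.)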
In the next step we'll show that $\sum^D_{i=1}\sum^D_{j=1}w_{ij}x_ix_j=\sum^D_{i=1}\sum^D_{j=1}w_{ij}^Sx_ix_j$, i.e. that the contribution from the anti-symmetric matrix vanishes. This is equivalent to saying:
$$\sum^D_{i=1}\sum^D_{j=1}w_{ij}^Ax_ix_j=0$$
If we expand the previous, we get:
$$\sum^D_{i=1}\sum^D_{j=1}w_{ij}^Ax_ix_j=\sum^D_{i=1}\sum^D_{j=i+1}w_{ij}^Ax_ix_j+\sum^D_{i=1}\sum^D_{j=i+1}w^A_{ji}x_ix_j+\sum^D_{i=1}w_{ii}^Ax_i^2$$
It is straightforward to see that the first two terms in the above equation cancel each other, and that the last term vanishes, because $w_{ij}^A=-w^A_{ji}$.
My question is, why does $\sum^D_{i=1}w_{ii}^Ax_i^2$ vanish?
Specialize $w^A_{ij} = -w^A_{ji}$ by $j \mapsto i$, so we have $w^A_{ii} = -w^A_{ii}$. But the only number equal to its own negative is zero, so all the diagonal elements $w^A_{ii}$ are zero, and every term of $\sum^D_{i=1}w_{ii}^Ax_i^2$ vanishes.
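Incidentally, the same trick kills the whole anti-symmetric contribution at once, without splitting the sum into three pieces: relabelling the summation indices $i\leftrightarrow j$ and using $w^A_{ji}=-w^A_{ij}$ gives
$$\sum_{i=1}^D\sum_{j=1}^D w^A_{ij}x_ix_j=\sum_{i=1}^D\sum_{j=1}^D w^A_{ji}x_jx_i=-\sum_{i=1}^D\sum_{j=1}^D w^A_{ij}x_ix_j,$$
so the sum equals its own negative and must be zero.

If you want a quick numerical sanity check, here is a minimal NumPy sketch (the dimension $D=4$ and the random seed are arbitrary choices, not part of the exercise):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4

W = rng.standard_normal((D, D))   # arbitrary square matrix
x = rng.standard_normal(D)

# Decomposition from the solution above.
WS = (W + W.T) / 2                # symmetric part
WA = (W - W.T) / 2                # anti-symmetric part

assert np.allclose(W, WS + WA)             # w_ij = w^S_ij + w^A_ij
assert np.allclose(np.diag(WA), 0.0)       # diagonal of the anti-symmetric part is zero

# Quadratic form: sum_ij w_ij x_i x_j = x^T W x
assert np.allclose(x @ W @ x, x @ WS @ x)  # the symmetric part carries everything
assert np.allclose(x @ WA @ x, 0.0)        # the anti-symmetric contribution vanishes
```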