I'm reading a section on Rotation of Coordinate Systems and this is throwing me off:
'In n-dimensional space, the rotation matrix will have $n^2$ elements, upon which orthogonality relations place $\frac{1}{2}(n^2 + n)$ conditions...'
I understand the orthogonality relation as:
$$\delta_{ij} = \mathbf{e'}_{i} \cdot \mathbf{e'}_{j} = a_{ik}(\mathbf{e}_{k} \cdot \mathbf{e'}_{j}) = a_{ik}a_{jk}$$
Where $a_{ij}$ is the rotation matrix. What I don't understand is how to formally verify that $\frac{1}{2}(n^2 + n)$ is always true. I was thinking this could be shown with induction, but that hasn't really gotten me anywhere.
Induction could work, but there is a simpler counting argument: the relation for the index pair $(i,j)$ is identical to the relation for $(j,i)$, because $a_{ik}a_{jk}$ is symmetric under swapping $i$ and $j$. So among the $n^2$ index pairs (which we can arrange in a matrix), each subdiagonal pair duplicates the superdiagonal pair across the diagonal, while the diagonal pairs (where $i = j$) have no partner and are not redundant. Splitting the matrix into subdiagonal ($K$), diagonal ($n$), and superdiagonal ($K$) entries, you find that
$$ K + n + K = n^2 \quad\Longrightarrow\quad K = \frac{1}{2}(n^2 - n). $$
So the independent conditions are, say, the $K$ superdiagonal ones plus the $n$ diagonal ones, which is $K + n = \frac{1}{2}(n^2 + n)$ conditions.
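Not a proof, but a quick numerical sanity check of this count: for a random orthogonal matrix $A$ (built here via QR decomposition, a hypothetical choice for illustration), the condition matrix $C = AA^T - I$ is symmetric, so only its diagonal and superdiagonal entries are independent, and there are $\frac{1}{2}(n^2 + n)$ of those.

```python
import numpy as np

def independent_condition_count(n: int) -> int:
    # n diagonal conditions plus n(n-1)/2 superdiagonal ones
    return n * (n + 1) // 2

rng = np.random.default_rng(0)
for n in range(2, 7):
    # Random orthogonal matrix: QR decomposition of a random matrix
    A, _ = np.linalg.qr(rng.standard_normal((n, n)))
    C = A @ A.T - np.eye(n)
    # The (i,j) and (j,i) conditions coincide: C is symmetric
    assert np.allclose(C, C.T)
    # All n^2 conditions hold, but only n(n+1)/2 are independent
    assert np.allclose(C, 0)
    print(n, independent_condition_count(n))
```

The symmetry check `np.allclose(C, C.T)` is exactly the index-swapping redundancy described above, and the counting function encodes the $K + n$ tally.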