Consider an orthogonal transformation between Cartesian coordinate systems in $n$-dimensional space. The $n \times n$ rotation matrix $$R = \left(a_{ij}\right)$$ has $n^2$ entries. These are not independent; they are related by the orthogonality conditions $$a_{ik}a_{jk} = \delta_{ij}$$ (summation over the repeated index $k$ implied), which, being symmetric in $i$ and $j$, are $\frac{1}{2}\left(n^2 + n\right)$ independent equations in the $a_{ij}$. Thus, $$n^2 - \frac{1}{2}\left(n^2 + n\right) = \frac{1}{2}n\left(n-1\right)$$ of the $a_{ij}$ are left undetermined.
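The count above is easy to tabulate for small $n$ (a minimal sketch; the function name is mine):

```python
# Parameter count for n x n rotation matrices: n^2 entries minus
# n(n+1)/2 orthogonality conditions leaves n(n-1)/2 free parameters.
def free_parameters(n):
    return n * n - n * (n + 1) // 2  # equals n * (n - 1) // 2

for n in range(2, 5):
    print(n, free_parameters(n))
# n=2 gives 1 (a single rotation angle), n=3 gives 3 (e.g. Euler angles)
```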
Why is the last step justified? If there are $n$ independent equations, is it always possible to solve them for $n$ unknowns (even if the equations involve the sums of quadratic terms, as they do here)? If so, why? Under what general conditions is it possible to solve a system of $n$ independent equations for $n$ unknowns?
If you have a system of $m$ independent equations in $M$ unknowns, then it is usually possible to solve it locally, meaning that the solution set around a particular point is parametrized by $M - m$ parameters. Precise conditions for "usually" are given by the implicit function theorem: the functions involved in the equations must be continuously differentiable, and the Jacobian matrix of the system must have full rank $m$ at the point around which the solution set is parametrized; see http://en.wikipedia.org/wiki/Implicit_function_theorem.
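A minimal illustration with a single quadratic equation (the unit circle; names are mine): one equation in two unknowns whose solution set is locally parametrized by $M - m = 1$ parameter.

```python
import math

# One equation (m = 1) in M = 2 unknowns: F(x, y) = x^2 + y^2 - 1 = 0.
# At the point (0, 1) the partial derivative dF/dy = 2y = 2 is non-zero
# (the 1x2 Jacobian [2x, 2y] has full rank), so the implicit function
# theorem guarantees a local solution y(x) with M - m = 1 free parameter.
def F(x, y):
    return x**2 + y**2 - 1.0

def y_of_x(x):
    # the local branch of the solution set through (0, 1)
    return math.sqrt(1.0 - x**2)

for x in (-0.3, 0.0, 0.25):
    assert abs(F(x, y_of_x(x))) < 1e-12
```

Note that the equation is quadratic, just like the orthogonality conditions; the theorem only cares about differentiability and the rank of the Jacobian, not the degree.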
The functions in the orthogonality conditions are quadratic polynomials, so they are clearly continuously differentiable, but technically one has to check that the Jacobian has full rank at every solution point (it does) before doing the subtraction. This is somewhat confusing because the "points" here are elements of the rotation group, i.e. they are themselves matrices.
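As a concrete check of the rank condition (a sketch using NumPy; the setup is mine), one can assemble the Jacobian of the $\frac{1}{2}(n^2+n)$ orthogonality constraints with respect to the $n^2$ entries $a_{ij}$ at the "point" $A = I$ and verify that it has full rank, so the nullity is the expected $\frac{1}{2}n(n-1)$:

```python
import numpy as np

n = 3
A = np.eye(n)  # evaluate the Jacobian at the identity rotation

# Independent constraints: F_{ij}(A) = sum_k a_{ik} a_{jk} - delta_{ij}
# for i <= j, i.e. the n(n+1)/2 distinct entries of A A^T - I.
pairs = [(i, j) for i in range(n) for j in range(i, n)]

# d/d a_{pq} of sum_k a_{ik} a_{jk} = delta_{ip} a_{jq} + delta_{jp} a_{iq}
J = np.zeros((len(pairs), n * n))
for row, (i, j) in enumerate(pairs):
    for q in range(n):
        J[row, i * n + q] += A[j, q]
        J[row, j * n + q] += A[i, q]

rank = int(np.linalg.matrix_rank(J))
print(rank)          # n(n+1)/2 = 6: the constraints are independent
print(n * n - rank)  # n(n-1)/2 = 3: free parameters of a 3-D rotation
```

The same computation at any other orthogonal matrix gives the same rank, which is the check needed before the subtraction in the question is justified.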