A classical exercise in basic linear algebra is finding the dimension of the space of $n\times n$ magic squares. The solution usually proceeds by examining the defining equations: one notices that the whole square can be reconstructed from its $(n-1)\times(n-1)$ corner subsquare together with the line sum, and that these $(n-1)^2+1$ parameters are subject to just 2 linearly independent equations (one can also consider zero-sum magic squares, which form a codimension-1 subspace, slightly simplifying matters).
However, the cases $n=1,2,3$ can be worked out essentially without solving systems of linear equations. On the other hand, the dimension of the space of $n\times n$ magic squares is bounded above by $n^2$. This raises the following question:
Can one show that the dimension of the space of $n\times n$ magic squares is a polynomial in $n$ without actually computing it? Can one do it without invoking the rank of a matrix, the rank–nullity theorem, and the like?
If so, this polynomial must be at most quadratic, and one can then recover it by polynomial interpolation from its values at $n=1,2,3$.
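As a numerical sanity check (which, of course, uses exactly the rank machinery the question asks to avoid), one can compute the dimension for small $n$ as $n^2$ minus the rank of the constraint matrix. The sketch below, in Python/NumPy, encodes the conditions that every row, every column, and both diagonals have the same sum as the first row; the function name `magic_dim` is my own.

```python
import numpy as np

def magic_dim(n):
    """Dimension of the linear space of n x n magic squares
    (all rows, columns, and both diagonals share one common sum)."""
    def line(indices):
        # indicator vector of a set of cells, as a length-n^2 row
        v = np.zeros(n * n)
        for i, j in indices:
            v[i * n + j] = 1.0
        return v

    row0 = line([(0, j) for j in range(n)])
    constraints = []
    # every other row must match row 0
    for i in range(1, n):
        constraints.append(line([(i, j) for j in range(n)]) - row0)
    # every column must match row 0
    for j in range(n):
        constraints.append(line([(i, j) for i in range(n)]) - row0)
    # both diagonals must match row 0
    constraints.append(line([(i, i) for i in range(n)]) - row0)
    constraints.append(line([(i, n - 1 - i) for i in range(n)]) - row0)

    A = np.array(constraints)
    return n * n - np.linalg.matrix_rank(A)

print([magic_dim(n) for n in range(1, 7)])  # → [1, 1, 3, 8, 15, 24]
```

For $n\ge 3$ these values agree with $n^2-2n$, while the cases $n=1,2$ come out as $1$.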