Maxwell's theorem (after James Clerk Maxwell) says that if a function $f(x_1,\ldots,x_n)$ of $n$ real variables is a product $f_1(x_1)\cdots f_n(x_n)$ and is rotation-invariant in the sense that the value of the function depends on $x_1,\ldots,x_n$ only through $x_1^2+\cdots+x_n^2$, then it's an exponential function of that sum of squares (so $f$ is a "Gaussian function").
Say a class of undergraduates knows only enough mathematics to understand what the theorem says (so maybe they've never heard of inner products or orthogonal matrices), but they're bright and can understand mathematical arguments. What proof of Maxwell's theorem do you show them? Can you keep it short and simple?
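As a quick sanity check of the statement, here is a small numeric illustration (a sketch; the example Gaussian $\prod_i e^{-x_i^2}$ is my choice of instance, not part of the theorem):

```python
import math

# f(x1,...,xn) = prod_i exp(-xi^2) is separable AND rotation-invariant:
# it equals exp(C * r^2) with C = -1, where r^2 = x1^2 + ... + xn^2.
def f(xs):
    return math.prod(math.exp(-x * x) for x in xs)

# Two points with the same r^2: 1 + 4 + 4 = 9 = 9 + 0 + 0
p, q = (1.0, 2.0, 2.0), (3.0, 0.0, 0.0)
r2 = sum(x * x for x in p)

assert abs(f(p) - f(q)) < 1e-12           # value depends only on r^2
assert abs(f(p) - math.exp(-r2)) < 1e-12  # and is exponential in r^2
```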
I assume that all the functions are $C^1$ and that $f$ is nowhere zero. (I think one can show that if $f$ vanishes somewhere, then it vanishes everywhere.) The idea I came up with is the following:
$$f(x_1,\ldots,x_n)=f_1(x_1)\cdots f_n(x_n)=\phi(x_1^2+\cdots+x_n^2)$$
Denote $r^2=x_1^2+\cdots+x_n^2$ and differentiate with respect to $x_i$ to get
$$f_1(x_1)\cdots f_i'(x_i)\cdots f_n(x_n)=\phi'(r^2)2x_i$$
Divide by $f$ and, for $x_i\neq 0$, by $2x_i$ to obtain
$$\frac{f_i'(x_i)}{f_i(x_i)}\frac{1}{2x_i}=\frac{\phi'(r^2)}{f(x_1,\ldots,x_n)}$$
Therefore the LHS is independent of $i$. In particular, for $i\neq j$,
$$\frac{f_i'(x_i)}{f_i(x_i)}\frac{1}{2x_i}=\frac{f_j'(x_j)}{f_j(x_j)}\frac{1}{2x_j},$$
so a function of $x_i$ alone equals a function of $x_j$ alone, and both must equal a common constant $C$:
$$\frac{f_i'(x_i)}{f_i(x_i)}\frac{1}{2x_i}=C$$
Solving this ODE gives $f_i(x_i)=A_i e^{C x_i^2}$ for some constant $A_i$, from which the conclusion follows.
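The last step can be double-checked symbolically; a minimal sketch with sympy (solving $f_i'(x) = 2Cx\,f_i(x)$, where $C$ is the constant from the argument above):

```python
from sympy import symbols, Function, Eq, dsolve, simplify

x, C = symbols('x C')
f = Function('f')

# Solve f'(x)/f(x) * 1/(2x) = C, i.e. f'(x) = 2*C*x*f(x)
sol = dsolve(Eq(f(x).diff(x), 2 * C * x * f(x)), f(x))
print(sol)  # general solution: a free constant times exp(C*x**2)

# Verify the returned solution actually satisfies the ODE
assert simplify(sol.rhs.diff(x) - 2 * C * x * sol.rhs) == 0
```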