Recently I stumbled upon the following theorem — I'd like to read a comprehensible (i.e. understandable for an engineer) proof for it:
Given a polynomial $F(t)$ of degree $n$, there exists a unique equivalent symmetric, multiaffine polynomial $f(u_1, u_2, \ldots, u_n)$ — each $u_i$ appearing with degree at most $1$ — satisfying $f(t,t,\dots,t) = F(t)$. This $f(u_1,u_2,\ldots,u_n)$ is called the polar form of $F(t)$.
So for example,
$F(t) = 5t^3 + 6t^2 + 9t - 2$
is equivalent to
$\begin{align}f(u,v,w) & = c_1 uvw + c_2 uv + c_3 uw + c_4 vw + c_5u + c_6v +c_7w + c_8 \\ & = 5uvw + 2uv + 2uw + 2vw + 3u + 3v + 3w -2 \end{align}$
which is clearly symmetric (i.e. $f(u,v,w) = f(w,u,v) = f(v,w,u) = \ldots$). This multiaffine (in fact, triaffine) function is equal to $F(t)$ when $u$, $v$, $w$ all have the value $t$, i.e. $f(t,t,t) = F(t)$.
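As a quick numerical sanity check (a Python sketch I'm adding for illustration, not part of any proof), the example above can be verified directly:

```python
# The polynomial F(t) = 5t^3 + 6t^2 + 9t - 2 from the example.
def F(t):
    return 5*t**3 + 6*t**2 + 9*t - 2

# Its claimed polar form f(u, v, w).
def f(u, v, w):
    return (5*u*v*w + 2*u*v + 2*u*w + 2*v*w
            + 3*u + 3*v + 3*w - 2)

# Diagonal property: f(t, t, t) = F(t) at several sample points.
for t in [-2, -1, 0, 1, 2, 3]:
    assert f(t, t, t) == F(t)

# Symmetry: permuting the arguments does not change the value.
assert f(1, 2, 3) == f(3, 1, 2) == f(2, 3, 1)
```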
What I understand is that the "multiaffine" property says that when you keep every $u_i$ of $f(u_1,u_2,\ldots,u_n)$ constant except for one $u_j$, then $f$ is an affine function of that variable $u_j$. So its graph would be just a line which is traversed at a constant speed, right?
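A small Python sketch of this idea, using the example polar form (the fixed values $v_0$, $w_0$ are arbitrary, chosen just for illustration): with $v$ and $w$ held fixed, the map $u \mapsto f(u,v,w)$ satisfies the affine midpoint property.

```python
# Polar form f(u, v, w) of F(t) = 5t^3 + 6t^2 + 9t - 2.
def f(u, v, w):
    return (5*u*v*w + 2*u*v + 2*u*w + 2*v*w
            + 3*u + 3*v + 3*w - 2)

v0, w0 = 4.0, -1.5   # arbitrary fixed values for v and w
u1, u2 = -2.0, 6.0

# Affine midpoint property in the remaining variable u:
# f evaluated at the midpoint equals the midpoint of the values.
lhs = f((u1 + u2) / 2, v0, w0)
rhs = (f(u1, v0, w0) + f(u2, v0, w0)) / 2
assert abs(lhs - rhs) < 1e-9
```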
For some reason, $f(t,t,\ldots,t)$ — which is equal to $F(t)$ as said before — is called the diagonal (form) of $f$. I can imagine a diagonal in a 3D (or 2D) coordinate system where $u=v=w$, this would in a certain sense be a diagonal of the coordinate system. Is this the reason why it's called "diagonal (form)"?
So my question, where can I find a good proof for the above theorem?
Edit: Furthermore, I'd like to refresh my knowledge of affine spaces, functions, combinations... (e.g. why is it that for an affine combination of points resulting in a new point, the coefficients have to add up to 1 — and when combining them into a vector, they have to add up to 0). Any recommendations for readable books/articles are most welcome!
Edit2: Although Gerry has provided a nice solution, I'd like to know why there is such a difference between his proof and Ramshaw's (the link to that proof is in the comments). For some reason, he uses multiple variables for the function $F$, and the multinomial coefficient. Why is that? Oh, and if anybody has some thoughts about accessible literature on affine things, please let me know.
I think you have the right idea as to why the term, "diagonal".
As for the theorem, suppose $$F(t)=a_0t^n+a_1t^{n-1}+\cdots+a_{n-1}t+a_n$$ with $a_0\ne0$. Now pick any term there, say, $a_{n-k}t^k$, and think of all the ways of choosing $k$ of the variables $u_1,u_2,\dots,u_n$, and forming their product; let's say there are $C$ ways to do this (more about $C$, later). Well, if you give each product the coefficient $a_{n-k}/C$, then the sum of these terms will be a symmetric sum of $C$ terms, each with coefficient $a_{n-k}/C$. Now when you let $u_j=t$ for all $j$, you get a sum of $C$ terms, each of which is $(a_{n-k}/C)t^k$, so in total you get $a_{n-k}t^k$, which is what you want. That gives existence; uniqueness follows, too, because symmetry forces every degree-$k$ product of the variables to have the same coefficient, say $c_k$, and setting $u_j=t$ for all $j$ then gives $\sum_k Cc_kt^k=F(t)$, which forces $c_k=a_{n-k}/C$.
In your example, $n=3$, let's look at the term $6t^2$. Now $C=3$ because there are 3 ways to choose 2 of the variables $u,v,w$ to form their product, namely, $uv$, $uw$, and $vw$, and $a_1$, the coefficient of $t^2$, is 6, and $6/3=2$, and that's why we want $2uv+2uw+2vw$ in the symmetric polynomial.
In general, what I've called $C$ is the binomial coefficient $n\choose k$ which has the simple formula, $${n\choose k}={n!\over k!(n-k)!}$$ where I take it you are au courant with factorials.
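The whole construction can be sketched in a few lines of Python (the helper name `polar_eval` and the convention of listing coefficients from $a_0$ down to $a_n$ are my own choices for this sketch):

```python
from itertools import combinations
from math import comb

def polar_eval(a, u):
    """Evaluate the polar form of F(t) = a[0]*t^n + ... + a[n]
    at the point (u_1, ..., u_n), using the construction above:
    every degree-k product of the u_j gets coefficient a_{n-k}/C(n,k)."""
    n = len(a) - 1
    assert len(u) == n
    total = 0.0
    for k in range(n + 1):                 # term a_{n-k} t^k of F
        coeff = a[n - k] / comb(n, k)      # shared coefficient a_{n-k}/C
        for subset in combinations(u, k):  # all C(n,k) products of k variables
            p = 1.0
            for x in subset:
                p *= x
            total += coeff * p
    return total

# Diagonal check for the example F(t) = 5t^3 + 6t^2 + 9t - 2:
a = [5, 6, 9, -2]
F = lambda t: 5*t**3 + 6*t**2 + 9*t - 2
for t in [-1.0, 0.0, 2.0]:
    assert abs(polar_eval(a, [t, t, t]) - F(t)) < 1e-9
```

By construction the result is symmetric (permuting `u` only reorders the products) and multiaffine (each `u_j` appears at most once in every product).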