In short: why is the determinant of a symmetric matrix irreducible as a polynomial in the upper-triangular entries, taken as variables?
The determinant of a square matrix $(x_{ij}) \in \Bbb F^{n \times n}$ is of course a polynomial $p$ of degree $n$ in the $n^2 $ variables $x_{ij}$. This polynomial is irreducible, a fact which is proved nicely here.
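The irreducibility of the general determinant can at least be confirmed symbolically for a small case. A quick sanity check with sympy, which factors over $\Bbb Q$ (variable names are mine, not from the linked proof):

```python
import sympy as sp

n = 3
# Generic (non-symmetric) matrix of independent symbols x_ij, 0-indexed.
X = sp.Matrix(n, n, lambda i, j: sp.symbols(f"x{i}{j}"))
p = sp.expand(X.det())

# factor_list returns (constant content, [(factor, multiplicity), ...]).
# A single factor of multiplicity one means p is irreducible over Q.
content, factors = sp.factor_list(p)
print(content, len(factors), factors[0][1])
```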
When we restrict to symmetric matrices, the determinant becomes a different polynomial $f$ in the $\frac{n(n+1)}{2}$ variables $(x_{ij})_{i\leq j}$. Note that $f$ is quadratic in each off-diagonal variable $x_{ij}$ ($i<j$) and linear in each diagonal variable $x_{ii}$, whereas $p$ is linear in every variable.
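These degree claims can be checked for a small case. A sympy sketch (0-indexed variable names are my own):

```python
import sympy as sp

n = 3
# One symbol per entry with i <= j; the matrix entry (i, j) reuses the
# symbol for (min(i,j), max(i,j)), making A symmetric.
x = {(i, j): sp.symbols(f"x{i}{j}") for i in range(n) for j in range(i, n)}
A = sp.Matrix(n, n, lambda i, j: x[(min(i, j), max(i, j))])

f = sp.expand(A.det())

# f is linear in each diagonal variable and quadratic in each off-diagonal one.
print(sp.degree(f, x[(0, 0)]))  # degree in the diagonal variable x00
print(sp.degree(f, x[(0, 1)]))  # degree in the off-diagonal variable x01
```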
How do we know that $f$ is an irreducible polynomial too? This post gives a seemingly-simple proof that I don't get:
Suppose by contradiction that $f=gh$. Then we claim that $f=q^2$ for some polynomial $q$ in the variables $(x_{ij})_{i\leq j}$. To justify this, we look at the identity $\det(AA^T)=\det(A)^2$: since $AA^T$ is symmetric with entries $y_{ij}=\sum \limits_k x_{ik}x_{jk}$, the value $p(x_{ij})^2$ always equals $f(y_{ij})$. But since the variables $y_{ij}$ and $x_{ij}$ are not the same, I don't understand how we can conclude from the identity $p(x_{ij})^2=f(y_{ij})$ that the required $q$ exists, and I also don't see where we use our assumption by way of contradiction that $f$ is reducible.
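The identity itself can at least be confirmed symbolically for a small case. A sympy sketch (note that the $(i,j)$ entry of $AA^T$ is $\sum_k x_{ik}x_{jk}$; names are mine):

```python
import sympy as sp

n = 3
# Generic matrix A of independent symbols x_ij.
X = sp.Matrix(n, n, lambda i, j: sp.symbols(f"x{i}{j}"))
S = X * X.T  # symmetric; its (i, j) entry is y_ij = sum_k x_ik * x_jk

# det(A A^T) = det(A)^2 as polynomials in the x_ij.
lhs = sp.expand(S.det())
rhs = sp.expand(X.det() ** 2)
print(sp.expand(lhs - rhs) == 0)
```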
I arrived at the question in the book by Shafarevich, where he sadly just comments that it is "easy to see":
This can also be done in a similar way as described in the post linked in the question for general square matrices. Letting $f\in\mathbb F[x_{ij}]_{1\le i\le j\le n}$ be the determinant, suppose that it decomposes as $f=gh$. We need to show that $g$ or $h$ is constant. This follows from the following properties of the determinant (when expressed as a linear combination of monomials in the $x_{ij}$):

- $f$ is linear in each diagonal variable $x_{ii}$, and the monomial $\prod_k x_{kk}$ occurs with coefficient $1$.
- No monomial of $f$ contains $x_{ii}x_{ij}$ or $x_{jj}x_{ij}$ for $i<j$: a monomial containing $x_{ii}$ comes from a permutation fixing $i$, which can then use neither the entry in position $(i,j)$ nor the one in position $(j,i)$, and symmetrically for $x_{jj}$.
- For each $i<j$, the monomial $x_{ij}^2\prod_{k\ne i,j}x_{kk}$ occurs (with coefficient $-1$, coming from the transposition $(i\,j)$), so $f$ genuinely depends on each off-diagonal variable.
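These monomial properties of the symmetric determinant can be verified directly for $n=3$. A sympy sketch (0-indexed variable names are mine):

```python
import sympy as sp

n = 3
x = {(i, j): sp.symbols(f"x{i}{j}") for i in range(n) for j in range(i, n)}
A = sp.Matrix(n, n, lambda i, j: x[(min(i, j), max(i, j))])

gens = sorted(x.values(), key=str)  # [x00, x01, x02, x11, x12, x22]
p = sp.Poly(A.det(), *gens)

# No monomial contains both x00 and x01 (exponent positions 0 and 1 in gens).
print(any(m[0] > 0 and m[1] > 0 for m in p.monoms()))  # expect False

# The monomial x01**2 * x22 appears with coefficient -1.
print(p.coeff_monomial(x[(0, 1)] ** 2 * x[(2, 2)]))
```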
I will make use of the following simple statements about factorisation in polynomial rings over an integral domain, where degrees in a fixed variable add under multiplication. (i) If a polynomial that is linear in $x$ factorises into a product of two polynomials, then one factor is linear in $x$ and the other is independent of $x$. (ii) If $f=gh$ where $g$ depends on the variable $x$, $h$ depends on the variable $y$, and $h$ is independent of $x$, then $f$ contains monomials divisible by $xy$ (the coefficient of the top power of $x$ in $f$ is a multiple of $h$, so it depends on $y$).
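Statement (i) comes down to the additivity of degrees just mentioned; a toy sympy illustration with polynomials chosen by me:

```python
import sympy as sp

x, y, z = sp.symbols("x y z")

g = x * y + z            # linear in x
h = y ** 2 + z * y + 1   # independent of x

f = sp.expand(g * h)

# deg_x(f) = deg_x(g) + deg_x(h), so a product can be linear in x only if
# one factor is linear in x and the other does not involve x at all.
print(sp.degree(f, x), sp.degree(g, x), sp.degree(h, x))
```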
Now, for the argument that if $f=gh$ then either $g$ or $h$ is constant. By statement (i) above, for each $i$, either $g$ is linear in $x_{ii}$ and $h$ is independent of $x_{ii}$, or vice versa.
Choose any $1\le i < j\le n$ such that $g$ is linear in $x_{ii}$ (so $h$ is independent of $x_{ii}$). Then:

- $h$ is independent of $x_{ij}$: otherwise, applying (ii) to $g$ (which depends on $x_{ii}$) and $h$, $f$ would contain a monomial divisible by $x_{ii}x_{ij}$, which it does not.
- $g$ depends on $x_{ij}$: $f$ contains the monomial $x_{ij}^2\prod_{k\ne i,j}x_{kk}$, and $h$ is independent of $x_{ij}$.
- $h$ is independent of $x_{jj}$: otherwise, applying (ii) to $g$ (which depends on $x_{ij}$) and $h$ (which is independent of $x_{ij}$), $f$ would contain a monomial divisible by $x_{ij}x_{jj}$, which it does not. By (i), $g$ is then linear in $x_{jj}$.
Supposing that $g$ is linear in $x_{11}$ (wlog), applying the above with $i=1$ shows that $g$ is linear in every $x_{jj}$ and that $h$ is independent of all the $x_{jj}$ and $x_{1j}$. Applying it again to each pair $i<j$ then shows that $h$ is independent of every $x_{ij}$ as well and, hence, is constant.
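For small $n$ the conclusion can be confirmed directly. A sympy sketch, factoring over $\Bbb Q$ (0-indexed variable names are mine):

```python
import sympy as sp

n = 3
x = {(i, j): sp.symbols(f"x{i}{j}") for i in range(n) for j in range(i, n)}
A = sp.Matrix(n, n, lambda i, j: x[(min(i, j), max(i, j))])

f = sp.expand(A.det())

# A single factor of multiplicity one means f is irreducible over Q.
content, factors = sp.factor_list(f)
print(content, len(factors), factors[0][1])
```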