Express the symmetric polynomial $\prod_{i < j} (X_i+X_j)$ in terms of elementary symmetric functions

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

Exercise: Define the polynomial $\Sigma(X_1,\ldots,X_n)$ by \begin{align*} \Sigma(X_1,\ldots,X_n) = \prod_{i < j} (X_i+X_j). \end{align*} This polynomial is clearly symmetric. I want to express $\Sigma$ as a polynomial in the elementary symmetric functions for $n=2,3,4$. The case $n=2$ is trivial: $\Sigma(X_1,X_2)= \sigma_1 = X_1+X_2$. The case $n=3$ is easy. I write $\Sigma$ as \begin{align*} \Sigma(X_1,X_2,X_3) = (\sigma_1 - X_1)(\sigma_1 - X_2)(\sigma_1 - X_3), \end{align*} then apply Vieta's formulas formally, obtaining \begin{align*} \Sigma(X_1,X_2,X_3) = \sigma_1^3 - \sigma_1 \cdot \sigma_1^2 + \sigma_2 \sigma_1 - \sigma_3 = \sigma_2 \sigma_1 - \sigma_3. \end{align*} The case $n=4$ seems harder, and I am unable to work it out; I hope there is some trick that avoids unnecessary computation. My attempt is to write $\Sigma$ out completely, \begin{align*} \Sigma = (X_1+X_2)(X_3+X_4)(X_1+X_4)(X_2+X_3)(X_2+X_4)(X_1+X_3), \end{align*} and then multiply the factors in pairs, obtaining \begin{align*} \Sigma = (\sigma_2 - X_1X_2-X_3X_4)(\sigma_2 - X_1X_4 - X_2X_3)(\sigma_2 - X_2X_4-X_1X_3), \end{align*} but this does not seem to help much. How can I solve the exercise?
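(A quick machine check of the computations above, using sympy; the library and the helper `elementary` are my additions, not part of the exercise. It confirms the $n=3$ result $\sigma_2\sigma_1-\sigma_3$ and the pairwise grouping for $n=4$.)

```python
# Sanity check: the n = 3 identity Sigma = sigma_2*sigma_1 - sigma_3,
# and the n = 4 grouped factorization from the question. Uses sympy.
from math import prod
from sympy import symbols, expand, Poly

def elementary(xs):
    """Return [e_1, ..., e_n], read off from the coefficients of prod(t + x_i)."""
    t = symbols('t')
    coeffs = Poly(prod(t + x for x in xs), t).all_coeffs()  # [1, e1, ..., en]
    return coeffs[1:]

# n = 3: Sigma = sigma_2 * sigma_1 - sigma_3
x1, x2, x3 = symbols('x1 x2 x3')
e1, e2, e3 = elementary([x1, x2, x3])
sigma3 = expand((x1 + x2) * (x1 + x3) * (x2 + x3))
assert sigma3 == expand(e2 * e1 - e3)

# n = 4: the three grouped quadratic factors multiply back to Sigma
x4 = symbols('x4')
xs = [x1, x2, x3, x4]
f1, f2, f3, f4 = elementary(xs)
sigma4 = expand(prod(xi + xj for i, xi in enumerate(xs) for xj in xs[i + 1:]))
grouped = expand((f2 - x1*x2 - x3*x4) * (f2 - x1*x4 - x2*x3)
                 * (f2 - x2*x4 - x1*x3))
assert sigma4 == grouped
print("n = 3 and n = 4 identities verified")
```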
Here is an answer for any number $n$ of variables.
For any partition $\lambda$, we let $s_{\lambda}$ denote the Schur function corresponding to $\lambda$. Let $\delta$ be the partition $\left( n-1,n-2,\ldots,1\right) $ of $n\left( n-1\right) /2$. Then, \begin{equation} \prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) = s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) . \label{1} \tag{1} \end{equation}
This well-known formula (which, I think, is due to Jacobi) can easily be derived from the definition of Schur polynomials through alternants. Let me explain:
For every $n$-tuple $\alpha=\left( \alpha_{1},\alpha_{2},\ldots,\alpha _{n}\right) \in\mathbb{N}^{n}$ of nonnegative integers, we define the alternant $a_{\alpha}$ to be the polynomial \begin{equation} \sum_{\sigma\in S_{n}}\left( -1\right) ^{\sigma}x_{1}^{\alpha_{\sigma\left( 1\right) }}x_{2}^{\alpha_{\sigma\left( 2\right) }}\cdots x_{n}^{\alpha_{\sigma\left( n\right) }} = \det\left( \left( x_i^{\alpha_j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \end{equation} in $\mathbb{Z}\left[ x_{1},x_{2},\ldots,x_{n}\right] $. If $\alpha=\left( \alpha_{1},\alpha_{2},\ldots,\alpha_{n}\right) $ and $\beta=\left( \beta _{1},\beta_{2},\ldots,\beta_{n}\right) $ are two $n$-tuples in $\mathbb{N} ^{n}$, then the $n$-tuple $\alpha+\beta$ is defined to be $\left( \alpha _{1}+\beta_{1},\alpha_{2}+\beta_{2},\ldots,\alpha_{n}+\beta_{n}\right) $.
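(As a small illustration of this definition, here is a sympy sketch checking, for $n=3$ and the sample tuple $\alpha=(4,2,1)$ of my choosing, that the signed sum over $S_3$ agrees with the determinant $\det\left(x_i^{\alpha_j}\right)$.)

```python
# The alternant a_alpha for n = 3: signed permutation sum vs. determinant.
from itertools import permutations
from sympy import symbols, Matrix, expand

xs = symbols('x1 x2 x3')
alpha = (4, 2, 1)  # a sample exponent tuple

def sign(p):
    """Sign of a permutation given as a tuple of 0-based images."""
    s, p = 1, list(p)
    for i in range(len(p)):
        while p[i] != i:          # sort to the identity by transpositions,
            j = p[i]              # flipping the sign once per swap
            p[i], p[j] = p[j], p[i]
            s = -s
    return s

perm_sum = sum(sign(p) * xs[0]**alpha[p[0]] * xs[1]**alpha[p[1]]
               * xs[2]**alpha[p[2]]
               for p in permutations(range(3)))
det_form = Matrix(3, 3, lambda i, j: xs[i]**alpha[j]).det()
assert expand(perm_sum - det_form) == 0
```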
Any partition $\lambda=\left( \lambda_{1},\lambda_{2},\ldots,\lambda _{k}\right) $ of length $k\leq n$ will be identified with the $n$-tuple $\left( \lambda_{1},\lambda_{2},\ldots,\lambda_{k},\underbrace{0,0,\ldots ,0}_{n-k\text{ times}}\right) \in\mathbb{N}^{n}$. Notice that $\delta$ is thus identified with the $n$-tuple $\left( n-1,n-2,\ldots,1,0\right) $.
Then, the "alternant formula" for the Schur polynomial $s_{\lambda}\left( x_{1},x_{2},\ldots,x_{n}\right) $ says that \begin{equation} s_{\lambda}\left( x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\lambda+\delta }}{a_{\delta}} \label{2} \tag{2} \end{equation} whenever $\lambda$ is a partition of length $\leq n$. This is the historically first definition of Schur polynomials, long before the modern definitions via Young tableaux or the Jacobi-Trudi formulas were discovered; its main disadvantages are that it only defines $s_{\lambda}$ in finitely many variables and that it requires proof that $\dfrac{a_{\lambda+\delta} }{a_{\delta}}$ is indeed a polynomial. But this is fine for us. (For a proof of the fact that this definition of $s_\lambda$ is equivalent to the modern combinatorial definition, you can consult Corollary 2.6.6 in Darij Grinberg and Victor Reiner, Hopf algebras in combinatorics, arXiv:1409.8356v5. In the same chapter you will find an exercise proving the Jacobi-Trudi identity, which is yet another popular definition of the Schur polynomials.)
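(A concrete instance of \eqref{2}, as a sympy sketch; the choice $n=3$, $\lambda=(2,1,0)$ is mine. The ratio $a_{\lambda+\delta}/a_{\delta}$ is then the Schur polynomial $s_{(2,1)}$, and evaluating at $x_1=x_2=x_3=1$ counts the $8$ semistandard tableaux of shape $(2,1)$ with entries in $\{1,2,3\}$.)

```python
# The alternant formula s_lambda = a_{lambda+delta} / a_delta for n = 3,
# lambda = (2, 1, 0): the ratio is a polynomial, namely s_{(2,1)}.
from sympy import symbols, Matrix, cancel

xs = symbols('x1 x2 x3')
delta = (2, 1, 0)
lam = (2, 1, 0)
alph = tuple(l + d for l, d in zip(lam, delta))  # (4, 2, 0)

def alternant(alpha):
    return Matrix(3, 3, lambda i, j: xs[i]**alpha[j]).det()

s_21 = cancel(alternant(alph) / alternant(delta))
# s_{(2,1)}(1, 1, 1) counts SSYT of shape (2,1) with entries <= 3
assert s_21.subs({x: 1 for x in xs}) == 8
```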
We can now prove \eqref{1}. Indeed, $\delta=\left( n-1,n-2,\ldots,0\right) \in\mathbb{N}^{n}$, so that the definition of $a_{\delta}$ yields \begin{equation} a_{\delta}=\det\left( \left( x_{i}^{n-j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\prod_{1\leq i<j\leq n}\left( x_{i}-x_{j}\right) \label{3} \tag{3} \end{equation} (by the Vandermonde determinant formula). But $\delta+\delta=\left( 2\left( n-1\right) ,2\left( n-2\right) ,\ldots,2\cdot0\right) $, so that \begin{align} a_{\delta+\delta} & =\det\left( \left( x_{i}^{2\left( n-j\right) }\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) =\det\left( \left( \left( x_{i}^{2}\right) ^{n-j}\right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \nonumber\\ & =\prod_{1\leq i<j\leq n}\left( x_{i}^{2}-x_{j}^{2}\right) \label{4} \tag{4} \end{align} (again by the Vandermonde determinant formula). Dividing \eqref{4} by \eqref{3}, we obtain \begin{equation} \dfrac{a_{\delta+\delta}}{a_{\delta}}=\dfrac{\prod_{1\leq i<j\leq n}\left( x_{i}^{2}-x_{j}^{2}\right) }{\prod_{1\leq i<j\leq n}\left( x_{i} -x_{j}\right) }=\prod_{1\leq i<j\leq n}\underbrace{\dfrac{x_{i}^{2}-x_{j} ^{2}}{x_{i}-x_{j}}}_{=x_{i}+x_{j}}=\prod_{1\leq i<j\leq n}\left( x_{i} +x_{j}\right) . \end{equation} But applying \eqref{2} to $\lambda=\delta$, we obtain $s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) =\dfrac{a_{\delta+\delta}}{a_{\delta}} =\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) $. Thus, \eqref{1} is proven.
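(The two Vandermonde determinants \eqref{3} and \eqref{4} and their ratio can be checked directly in sympy; this is my verification sketch for $n=4$, not part of the argument.)

```python
# Check a_delta, a_{delta+delta}, and their ratio for n = 4.
from math import prod
from sympy import symbols, Matrix, cancel, expand

n = 4
xs = symbols('x1:5')  # x1, x2, x3, x4
# delta = (n-1, ..., 1, 0): a_delta is the Vandermonde determinant
a_delta = Matrix(n, n, lambda i, j: xs[i] ** (n - 1 - j)).det()
# delta + delta = (2(n-1), ..., 2, 0): a Vandermonde in the squares x_i^2
a_2delta = Matrix(n, n, lambda i, j: xs[i] ** (2 * (n - 1 - j))).det()
target = prod(xs[i] + xs[j] for i in range(n) for j in range(i + 1, n))
assert expand(cancel(a_2delta / a_delta) - target) == 0
```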
Finally, we can transform \eqref{1} into an explicit formula (if you allow determinants) for $\prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) $ in terms of the elementary symmetric polynomials. To this end, we let $e_{k}\left( \overrightarrow{x}\right) $ denote the $k$-th elementary symmetric polynomial in $x_{1},x_{2},\ldots,x_{n}$. Notice that $e_{k}\left( \overrightarrow{x}\right) =0$ whenever $k<0$ and also whenever $k>n$.
Recall that one of the two Jacobi-Trudi formulas says that if $\lambda$ is a partition of length $\leq n$, then \begin{equation} s_{\lambda^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left( \left( e_{\lambda_{i}-i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) , \end{equation} where $\lambda^{t}$ denotes the conjugate partition of $\lambda$ (this is the partition whose $i$-th entry is the number of entries of $\lambda$ that are $\geq i$, for each $i\in\left\{ 1,2,3,\ldots\right\} $). Applying this to $\lambda=\delta$, we obtain \begin{align*} s_{\delta^{t}}\left( x_{1},x_{2},\ldots,x_{n}\right) & =\det\left( \left( e_{\left( n-i\right) -i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \\ & =\det\left( \left( e_{n-2i+j}\left( \overrightarrow{x}\right) \right) _{1\leq i\leq n,\ 1\leq j\leq n}\right) \\ & =\det\left( \begin{array} [c]{cccc} e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x} \right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\ e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x} \right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\ \vdots & \vdots & \ddots & \vdots\\ e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left( \overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right) \end{array} \right) . 
\end{align*} Since $\delta^{t}=\delta$ (this is easy to check), this simplifies to \begin{equation} s_{\delta}\left( x_{1},x_{2},\ldots,x_{n}\right) =\det\left( \begin{array} [c]{cccc} e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x} \right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\ e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x} \right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\ \vdots & \vdots & \ddots & \vdots\\ e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left( \overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right) \end{array} \right) . \end{equation} Thus, \eqref{1} rewrites as \begin{equation} \prod_{1\leq i<j\leq n}\left( x_{i}+x_{j}\right) =\det\left( \begin{array} [c]{cccc} e_{n-1}\left( \overrightarrow{x}\right) & e_{n}\left( \overrightarrow{x} \right) & \cdots & e_{2n-2}\left( \overrightarrow{x}\right) \\ e_{n-3}\left( \overrightarrow{x}\right) & e_{n-2}\left( \overrightarrow{x} \right) & \cdots & e_{2n-4}\left( \overrightarrow{x}\right) \\ \vdots & \vdots & \ddots & \vdots\\ e_{-n+1}\left( \overrightarrow{x}\right) & e_{-n+2}\left( \overrightarrow{x}\right) & \cdots & e_{0}\left( \overrightarrow{x}\right) \end{array} \right) . \end{equation} Note that the matrix on the right hand side is not generally upper-triangular, but it has a lot of zeroes (more or less the whole lower half of its southwestern triangle consists of zeroes), so its determinant is a bit easier to compute than a general $n\times n$ determinant. But I don't think there is a more explicit formula.
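(The determinant formula can be evaluated mechanically; here is a sympy sketch for $n=4$, with helper names of my choosing. The entry in row $i$, column $j$, both $1$-indexed, is $e_{n-2i+j}$, with $e_0=1$ and $e_k=0$ outside $0\leq k\leq n$.)

```python
# Evaluate the determinant of (e_{n-2i+j}) for n = 4 and compare it with
# prod_{i<j} (x_i + x_j), both concretely and with formal symbols E1..E4.
from math import prod
from sympy import symbols, Matrix, Poly, expand

n = 4
xs = symbols('x1:5')
t = symbols('t')
# e_0, e_1, ..., e_n read off from prod(t + x_i) = sum_k e_k t^(n-k)
coeffs = Poly(prod(t + x for x in xs), t).all_coeffs()

def e(k):
    """e_k in x1..x4, with e_k = 0 outside 0 <= k <= n."""
    return coeffs[k] if 0 <= k <= n else 0

M = Matrix(n, n, lambda i, j: e(n - 2 * (i + 1) + (j + 1)))
target = prod(xs[i] + xs[j] for i in range(n) for j in range(i + 1, n))
assert expand(M.det() - target) == 0

# The same determinant with formal symbols E1..E4 in place of e_1..e_4
# gives the explicit answer for n = 4.
E = symbols('E0:5')

def Ef(k):
    if k == 0:
        return 1
    return E[k] if 1 <= k <= n else 0

Mf = Matrix(n, n, lambda i, j: Ef(n - 2 * (i + 1) + (j + 1)))
answer = E[1] * E[2] * E[3] - E[3] ** 2 - E[1] ** 2 * E[4]
assert expand(Mf.det() - answer) == 0
```

For $n=4$ the last row of the matrix is $\left(0,0,0,1\right)$, so the determinant reduces to its top-left $3\times3$ block, and the snippet confirms $\prod_{1\leq i<j\leq4}\left(x_i+x_j\right) = e_1e_2e_3 - e_3^2 - e_1^2e_4$, settling the case the question asked about.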