Given quadratic integral relations for $y$ and $z$, find explicit integral relations for $y+z$ and $yz$


Let $A \subset B$ be a ring extension. If $y,z \in B$ satisfy quadratic integral dependence relations $y^2+ay+b=0$ and $z^2+cz+d=0$ over $A$, find explicit integral dependence relations for $y+z$ and $yz$.

I used the hint provided: I assumed that $1/2 \in A$ and used the quadratic formula to obtain $y$ and $z$. Then I calculated $y+z$. After a really cumbersome (though elementary) calculation I managed to obtain an integral relation for $y+z$. However, for $yz$ I think there should be an easier way, because even for $y+z$, which involves two square roots, the calculation was long. I would appreciate any solution or hint.

There are 3 answers below.

BEST ANSWER

You might be interested in the resultant construction. Its main advantage is that it generalizes to integral relations of arbitrary degree (where explicit factoring gets ugly fast) while keeping all of your computations in $A$. To my knowledge, no other method handles the product of integral elements comparably well.

Let $f = \sum_{i=0}^{n} f_i x^i, g = \sum_{j=0}^m g_j x^j$ be polynomials in $A[x]$.

The resultant $\operatorname{Res}_x(f, g)$ of two polynomials is defined as $$\operatorname{Res}_x(f, g) = f_n^m g_m^n\prod_{i,j} (\alpha_i -\beta_j),$$ where the $\alpha_i$ and $\beta_j$ range over the roots of $f$ and $g$, respectively.

(N.B. you can always find a ring extension of $B$ in which $f$ and $g$ totally factor, since they are monic, even if $B$ is not a domain. See Bourbaki's Commutative Algebra early Chapter V or a standard reference on integral dependence for details. Of course in a domain the algebraic closure of the quotient field will do).

A fundamental property of the resultant is that it is also the determinant of the Sylvester matrix of $f$ and $g$, the $(n+m) \times (n+m)$ matrix whose first $m$ rows hold shifted copies of the coefficients of $f$ and whose last $n$ rows hold shifted copies of the coefficients of $g$:

$$\begin{bmatrix} f_n & \cdots & f_0 & & \\ & \ddots & & \ddots & \\ & & f_n & \cdots & f_0 \\ g_m & \cdots & g_0 & & \\ & \ddots & & \ddots & \\ & & g_m & \cdots & g_0 \end{bmatrix}$$
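For concreteness, the determinant characterization can be checked in a computer algebra system. A minimal sketch, assuming SymPy is available; the polynomials $f$ and $g$ below are arbitrary illustrative examples:

```python
# Build the Sylvester matrix of two polynomials and check that its
# determinant agrees with SymPy's built-in resultant.
from sympy import symbols, Matrix, Poly, resultant

x = symbols('x')
f = Poly(x**2 + 3*x + 1, x)   # n = 2 (example polynomial)
g = Poly(x**2 - 2*x + 5, x)   # m = 2 (example polynomial)

def sylvester(f, g):
    n, m = f.degree(), g.degree()
    fc, gc = f.all_coeffs(), g.all_coeffs()   # highest-degree first
    S = Matrix.zeros(n + m, n + m)
    for i in range(m):                # m shifted rows of f's coefficients
        for j, c in enumerate(fc):
            S[i, i + j] = c
    for i in range(n):                # n shifted rows of g's coefficients
        for j, c in enumerate(gc):
            S[m + i, i + j] = c
    return S

S = sylvester(f, g)
assert S.det() == resultant(f.as_expr(), g.as_expr(), x)
```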

Now let $a,b \in B$ be integral over $A$ with respective integral relations $f,g \in A[x]$.

We'll show that the resultant $h = \operatorname{Res}_z(f(z), g(x-z))$ is a monic polynomial in $A[x]$ such that $h(a+b) = 0$.

We compute $$h = \prod_{i,j} (\alpha_i + \beta_j - x),$$ since the roots of $g(x-z)$, viewed as a polynomial in $z$, are $x - \beta_j$. Since $a = \alpha_i$ and $b = \beta_j$ for some $i,j$, it is clear that $h(a+b)= 0$ and that $h$ is monic (though you might have to adjust for sign). Meanwhile, the Sylvester-matrix determinant characterization shows that $h$ is a polynomial in $A[x]$, so $h \in A[x]$ is indeed an integral relation for $a+b$.

An advantage to working this way is that all of your computations to produce $h$ remain in $A[x]$, and in your case of quadratic dependences, calculating the determinant of a $4 \times 4$ matrix is not so bad.
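In the quadratic case the whole computation fits in a few lines. A sketch, assuming SymPy; the spot-check values $y=\sqrt2$ and $z=\sqrt3$ are illustrative choices, not part of the problem:

```python
# Compute h(x) = Res_z(f(z), g(x - z)) symbolically in a, b, c, d,
# then spot-check that h vanishes at a concrete sum of integral elements.
from sympy import symbols, resultant, expand, sqrt, simplify, Poly

x, z, a, b, c, d = symbols('x z a b c d')
f = z**2 + a*z + b                # integral relation of y
g = (x - z)**2 + c*(x - z) + d    # g evaluated at x - z

h = expand(resultant(f, g, z))    # monic quartic with coefficients in Z[a,b,c,d]

# Spot check: y = sqrt(2) satisfies y^2 - 2 = 0, z = sqrt(3) satisfies z^2 - 3 = 0.
val = h.subs({a: 0, b: -2, c: 0, d: -3, x: sqrt(2) + sqrt(3)})
assert simplify(val) == 0
```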

You can do something similar to get an integral relation for the product of integral elements. We only needed the Sylvester-matrix determinant characterization of the resultant to show that the computations stay in $A$. Alternatively, one can argue that the resultant is a symmetric polynomial in the $\alpha_i$ and the $\beta_j$ and use Vieta's formulas (which hold for any polynomial with a regular leading coefficient) to show that the resultant lies in $A[x]$.

This argument transfers just as well to the polynomial $\prod_{i,j}(x-\alpha_i \beta_j)$, which is therefore an integral relation for $ab$.
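A sketch of the product version, again assuming SymPy. The identity used here, $\prod_{i,j}(x-\alpha_i \beta_j) = \operatorname{Res}_z(f(z),\, z^m g(x/z))$, is the standard resultant formula for products, and the numeric spot check is an illustrative choice:

```python
# Compute p(x) = Res_z(f(z), z^m * g(x/z)), whose roots are the products
# alpha_i * beta_j, symbolically in a, b, c, d.
from sympy import symbols, resultant, expand, sqrt, simplify, Poly

x, z, a, b, c, d = symbols('x z a b c d')
f = z**2 + a*z + b
g_scaled = expand(z**2 * ((x/z)**2 + c*(x/z) + d))  # = x**2 + c*x*z + d*z**2

p = expand(resultant(f, g_scaled, z))

# Spot check with y = sqrt(2), z = sqrt(3): yz = sqrt(6) should be a root.
val = p.subs({a: 0, b: -2, c: 0, d: -3, x: sqrt(6)})
assert simplify(val) == 0
```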

ANSWER

Write $yz\cdot\begin{pmatrix}1\\y\\z\\yz\end{pmatrix}= M\begin{pmatrix}1\\y\\z\\yz\end{pmatrix}$, where $M$ is a $4\times 4$ matrix whose entries depend on $a,b,c,d$ (one can easily calculate it using $y^2=-ay-b$ and $z^2=-cz-d$; the letter $M$ avoids a clash with the ring $A$). Now observe that $yz$ is an eigenvalue of $M$, hence a root of its characteristic polynomial, which is monic with coefficients in $A$. The same can be done for $y+z$.
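This can be checked mechanically. A sketch assuming SymPy, with the matrix of multiplication by $yz$ on $(1, y, z, yz)$ written out explicitly (named `M` in the code to avoid clashing with the ring $A$; the spot-check values are illustrative):

```python
# Build the multiplication-by-yz matrix from the rules y^2 = -a*y - b and
# z^2 = -c*z - d, then verify that yz is a root of its characteristic polynomial.
from sympy import symbols, Matrix, Poly, sqrt, simplify

lam, a, b, c, d = symbols('lam a b c d')
# Action of multiplication by yz on the tuple (1, y, z, yz):
M = Matrix([
    [0,   0,   0,   1],    # yz * 1  = yz
    [0,   0,  -b,  -a],    # yz * y  = y^2 z = -b*z - a*yz
    [0,  -d,   0,  -c],    # yz * z  = y z^2 = -d*y - c*yz
    [b*d, a*d, b*c, a*c],  # yz * yz = y^2 z^2 = (ay+b)(cz+d)
])
p = M.charpoly(lam).as_expr()   # monic quartic with coefficients in Z[a,b,c,d]

# Spot check: y = sqrt(2) (a=0, b=-2), z = sqrt(3) (c=0, d=-3), yz = sqrt(6).
val = p.subs({a: 0, b: -2, c: 0, d: -3, lam: sqrt(6)})
assert simplify(val) == 0
```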

ANSWER

Let $y=y_1$, $y_2$ be the roots of the first equation and $z=z_1$, $z_2$ the roots of the second equation. Then $X=y+z$ is a root of $$(X-(y_1+z_1))(X-(y_1+z_2))(X-(y_2+z_1))(X-(y_2+z_2)).$$ Expand and express the coefficients of this polynomial in terms of $a$, $b$, $c$, $d$.

If you just want a quick result, use elimination with Gröbner bases and some software. The result is fairly long: an equation of degree $4$ in $X$ (Wolfram Alpha, for instance, will produce it).
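For example, a sketch of the elimination in SymPy, using the concrete relations $y^2-2=0$ and $z^2-3=0$ as an illustration (the general symbolic version works the same way with $a,b,c,d$ left as parameters):

```python
# Eliminate y and z with a lex-order Groebner basis to recover the relation
# satisfied by X = y + z.
from sympy import symbols, groebner

y, z, X = symbols('y z X')
# Concrete instance: y = sqrt(2), z = sqrt(3).
G = groebner([y**2 - 2, z**2 - 3, X - y - z], y, z, X, order='lex')

# The basis element involving only X generates the elimination ideal.
elim = [e for e in G.exprs if e.free_symbols <= {X}]
print(elim[0])   # the minimal polynomial of sqrt(2) + sqrt(3): X**4 - 10*X**2 + 1
```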