Find the gcd of the polynomials $x^6+x^4+ x^3 + x^2 + x + 1$ and $x^5 + 2x^3 + x^2 + x + 1$ in $\mathbb{Q}[x]$


So, that's my first time attempting a problem of this kind. I followed the steps of the Euclidean algorithm and got the following:

By long division for polynomials we have:

$x^6+x^4+x^3+x^2+x+1$ = $x(x^5+2x^3+x^2+x+1)+(-x^4+1)$

then we repeat the steps to get:

$x^5+2x^3+x^2+x+1$ = $-x(-x^4+1) + (2x^3+x^2+2x+1)$

I'm fine up to that point. But when I moved on, I saw that my solution was different from the correct answer. Basically, I divided $-x^4+1$ by $2x^3+x^2+2x+1$

but I got that: $-x^4+1$ = $-x/2(2x^3+x^2+2x+1) + x^3/2 + x/2 + 1$

whereas the solution says

$-x^4+1$ = $(-x/2 + 1/4) (2x^3+x^2+2x+1) + (3/4 x^2 + 3/4)$

How do they obtain that result, and where am I going wrong?
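One way to double-check the contested division step is to carry it out with exact rational arithmetic. Below is a sketch (the `polydiv` helper is mine, not part of the original post) that divides $-x^4+1$ by $2x^3+x^2+2x+1$ over $\mathbb{Q}$:

```python
from fractions import Fraction

def polydiv(num, den):
    """Long division in Q[x]. Coefficients are listed from the constant
    term up, e.g. -x^4 + 1 is [1, 0, 0, 0, -1]."""
    num = [Fraction(c) for c in num]
    den = [Fraction(c) for c in den]
    quot = [Fraction(0)] * max(len(num) - len(den) + 1, 1)
    rem = num[:]
    # Eliminate the leading term of the remainder at each step.
    for i in range(len(num) - len(den), -1, -1):
        c = rem[i + len(den) - 1] / den[-1]
        quot[i] = c
        for j, d in enumerate(den):
            rem[i + j] -= c * d
    return quot, rem

# The contested step: divide -x^4 + 1 by 2x^3 + x^2 + 2x + 1.
q, r = polydiv([1, 0, 0, 0, -1], [1, 2, 1, 2])
print(q)  # [Fraction(1, 4), Fraction(-1, 2)]  i.e. quotient -x/2 + 1/4
print(r)  # remainder 3/4 x^2 + 3/4 (plus trailing zeros)
```

The remainder $3/4\,x^2 + 3/4$ matches the quoted solution; the division must be continued until the remainder's degree drops below the divisor's degree 3.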


Accepted answer:

When computing a gcd we are allowed to multiply or divide by nonzero constants at any convenient point, thereby getting rid of fractions. Note the factorizations $$ (x^2 + 1)(x^4 +x+1) = x^6 + x^4 + x^3 + x^2 + x + 1 $$ $$ (x^2 + 1)(x^3 +x+1) = x^5 + 2 x^3 + x^2 + x + 1 $$

If, as we expect, the two extra factors are coprime, then the original gcd is $x^2 + 1.$ Here is a worthwhile confirmation:

$$ (-2x^2+ x - 4)(x^4 + x + 1) + (2 x^3 - x^2 + 2x +1)(x^3 + x + 1) = -3 $$ Notice that the outcome is not $1$ but $-3.$ However, a nonzero constant is a unit in $\mathbb{Q}[x]$, so this proves that $x^4 + x + 1$ and $x^3 + x + 1$ are coprime.

Also $$ \scriptsize (-2x^2+ x - 4)(x^6 + x^4 + x^3 + x^2 + x + 1) + (2 x^3 - x^2 + 2x +1)(x^5 + 2 x^3 + x^2 + x + 1) = -3x^2 - 3 = -3(x^2 + 1)$$
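The Bézout identity above can be checked mechanically by expanding both products and adding them. Here is a sketch with hand-rolled helpers (`polymul` and `polyadd` are mine), using low-to-high coefficient lists:

```python
from fractions import Fraction

def polymul(a, b):
    """Multiply polynomials given as low-to-high coefficient lists."""
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += Fraction(x) * Fraction(y)
    return out

def polyadd(a, b):
    """Add polynomials given as low-to-high coefficient lists."""
    out = [Fraction(0)] * max(len(a), len(b))
    for i, x in enumerate(a):
        out[i] += Fraction(x)
    for i, x in enumerate(b):
        out[i] += Fraction(x)
    return out

# Bezout certificate from the answer:
# (-2x^2 + x - 4)(x^4 + x + 1) + (2x^3 - x^2 + 2x + 1)(x^3 + x + 1)
lhs = polyadd(polymul([-4, 1, -2], [1, 1, 0, 0, 1]),
              polymul([1, 2, -1, 2], [1, 1, 0, 1]))
print(lhs)  # [-3, 0, 0, 0, 0, 0, 0]  i.e. the constant -3
```

Every non-constant coefficient cancels, confirming that the combination equals $-3$.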

Another answer:

Try rearranging some of the terms with the commutative law of addition and searching for common binomial factors.

Let $P(x) = x^6 + x^4 + x^3 + x^2 + x + 1$. Since there are six terms, we can split the expression into three pairs and look for a common binomial factor.

$\therefore P(x) = x^6 +x^3 + x^4 + x + x^2 + 1$

$ = x^3(x^3 + 1) + x(x^3 + 1) + (x^2 + 1)$

$= x(x^2+1)(x^3+1) + 1(x^2 + 1)$

$=(x^2 + 1)(x^4 + x + 1)$

Now let $G(x) = x^5 + 2x^3 + x^2 + x + 1$. Split $2x^3$ into $x^3 + x^3$ so that every coefficient is $1$ and the five terms become six.

$\therefore G(x)= x^5 + x^3 + x^3 + x + x^2 + 1$

$= x^3(x^2 + 1) + x(x^2 + 1) + 1(x^2 + 1)$

$=(x^3 + x + 1)(x^2 + 1)$

The highest common factor should be clear here, but to be thorough we must check that the cofactors $x^4 + x + 1$ and $x^3 + x + 1$ share no further common factor.

$x^3 + x + 1$ has no rational roots ($x = \pm 1$ gives $3$ and $-1$), so it is irreducible over $\mathbb{Q}$. It also does not divide $x^4 + x + 1$, since $x^4 + x + 1 = x(x^3 + x + 1) + (1 - x^2)$ leaves a nonzero remainder. Hence the two cofactors are coprime.

Hence the greatest common divisor (highest common factor) of $P(x)$ and $G(x)$ is $x^2 + 1$.
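Both factorizations can be confirmed by expanding the products; here is a minimal sketch (the `polymul` helper is mine), again with low-to-high coefficient lists:

```python
def polymul(a, b):
    """Multiply polynomials given as low-to-high coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

x2p1 = [1, 0, 1]  # x^2 + 1
print(polymul(x2p1, [1, 1, 0, 0, 1]))  # (x^2+1)(x^4+x+1) -> [1, 1, 1, 1, 1, 0, 1]
print(polymul(x2p1, [1, 1, 0, 1]))     # (x^2+1)(x^3+x+1) -> [1, 1, 1, 2, 0, 1]
```

The outputs are the coefficient lists of $x^6 + x^4 + x^3 + x^2 + x + 1$ and $x^5 + 2x^3 + x^2 + x + 1$, as claimed.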