Here is the question whose solution I am trying to understand:
Let $R = \mathbb Z + x \mathbb Q[x] \subset \mathbb Q[x]$ be the set of polynomials in $x$ with rational coefficients whose constant term is an integer.
$(a)$ Suppose that $f(x), g(x) \in \mathbb Q[x]$ are two nonzero polynomials with rational coefficients and that $x^r$ is the largest power of $x$ dividing both $f(x)$ and $g(x)$ in $\mathbb Q[x]$ (i.e., $r$ is the minimum of the degrees of the lowest-order terms appearing in $f(x)$ and $g(x)$). Let $f_r$ and $g_r$ be the coefficients of $x^r$ in $f(x)$ and $g(x)$, respectively (at least one of which is nonzero by the definition of $r$), and write $\mathbb Z f_r + \mathbb Z g_r = \mathbb Z d_r$ for some $d_r \in \mathbb Q$. Prove that there is a polynomial $d(x) \in \mathbb Q[x]$ that is a gcd of $f(x)$ and $g(x)$ and whose term of minimal degree is $d_r x^r$.
$(b)$ Prove that $f(x) = d(x)q_1(x)$ and $g(x) = d(x)q_2(x)$ where $q_1(x)$ and $q_2(x)$ are elements of the subring $R$ of $\mathbb Q[x].$
Here are some thoughts:
I know that: $R$ is an integral domain by the previous problem.
How can I show the existence of such a $d(x)$? Should I prove that $R$ is a Bézout domain, or should I simply exhibit a candidate $d(x)$ and prove that it is a gcd?
For part (b), I do not know exactly what I should do.
Any help will be greatly appreciated!
Edit:
Here is an attempt at simply exhibiting such a $d(x)$:
Let $d(x) \in \mathbb Q[x]$ be a greatest common divisor of $f(x)$ and $g(x)$ in $\mathbb Q[x]$. Since $x^r$ divides both $f(x)$ and $g(x)$, we must have $x^r \mid d(x)$. If $x^{r+1} \mid d(x)$, then $x^{r+1}$ would divide both $f(x)$ and $g(x)$, contradicting the maximality of $r$. Therefore the term of minimal degree in $d(x)$ has degree exactly $r$; let $d_t$ be its coefficient. Now gcds in $\mathbb Q[x]$ are unique up to multiplication by nonzero rationals (the units of $\mathbb Q[x]$), so we may replace $d(x)$ by $(d_r/d_t)\, d(x)$: this is still a gcd of $f(x)$ and $g(x)$, and its term of minimal degree is $d_r x^r$, as required.
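As a quick numeric sanity check (not part of the proof; the helper name `rational_gcd` and the example values are my own): for rationals $f_r, g_r$, a generator $d_r$ of the subgroup $\mathbb Z f_r + \mathbb Z g_r \subseteq \mathbb Q$ can be computed by clearing denominators and taking an integer gcd.

```python
from fractions import Fraction
from math import gcd, lcm  # math.lcm requires Python 3.9+

def rational_gcd(p: Fraction, q: Fraction) -> Fraction:
    """Nonnegative generator d of the subgroup Z*p + Z*q of Q,
    so that Z*p + Z*q = Z*d."""
    n = lcm(p.denominator, q.denominator)   # common denominator
    a, b = int(p * n), int(q * n)           # integer numerators over n
    return Fraction(gcd(a, b), n)

# Example: f(x) = (2/3)x + x^2, g(x) = (1/2)x, so r = 1, f_r = 2/3, g_r = 1/2.
d_r = rational_gcd(Fraction(2, 3), Fraction(1, 2))
print(d_r)  # 1/6, i.e. Z*(2/3) + Z*(1/2) = Z*(1/6)
```

Here a gcd of $f$ and $g$ in $\mathbb Q[x]$ is $x$ up to a rational unit, and rescaling it to $(1/6)x$ gives a gcd whose minimal-degree term is $d_r x^r$.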
Note: since $x^r$ is the largest power of $x$ dividing both $f(x)$ and $g(x)$, we can write $f(x) = f_r x^r + \dots$ and $g(x) = g_r x^r + \dots$; without loss of generality, assume $f_r \neq 0$.
Can I use the Bézout identity here? If so, why? I know that Dummit & Foote give it in Exercise 7 of Section 8.2, but why is it applicable here?
How can I rigorously prove that $d(x) = d_r x^r$, if that guess is even correct?
The statement $\Bbb Z f_r + \Bbb Z g_r = \Bbb Z d_r$ implies (how?) that $f(x)=d_rx^rP(x)$ and $g(x) = d_rx^rQ(x)$, where $P(x), Q(x) \in R$, and the constant coefficients $\hat p,\hat q$ of $P(x), Q(x)$ are coprime.
Let $h(x) \in R$ be the gcd (in $\Bbb Q[x]$) of $P(x)$ and $Q(x)$, normalized to have constant coefficient $1$; such an $h(x)$ exists and is unique. Set $d(x) = d_r x^r h(x)$. Then $f(x)/d(x) = P(x)/h(x)$ and $g(x)/d(x) = Q(x)/h(x)$ have constant terms $\hat p$ and $\hat q$, respectively; since these are integers, both quotients lie in $R$, which proves (b).
I think I can also prove that $h$ is a $\Bbb Z$-linear combination of $P(x)$ and $Q(x)$, but that's not asked in the problem.
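A concrete check of this construction (a sketch using sympy; the example polynomials and the rescaling step are my own, chosen so that $r = 1$, $f_r = 2/3$, $g_r = 1/2$, $d_r = 1/6$):

```python
from sympy import symbols, Rational, gcd, cancel, Poly

x = symbols('x')

# Example: f(x) = (2/3)x(1+x), g(x) = (1/2)x(1+x).
# Here r = 1, f_r = 2/3, g_r = 1/2, and Z*(2/3) + Z*(1/2) = Z*(1/6), so d_r = 1/6.
f = Rational(2, 3) * x * (1 + x)
g = Rational(1, 2) * x * (1 + x)
d_r = Rational(1, 6)

d0 = gcd(f, g)  # some gcd of f, g in Q[x] (normalization up to a rational unit)
# Lowest nonzero coefficient of d0 (all_coeffs lists highest degree first):
low = next(c for c in reversed(Poly(d0, x).all_coeffs()) if c != 0)
d = (d_r / low) * d0  # rescale so the term of minimal degree is d_r * x^r

q1 = cancel(f / d)    # = 4: constant term is an integer, so q1 is in R
q2 = cancel(g / d)    # = 3: likewise in R, and gcd(4, 3) = 1
print(d, q1, q2)
```

The quotients come out as the coprime integers $\hat p = 4$ and $\hat q = 3$, matching the claim that $f(x)/d(x)$ and $g(x)/d(x)$ land in $R$ with coprime constant terms.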