Proof verification for Spivak Chapter 10 Problem 27


I would like some advice on this proof which I have worked out for Spivak's Calculus, Chapter 10 Problem 27(a). The question is as follows:

Suppose that $a$ and $b$ are two consecutive roots of a polynomial function but that $a$ and $b$ are not double roots, so that we can write $f(x) = (x-a)(x-b)g(x)$ where $g(a) \not= 0$ and $g(b) \not= 0$.
Prove that $g(a)$ and $g(b)$ have the same sign. (Remember that $a$ and $b$ are consecutive roots.)
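Before attempting the proof, a quick numerical sanity check of the claim (not part of the proof, and using an illustrative polynomial of my own choosing): take $f(x) = (x-1)(x-2)(x-5)$, so $a = 1$ and $b = 2$ are consecutive simple roots and $g(x) = x - 5$.

```python
# Sanity check of the claim on an illustrative polynomial:
# f(x) = (x - 1)(x - 2)(x - 5), so a = 1 and b = 2 are
# consecutive simple roots and g(x) = x - 5.
def g(x):
    return x - 5

a, b = 1, 2
print(g(a), g(b))       # both values are negative
print(g(a) * g(b) > 0)  # True: g(a) and g(b) share a sign
```

Here $g(1) = -4$ and $g(2) = -3$ indeed have the same sign, consistent with the statement to be proved.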

My attempt:

Suppose that $g(a) > 0$. Since $g$ is continuous at $a$, $\exists\delta > 0$ such that

$$ 0 < |x-a| < \delta \Rightarrow \left|g(x) - g(a)\right| < \frac{g(a)}{2} $$

This can be expanded out to give:

\begin{align*} -\frac{g(a)}{2} &< g(x) - g(a) < \frac{g(a)}{2} \\ \frac{g(a)}{2} &< g(x) < \frac{3}{2} \cdot g(a) \end{align*}
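As a concrete instance of this expansion (with the illustrative value $g(a) = 4$, chosen only for this check): any $g(x)$ within $g(a)/2 = 2$ of $g(a)$ must lie strictly between $2$ and $6$, hence stays positive.

```python
# Concrete instance of the expanded inequality, with the
# illustrative value g_a = 4: |g_x - g_a| < g_a / 2 forces
# g_a / 2 < g_x < (3/2) * g_a, so g_x stays positive.
g_a = 4.0
for g_x in (2.5, 4.0, 5.9):  # sample values satisfying the bound
    assert abs(g_x - g_a) < g_a / 2
    assert g_a / 2 < g_x < 1.5 * g_a
    assert g_x > 0
print("all sample values stay in (2.0, 6.0)")
```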

Given that $a$ and $b$ are consecutive roots, we can express this as:

$$ |a-b| = |b-a| = 1 $$

Intuitively, we would like to choose $\delta$ large enough that $x = b$ satisfies $0 < |x-a| < \delta$, so that the above inequality applies at $b$.

Hence, we choose $\delta = 2$. Then,

\begin{align*} 0 < |x-a| < \delta &\Rightarrow \left|g(x) - g(a)\right| < \frac{g(a)}{2} \\ &\Rightarrow \frac{g(a)}{2} < g(x) < \frac{3}{2} \cdot g(a) \end{align*}

Since $g(a) > 0$, this gives $g(x) > 0$. In particular, for $x = b$,

$$ |b - a| = 1 < \delta = 2 \Rightarrow \frac{g(a)}{2} < g(b) < \frac{3}{2} \cdot g(a) $$

Therefore, $g(b) > 0$ as well and $g(a)$ and $g(b)$ have the same sign.

Now, suppose $g(a) < 0$. Then, arguing as in the previous case with the bound $\left|g(x) - g(a)\right| < \frac{|g(a)|}{2}$, we obtain $\frac{3}{2} \cdot g(a) < g(x) < \frac{g(a)}{2} < 0$, so $g(x) < 0$. Then, for $x = b$, it follows that $\frac{3}{2} \cdot g(a) < g(b) < \frac{g(a)}{2}$ and $g(b) < 0$ as well. Thus $g(a)$ and $g(b)$ have the same sign. $\square$

What do you all think? Am I missing any important points/making any sweeping assumptions? Please do let me know! Any pointers on how to make this more succinct would be greatly appreciated as well!

Best answer:

If $g(a)$ and $g(b)$ have different signs, then by the Intermediate Value Theorem (the polynomial $g$ is continuous) $g$ must have a zero somewhere between $a$ and $b$, say at $c$. But then $f(c) = (c-a)(c-b)g(c) = 0$, contradicting that $a$ and $b$ were consecutive zeroes of $f$.
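This contradiction can be illustrated numerically (a sketch with an illustrative polynomial of my own choosing, not part of the proof): if $g$ took different signs at $a$ and $b$, bisection would locate a root $c$ of $g$ strictly between them, and then $f(c) = 0$ as well.

```python
# Illustration of the contradiction: a sign change of g on [a, b]
# yields (by bisection, i.e. the IVT) a root c of g with a < c < b,
# and then f(c) = (c - a)(c - b) * g(c) = 0 too.  Illustrative
# choice: g(x) = x - 1.5 deliberately DOES change sign on [1, 2].
def bisect_root(h, lo, hi, tol=1e-12):
    # assumes h(lo) and h(hi) have opposite signs
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(lo) * h(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

a, b = 1.0, 2.0
g = lambda x: x - 1.5
f = lambda x: (x - a) * (x - b) * g(x)

c = bisect_root(g, a, b)
print(c)          # approximately 1.5, strictly between a and b
print(abs(f(c)))  # approximately 0: a root of f between the
                  # supposedly consecutive roots a and b
```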