If $f(x)=x^2+ax+b$ has integer roots, and $f(x+\frac1x)=f(x)+f(\frac1x)$, then find $a^2+b^2$


Let $f(x) = x^2 + ax + b$. If for all non-zero real $x$ $$f\left(x + \frac1x\right) = f(x) + f\left(\frac1x\right)$$ and the roots of $f(x) = 0$ are integers, what is the value of $a^2 + b^2$?


EDIT: The answer is a two-digit number.

EDIT: What I have tried so far: starting from $$f(x) = x^2 + ax + b = 0,$$ I assigned temporary values and tried to work in terms of $a + b$, which I then squared to get $$a^2 + b^2 = (a+b)^2 - 2ab.$$

I don't really know whether what I am doing is correct, so please help me out!


There are 3 answers below.


Hints:

  1. Try say $x=1$ with the condition given, to immediately get $b$.
  2. Use the fact that roots are integers and factors of $b$ to find possible values of $a$.

Even though there is more than one possible value for $a$, you should find a unique value for $a^2+b^2$.
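The two hints can be checked with a short Python sketch (my own, not part of the original answer): step 1 pins down $b$, and step 2 enumerates the candidate integer roots.

```python
# Sketch of the hints above (assumed helper logic, not from the original post).
# Step 1: x = 1 turns f(x + 1/x) = f(x) + f(1/x) into f(2) = 2*f(1):
#   (4 + 2a + b) = (2 + 2a + 2b)  =>  b = 2.
b = 2

# Step 2: an integer root r must divide b, so try r in {1, -1, 2, -2};
# f(r) = 0 then fixes a = -(r*r + b) / r whenever that quotient is an integer.
a_values = set()
for r in (1, -1, 2, -2):
    a, rem = divmod(-(r * r + b), r)
    if rem == 0:
        a_values.add(a)

print(sorted(a_values))                   # [-3, 3]
print({a * a + b * b for a in a_values})  # {13}
```

Both possible values of $a$ give the same $a^2 + b^2$, as the hint promised.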


This is a completely routine matter of writing out the given ingredients: you have $$f(x)=x^2+ax+b\qquad\text{ and }\qquad f(x+\tfrac1x)=f(x)+f(\tfrac1x)$$ for all non-zero $x$, so simply write out what $f(x+\tfrac1x)$ and $f(\tfrac1x)$ are. We have \begin{eqnarray*} f(x+\tfrac1x)&=&(x+\tfrac1x)^2+a(x+\tfrac1x)+b\\ &=&(x^2+2+(\tfrac1x)^2)+ax+a\tfrac1x+b\\ &=&(x^2+ax+b)+((\tfrac1x)^2+a\tfrac1x+b)-b+2\\ &=&f(x)+f(\tfrac1x)-b+2, \end{eqnarray*} and so $f(x+\tfrac1x)=f(x)+f(\tfrac1x)$ implies that $b=2$.

The roots of $f(x)=0$ are integers, so $$0=f(x)=x^2+ax+2$$ has integer roots. These must be divisors of $2$, i.e. at least one of $1$, $-1$, $2$ and $-2$ is a root of $f(x)=0$. These options correspond to four linear equations in $a$: \begin{eqnarray*} 0=f(1)&=&1^2+a\cdot1+2=3+a,\\ 0=f(2)&=&2^2+a\cdot2+2=6+2a,\\ 0=f(-1)&=&(-1)^2+a\cdot(-1)+2=3-a,\\ 0=f(-2)&=&(-2)^2+a\cdot(-2)+2=6-2a, \end{eqnarray*} and so either $a=3$ or $a=-3$, hence $a^2+b^2=9+4=13$.
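The key expansion in this answer can be verified symbolically; a quick check with SymPy (assuming the `sympy` package is available) confirms that the functional equation forces $b=2$:

```python
# SymPy check of the expansion above (assumes the sympy package is installed).
from sympy import symbols, simplify

x, a, b = symbols('x a b')
f = lambda t: t**2 + a*t + b

# f(x + 1/x) - f(x) - f(1/x) should simplify to 2 - b,
# so the functional equation forces b = 2.
diff = simplify(f(x + 1/x) - f(x) - f(1/x))
print(diff)  # 2 - b
```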


As described in Servaes' answer, deduce that $b=2$ by comparing coefficients, and that the roots belong to the set $\{\pm 1, \pm 2 \}$ by the Rational Root Theorem.

At this point you don't have to solve any equations. By Vieta's formulas, the product of the roots equals $b=2$, so the roots have the same sign; the possible pairs are $1,2$ and $-1,-2$. The sum of the roots equals $-a$, which gives $a = \pm 3$. Then $a^2 + b^2 = 13$, as before.
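The Vieta route can also be checked in a few lines of Python (my own sketch, not from the original answer):

```python
# Vieta check: for x^2 + a*x + b, the sum of roots is -a and the product is b.
# With b = 2, the same-sign integer pairs multiplying to 2 are (1, 2) and (-1, -2).
results = set()
for r1, r2 in [(1, 2), (-1, -2)]:
    a = -(r1 + r2)   # sum of roots = -a
    b = r1 * r2      # product of roots = b
    results.add(a * a + b * b)

print(results)  # {13}
```

Both root pairs lead to the same value of $a^2+b^2$, which is why the answer is unique even though $a$ is not.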