difference in rationality of $a+b$ and $a^n + b^n$


Do there exist real numbers $a$ and $b$ such that

(i) $a+b$ is rational and $a^n+b^n$ is irrational for each natural $n \geq 2$;

(ii) $a+b$ is irrational and $a^n+b^n$ is rational for each natural $n \geq 2$?

For (i), I tried to prove yes: I was thinking of some rational $x$ and irrational $z$ such that $a = x+z$, $b = x-z$, but I don't quite know how to show $a^n + b^n$ is always irrational for $n \geq 2$. I tried induction, but since irrational $+$ irrational need not be irrational, I'm at a loss as to what to do.

For (ii), I tried to prove no by factorizing $a^n + b^n$ for some odd $n$, say $a^3 + b^3 = (a+b)(a^2 - ab + b^2)$, and then showing that $\frac{a^n + b^n}{a+b}$ is rational for some odd $n$, but I don't know what to do next.


Study case (i)

Let $s=a+b$ and $p=ab$; then $a,b$ are the solutions of $x^2-sx+p=0$.

Regarding this as the characteristic equation of a linear recurrence, we get $$\begin{cases}u_{n+2}=su_{n+1}-pu_n\\u_n=a^n+b^n\end{cases}$$

Note that $u_0=2$ and $u_1=s$; therefore, if both $s$ and $p$ are rational, then by induction every $u_n$ is rational.

Therefore the only chance at a counterexample is to have $p$ irrational.

This is only a necessary condition.
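
As a quick sanity check of this recurrence, here is a minimal sketch; the rational values of $s$ and $p$ below are arbitrary choices, not dictated by the problem:

```python
from fractions import Fraction
from math import sqrt

# Pick rational s = a + b and p = ab; a, b are then the roots of x^2 - s*x + p = 0.
s, p = Fraction(3), Fraction(1)

# u_n = a^n + b^n satisfies u_{n+2} = s*u_{n+1} - p*u_n with u_0 = 2, u_1 = s.
u = [Fraction(2), s]
for _ in range(10):
    u.append(s * u[-1] - p * u[-2])

# Every u_n is an exact Fraction, i.e. rational, as the induction predicts.
assert all(isinstance(x, Fraction) for x in u)

# Cross-check against a^n + b^n computed from the actual roots.
d = sqrt(float(s * s - 4 * p))
a, b = (float(s) + d) / 2, (float(s) - d) / 2
assert all(abs(float(u[n]) - (a**n + b**n)) < 1e-6 for n in range(12))
```

(With these particular values, $u_n$ turns out to be the Lucas number $L_{2n}$: $2, 3, 7, 18, 47, \dots$)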


Edit 1:

We can try to prove it for some particular set of numbers.

Let's work in the ring $\mathbb Q\Big[\sqrt{k}\Big]$ for instance.

I claim this works for any non-square integer $k\ge 2$, taking $\begin{cases}a=k+\sqrt{k}\\b=1-\sqrt{k}\end{cases}$ (if $k$ is a perfect square, then $\sqrt{k}$ is rational and so is every $u_n$).

$$\require{cancel}\begin{cases}s=a+b=k+1\in\mathbb Q\\ p=ab=(k+\sqrt{k})(1-\sqrt{k})=\cancel{k}-k\sqrt{k}+\sqrt{k}-\cancel{k}=(1-k)\sqrt{k}\in\mathbb R\setminus\mathbb Q\end{cases}$$

Now since $u_n=a^n+b^n\in\mathbb Q\Big[\sqrt{k}\Big]$ too we can set:

$$u_n=\alpha_n+\beta_n\sqrt{k}$$

The linear recurrence relation gives (skipping the calculations):

$$\begin{cases} \alpha_0=2,\ \beta_0=0\\ \alpha_1=s,\ \beta_1=0\\ \alpha_{n+2}=s\,\alpha_{n+1}+k(k-1)\,\beta_{n}\\ \beta_{n+2}=s\,\beta_{n+1}+(k-1)\,\alpha_{n} \end{cases}$$

Since $s>1$ and $k-1>0$ and all initial terms are non-negative, induction gives $\alpha_n\nearrow$ and $\beta_n\nearrow$; in particular $\beta_2=(k-1)\alpha_0=2(k-1)>0$.

In particular, for every $n\ge 2$, $$\beta_n>0\implies u_n=\alpha_n+\beta_n\sqrt{k}\in\mathbb R\setminus\mathbb Q$$ since $\sqrt{k}$ is irrational.
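
A small numerical sketch of the $\alpha/\beta$ recurrence, taking $k=2$ as an arbitrary non-square value:

```python
from math import sqrt

k = 2                      # arbitrary non-square integer >= 2
s = k + 1                  # s = a + b
a, b = k + sqrt(k), 1 - sqrt(k)

# alpha_n, beta_n: rational and sqrt(k) parts of u_n = alpha_n + beta_n*sqrt(k)
alpha, beta = [2, s], [0, 0]
for n in range(10):
    alpha.append(s * alpha[n + 1] + k * (k - 1) * beta[n])
    beta.append(s * beta[n + 1] + (k - 1) * alpha[n])

# u_n computed directly agrees with alpha_n + beta_n*sqrt(k) ...
for n in range(12):
    assert abs((a**n + b**n) - (alpha[n] + beta[n] * sqrt(k))) < 1e-6

# ... and beta_n > 0 from n = 2 on, so u_n is irrational there.
assert all(beta[n] > 0 for n in range(2, 12))
```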


Study case (ii)

We still have the relation $$p=\frac {s\,u_{n+1}-u_{n+2}}{u_n}$$

Notice that since $s$ is irrational while all $u_n$ with $n\ge 2$ are rational by hypothesis, this relation forces $p$ to be irrational too: for any $n\ge 2$ with $u_n\neq 0$ and $u_{n+1}\neq 0$, the numerator $s\,u_{n+1}-u_{n+2}$ is irrational, and dividing by the rational $u_n$ preserves irrationality.

Though I suspect this case is not possible, I can't seem to find the decisive blow...

Edit 2:

See AAA's answer: by exploiting ${u_n}^2$ one can prove that $p$ has to be rational too, which contradicts the conclusion above, so (ii) is not possible.


For (ii), I claim that if $a^n+b^n$ is rational for all $n\geq 2$, then $a+b$ is rational.

Proof: Let $f_n=a^n+b^n$. Then $f_n^2-f_{2n}=2(ab)^n$, so $2(ab)^n$ is rational for all $n\geq 2$. Suppose first that $ab\neq 0$: dividing $2(ab)^3$ by $2(ab)^2$ shows that $ab$ is rational, hence $f_2-ab=a^2-ab+b^2$ is rational. Since $f_3=(a^2-ab+b^2)f_1$ and $a^2-ab+b^2>0$ for real $a,b$ not both zero, we conclude that $f_1$ is rational. If instead $ab=0$, say $b=0$, then either $a=0$ (so $f_1=0$) or $f_1=a=f_3/f_2$ is rational.
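
The identity $f_n^2-f_{2n}=2(ab)^n$ driving this proof can be checked numerically; the values of $a$ and $b$ below are arbitrary reals chosen for illustration:

```python
from math import sqrt

a, b = 1 + sqrt(2), 3.5    # arbitrary illustrative reals

def f(n):
    """f_n = a^n + b^n as in the proof."""
    return a**n + b**n

# (a^n + b^n)^2 - (a^{2n} + b^{2n}) = 2 a^n b^n = 2 (ab)^n
for n in range(1, 8):
    assert abs(f(n)**2 - f(2 * n) - 2 * (a * b)**n) < 1e-6

# The final step: f_3 = (a^2 - ab + b^2) * f_1
assert abs(f(3) - (a * a - a * b + b * b) * f(1)) < 1e-9
```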