Let $a\in \mathbb{N}$, $a > 1$. Show that
$$\{a^2+a-1,a^3+a^2-1,...\}$$
contains an infinite subset $S$ such that every two distinct elements are coprime.
I don't even know how to approach this; can you give me some places to start? Attempt:
$a^3+a^2-1=a(a^2+a-1)+(a-1)$, so any common divisor of the two also divides $a-1$. Using the Euclidean algorithm again we determine that:
$a^2+a-1=(a+2)(a-1)+1\implies (a^3+a^2-1,a^2+a-1)=(a^2+a-1,a-1)=1$, so they are relatively prime.
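Since both identities hold for every integer $a$, the coprimality of this particular pair can be sanity-checked numerically (a quick check added here, not part of the proof):

```python
# Sanity check (added illustration, not part of the proof):
# the identity a^2 + a - 1 = (a+2)(a-1) + 1 forces the gcd to divide 1,
# so gcd(a^3 + a^2 - 1, a^2 + a - 1) = 1 for every integer a > 1.
from math import gcd

assert all(gcd(a**3 + a**2 - 1, a**2 + a - 1) == 1 for a in range(2, 1000))
```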
I was thinking of something like this:
Let $p,q \in\mathbb{N}$, $p > q > 1$; then two arbitrary elements of that set are $a^p+a^{p-1}-1$ and $a^q+a^{q-1}-1$. Then we need to show that:
$$(a^p+a^{p-1}-1,a^q+a^{q-1}-1)=1$$
Right?
This is not an answer, as John Omielen commented below. It just responds to Peter's comment on the question by proving that, as polynomials in $a$, the elements of the set are pairwise coprime. I do not see a direct proof of the original question from this fact.
By the Euclidean algorithm, for any pair of natural numbers $p > q$,
$$ \begin{aligned} \gcd(a^{p+1} + a^p - 1,\; a^{q+1} + a^q - 1) &= \gcd\bigl(a^{p+1} + a^p - 1 - a^{p-q}(a^{q+1} + a^q - 1),\; a^{q+1} + a^q - 1\bigr)\\ &= \gcd(a^{p-q}-1,\; a^{q+1} + a^q - 1). \end{aligned} $$
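The same reduction is valid over the integers for any fixed $a$, since we only subtract an integer multiple of the second argument. A small script (an added illustration, with arbitrarily chosen sample exponents) confirms it:

```python
# Check the Euclidean reduction step over the integers: subtracting
# a^{p-q} * (a^{q+1} + a^q - 1) from a^{p+1} + a^p - 1 leaves a^{p-q} - 1,
# so both pairs have the same gcd. The sample exponents are arbitrary.
from math import gcd

for p, q in [(3, 1), (5, 2), (7, 4)]:
    for a in range(2, 100):
        f = a**(p + 1) + a**p - 1
        g = a**(q + 1) + a**q - 1
        h = a**(p - q) - 1
        assert f - a**(p - q) * g == h  # the subtracted multiple leaves h
        assert gcd(f, g) == gcd(h, g)   # gcd is unchanged by the reduction
```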
Taking them as polynomials in $a$, if the g.c.d. is not $1$, then the two polynomials share a root, i.e., there is an element $\xi \in \mathbb C$ such that
$$ \begin{aligned} \xi^{p-q} &= 1;\\ \xi^{q+1} + \xi ^q &= 1. \end{aligned} $$
The first equation tells us that $\xi$ is a root of unity, so $|\xi| = 1$ and both $\xi^{q+1}$ and $\xi^q$ lie on the unit circle in the complex plane. Two unit vectors can sum to $1$ only if they are $e^{i\pi/3}$ and $e^{-i\pi/3}$, so $\{\xi^{q+1}, \xi^q\} = \{e^{i\pi/3}, e^{-i\pi/3}\}$, and hence $\xi = \xi^{q+1}/\xi^q = e^{\pm 2i\pi/3}$ is a primitive cube root of unity. But in this case $1 + \xi = -\xi^2$, so $\xi^{q+1} + \xi^q = \xi^q(1 + \xi) = -\xi^{q+2} = -\xi^{q-1} \neq 1$, since no power of $\xi$ equals $-1$.
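This last step can also be checked numerically for a primitive cube root of unity (an added sketch; `xi` below is $e^{2\pi i/3}$):

```python
# Numeric check of the final step: for xi a primitive cube root of unity,
# xi^{q+1} + xi^q equals -xi^{q-1}, which is never 1.
import cmath

xi = cmath.exp(2j * cmath.pi / 3)  # primitive cube root of unity
for q in range(1, 12):
    s = xi**(q + 1) + xi**q
    assert abs(s + xi**(q - 1)) < 1e-9  # s = -xi^{q-1}
    assert abs(s - 1) > 0.9             # s never equals 1
```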
Therefore, $\gcd(a^{p+1} + a^p - 1, a^{q+1} + a^q - 1) = 1$ for all $p \neq q$.