Factoring $a + b\frac{1 + \sqrt{5}}{2}$ if $a^2 + ab - b^2 = a+b$


Let $\phi = \frac{1 + \sqrt{5}}{2}$. I was asked to show that if $a^2 + ab - b^2 = a+b$, then $a + b \phi$ can be written as $\pm\phi^mx$ or $\pm\phi^m\sqrt{5}x$ for integers $x,m$. I have already shown that $R = \mathbb{Z} + \phi\mathbb{Z}$ is a Euclidean domain, that every unit of $R$ is of the form $\pm\phi^m$, and that if $p$ is a prime integer, then it is either prime in $R$ or can be written as $(a + b\phi)(a+b-b\phi)$; these two factors are not associated unless $p = 5$. Finally, I have shown that $a + b\phi$ and $a + b - b\phi$ are associated whenever $a^2 + ab - b^2 = a+b$.
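The arithmetic underlying all of this can be checked mechanically. Here is a minimal Python sketch (the pair encoding `(a, b)` for $a + b\phi$ is my own convention, not part of the problem) verifying the norm factorization $(a+b\phi)(a+b-b\phi) = a^2+ab-b^2$ used below:

```python
# Exact arithmetic in R = Z + phi*Z, where phi^2 = phi + 1.
# An element a + b*phi is stored as the pair (a, b).

def mul(x, y):
    """(a + b*phi)(c + d*phi) = ac + bd + (ad + bc + bd)*phi."""
    a, b = x
    c, d = y
    return (a*c + b*d, a*d + b*c + b*d)

def conj(x):
    """Conjugation a + b*phi -> a + b - b*phi (it sends sqrt(5) to -sqrt(5))."""
    a, b = x
    return (a + b, -b)

def norm(x):
    """N(a + b*phi) = a^2 + ab - b^2."""
    a, b = x
    return a*a + a*b - b*b

# Check N(x) = x * conj(x) on a grid of small elements.
for a in range(-5, 6):
    for b in range(-5, 6):
        assert mul((a, b), conj((a, b))) == (norm((a, b)), 0)
print("norm identity verified")
```

The multiplication rule just substitutes $\phi^2 = \phi + 1$, so everything stays in exact integer arithmetic.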

Now suppose that $a^2+ab - b^2 = a+b$. Then, by the previous work, $$a + b = a^2 + ab - b^2 = (a+b\phi)(a+b - b\phi) = \pm\phi^m(a+b\phi)^2,$$ since the two factors are associated and every unit is $\pm\phi^m$. However, I do not see how to proceed from here. Any hints?
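The displayed identity can at least be confirmed numerically. Below is a small brute-force search (the search bounds and the pair encoding of $a+b\phi$ are my own choices) that, for each small solution of $a^2+ab-b^2 = a+b$, finds a sign and an exponent $m$ with $a+b = \pm\phi^m(a+b\phi)^2$:

```python
# Elements a + b*phi of R are encoded as pairs (a, b); phi^2 = phi + 1.

def mul(x, y):
    a, b = x
    c, d = y
    return (a*c + b*d, a*d + b*c + b*d)

def phi_pow(m):
    """phi^m as a pair, using phi^{-1} = phi - 1 for negative m."""
    w = (1, 0)
    step = (0, 1) if m >= 0 else (-1, 1)
    for _ in range(abs(m)):
        w = mul(w, step)
    return w

def unit_exponent(v):
    """Search for (sign, m) with (a + b, 0) = sign * phi^m * v^2, else None."""
    a, b = v
    target = (a + b, 0)
    sq = mul(v, v)
    for m in range(-12, 13):
        w = mul(phi_pow(m), sq)
        if w == target:
            return (1, m)
        if (-w[0], -w[1]) == target:
            return (-1, m)
    return None

# Every nonzero small solution of a^2 + ab - b^2 = a + b passes the search.
for a in range(-10, 11):
    for b in range(-10, 11):
        if (a, b) != (0, 0) and a*a + a*b - b*b == a + b:
            assert unit_exponent((a, b)) is not None, (a, b)
print("identity a + b = ±phi^m (a + b·phi)^2 confirmed for small solutions")
```

For instance, $(a,b) = (2,2)$ gives $(2+2\phi)^2 = 4\phi^4$ and indeed $a + b = 4 = \phi^{-4}\cdot 4\phi^4$.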

edit: Since $R$ is a UFD and we may assume $a + b\phi$ is not a unit, it has a factorization into primes. Each of these prime factors divides $a+b$, making $a+b$ a unit times a square in $R$. From here, I am still not sure how to proceed.
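It may also help to sanity-check the target statement itself. The sketch below (again using my own pair encoding, with a bounded and therefore only heuristic search) enumerates small solutions and verifies that each $a+b\phi$ becomes an ordinary integer, or $\sqrt{5}$ times an integer, after multiplying by a suitable power of $\phi$:

```python
# a + b*phi is encoded as (a, b), with phi^2 = phi + 1.

def mul(x, y):
    a, b = x
    c, d = y
    return (a*c + b*d, a*d + b*c + b*d)

def is_target(x):
    """x is an integer (b = 0) or sqrt(5) times an integer:
    n*sqrt(5) = n*(2*phi - 1) is the pair (-n, 2n), i.e. b = -2a."""
    a, b = x
    return b == 0 or b == -2*a

def reaches_target(v, steps=40):
    """Does some phi^k * v with |k| <= steps hit the target form?"""
    for step in [(0, 1), (-1, 1)]:     # multiply by phi, then by phi^{-1}
        w = v
        for _ in range(steps + 1):
            if is_target(w):
                return True
            w = mul(w, step)
    return False

# Check the claim on all nonzero solutions with |a|, |b| <= 30.
for a in range(-30, 31):
    for b in range(-30, 31):
        if (a, b) != (0, 0) and a*a + a*b - b*b == a + b:
            assert reaches_target((a, b)), (a, b)
print("claim verified for all small solutions")
```

For example, $(a,b) = (-3,2)$ is a solution, and $\phi^3(-3+2\phi) = 1$, so $-3+2\phi = \phi^{-3}$.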