I'm trying to figure out when $(a+bi)^n$ is an integer, for integers $a,b,$ by putting certain conditions on $a$ and $b.$ I know that $b=0$ certainly works, but I'm not sure what other conditions there are for all integers $n.$ I tried using the binomial expansion in general, but any condition on $a$ and $b$ would only be implicit. I also tried to set up an induction on $n$ by asking when $(a+bi)^n(a+bi)$ is an integer, but I couldn't get anywhere with that.
What are the conditions on $a$ and $b$ for when $(a+bi)^n$ is an integer for some integer $n$?
We should first switch to the polar representation of complex numbers: $$a+bi=re^{i\theta}$$ Here, $r=\sqrt{a^2+b^2}$ and $\theta=\tan^{-1}\frac ba$. (There are some particularities about how $\theta$ is defined, e.g. when $a=0$ or when $a+bi$ lies in the left half-plane, but I want to be brief.) This gives us: $$(a+bi)^n=r^ne^{in\theta}$$ Note that for integers $a$ and $b$, the binomial expansion shows $(a+bi)^n$ is automatically a Gaussian integer, so it is an integer exactly when it is real, i.e. when its imaginary part $r^n\sin(n\theta)$ vanishes. This forces $n\theta$ to be a multiple of $\pi$, i.e. $$\theta=\frac{k\pi}{n}$$ for some $k\in\mathbb Z$. Substitute our original expression for $\theta$ to get this relationship: $$b=a\tan\left(\frac{k\pi}{n}\right)$$ for some $k\in\mathbb Z$. We can actually restrict $0\leq k<2n$ so the argument of $\tan$ is in $[0, 2\pi)$.
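Here is a quick sanity check of this condition, as a sketch: the helper `gauss_pow` (a name of my choosing) raises $a+bi$ to the $n$-th power using exact integer arithmetic, and a brute-force search over small $a,b,n$ confirms that whenever the imaginary part vanishes, $\arg(a+bi)$ is indeed a multiple of $\pi/n$.

```python
import math

def gauss_pow(a, b, n):
    """Compute (a + b*i)**n exactly with integer arithmetic,
    returning (real part, imaginary part)."""
    x, y = 1, 0  # start from 1 + 0i
    for _ in range(n):
        x, y = x * a - y * b, x * b + y * a
    return x, y

# Brute-force small integer pairs (a, b) and exponents n, keeping those
# where (a + b*i)**n has zero imaginary part, i.e. is an integer.
hits = []
for a in range(-3, 4):
    for b in range(-3, 4):
        if (a, b) == (0, 0):
            continue
        for n in range(1, 9):
            if gauss_pow(a, b, n)[1] == 0:
                # Verify theta = k*pi/n for some integer k.
                theta = math.atan2(b, a)
                k = theta * n / math.pi
                assert abs(k - round(k)) < 1e-9
                hits.append((a, b, n))
```

In this small range, the pairs that show up are exactly the ones with $b=0$ (any $n$), $a=0$ ($n$ even), or $a=\pm b$ ($n$ a multiple of $4$) — consistent with the fact that $\tan(k\pi/n)$ must be rational, hence $0$ or $\pm 1$.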
I think this is the strongest condition we can get on $a$ and $b$: there are $n$ roots of $z^n=1$ and $n$ roots of $z^n=-1$, and any answer to your problem must account for all $2n$ of these "solutions".