I found the following statement in a paper I'm reading on (the integers in) a quadratic field:
If $\alpha =a+b\sqrt{D}$ is an irreducible element of $\mathbb{Z}[\sqrt{D}]$, then so is $\bar{\alpha}=a-b\sqrt{D}$
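To make the statement concrete, here is an instance I checked myself (the choice $D=-5$ is mine, not the paper's): in $\mathbb{Z}[\sqrt{-5}]$ the norm $N(a+b\sqrt{-5})=a^2+5b^2$ is multiplicative, and since $$N(1+2\sqrt{-5})=21=3\cdot 7,$$ while $a^2+5b^2=3$ and $a^2+5b^2=7$ have no integer solutions, any factorization of $1+2\sqrt{-5}$ has a factor of norm $1$, i.e. a unit. So $1+2\sqrt{-5}$ is irreducible, and the claim is that $1-2\sqrt{-5}$ is then irreducible as well (which here it is, by the same norm computation).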
I'm having quite a bit of trouble showing that this is true (and the paper doesn't specify whether it holds only when $\mathbb{Z}[\sqrt{D}]$ is a UFD), but here is my attempt for the case $D<-1$:
Suppose for a contradiction that $\alpha =a+b\sqrt{D}$ is irreducible but that $\bar{\alpha} =a-b\sqrt{D}$ is not. Consider: $$ a^2-Db^2=(a+b\sqrt{D})(a-b\sqrt{D})$$ Since $a-b\sqrt{D}$ isn't irreducible, it has an irreducible factor $x+y\sqrt{D}$ ([$\ast$]: we assume that both $x\neq 0$ and $y \neq 0$; this implies that $\gcd_{\mathbb{Z}}(x,y)=1$ when $D<-1$), so $\bar{\alpha}=(x+y\sqrt{D})\beta$ for some $\beta \in \mathbb{Z}[\sqrt{D}]$, and thus: $$ a^2-Db^2=(a+b\sqrt{D})(x+y\sqrt{D})\beta$$ Dividing through by $x+y\sqrt{D}$, we get $$\frac{x(a^2-Db^2)}{x^2-Dy^2}-\frac{y(a^2-Db^2)}{x^2-Dy^2}\sqrt{D}=(a+b\sqrt{D})\beta$$ Now, since the right-hand side is clearly an element of $\mathbb{Z}[\sqrt{D}]$, the left-hand side must be too. Therefore $(x^2-Dy^2) \;|\; x(a^2-Db^2)$ and $(x^2-Dy^2) \;|\; y(a^2-Db^2)$, and since $\gcd(x,y)=1$ by the remark $[\ast]$, Bézout gives integers $u,v$ with $ux+vy=1$, which forces $(x^2-Dy^2)\; |\;(a^2-Db^2)$.
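To spell out the division step above: it is just rationalizing the denominator with the conjugate, $$\frac{1}{x+y\sqrt{D}}=\frac{x-y\sqrt{D}}{(x+y\sqrt{D})(x-y\sqrt{D})}=\frac{x-y\sqrt{D}}{x^2-Dy^2},$$ which gives $$\frac{a^2-Db^2}{x+y\sqrt{D}}=\frac{x(a^2-Db^2)}{x^2-Dy^2}-\frac{y(a^2-Db^2)}{x^2-Dy^2}\sqrt{D}.$$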
The point is that $x-y\sqrt{D}$ is a factor of $a^2-Db^2$, since $x^2-Dy^2=(x+y\sqrt{D})(x-y\sqrt{D})$; and since $a+b\sqrt{D}$ and $x+y\sqrt{D}$ are assumed to be irreducible, we must have $x-y\sqrt{D}\; | \; \beta$. Therefore: $$a^2-Db^2=(a+b\sqrt{D})(x+y\sqrt{D})(x-y\sqrt{D})\gamma$$ where the first two factors are irreducible.
I don't know how to finish this 'proof' attempt, or whether it can even be turned into a proof. Surely we have started some sort of factorization process that cannot go on indefinitely if $\mathbb{Z}[\sqrt{D}]$ is a UFD (since the norm $a^2-Db^2$ is a fixed integer). So: does anyone know how to prove the statement and/or its range of validity?
You are overcomplicating things. The right tool is the following theorem:
Thm. Let $f:A\to B$ be a ring isomorphism. Then $\pi\in A$ is irreducible if and only if $f(\pi)$ is irreducible.
Proof. Clearly, it is enough to prove that if $\pi$ is irreducible, then $f(\pi)$ is irreducible (for the reverse implication, replace $f$ by $f^{-1}$).
Since $f$ is an isomorphism and $\pi$ is nonzero and noninvertible, so is $f(\pi)$ (zero is mapped to zero, and units are mapped to units).
Let $b,b'\in B$ be such that $f(\pi)= bb'$. Since $f$ is surjective, we may write $b=f(a)$ and $b'=f(a')$ for some $a,a'\in A$.
Then $f(\pi)=f(a)f(a')=f(aa')$, so $\pi=aa'$, by injectivity of $f$. By assumption on $\pi$, $a$ or $a'$ is invertible, say $a$. But then $b=f(a)$ is invertible.
Consequently, $f(\pi)$ is irreducible.
Now apply this to your ring, where $f$ is conjugation. Of course, if you don't like this amount of generality, you can apply the same reasoning to your specific case.
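For concreteness, here is a sketch of the verification that conjugation is such an isomorphism (I write $\sigma(a+b\sqrt{D})=a-b\sqrt{D}$; the name $\sigma$ is just my notation). Additivity is immediate, and multiplicativity is a direct computation: $$\sigma\big((a+b\sqrt{D})(c+d\sqrt{D})\big)=\sigma\big((ac+Dbd)+(ad+bc)\sqrt{D}\big)=(ac+Dbd)-(ad+bc)\sqrt{D}=(a-b\sqrt{D})(c-d\sqrt{D}).$$ Since $\sigma(1)=1$ and $\sigma\circ\sigma=\mathrm{id}$ (so $\sigma$ is its own inverse, hence bijective), $\sigma$ is a ring isomorphism of $\mathbb{Z}[\sqrt{D}]$. The theorem then gives: $\alpha$ is irreducible if and only if $\bar{\alpha}=\sigma(\alpha)$ is, with no UFD hypothesis needed.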