To prove that a Gaussian integer is not a Gaussian prime, we need to find a nontrivial factorization of it. Let us assume $(a + bi)(x + yi) = 2$ for some $a, b, x, y \in \mathbb{Z}$; then $(ax - by) + i(ay + bx) = 2$.
Since the imaginary part of $2$ is zero, we get $ax - by = 2$ and $ay + bx = 0$.
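As a sanity check on the expansion (not part of the proof), a short sketch using only Python's built-in complex type confirms that the real and imaginary parts of $(a+bi)(x+yi)$ are indeed $ax - by$ and $ay + bx$; the search bound is an arbitrary choice:

```python
from itertools import product

# Check (a+bi)(x+yi) = (ax - by) + (ay + bx)i for all small integer quadruples.
for a, b, x, y in product(range(-3, 4), repeat=4):
    lhs = complex(a, b) * complex(x, y)
    rhs = complex(a * x - b * y, a * y + b * x)
    assert lhs == rhs
print("expansion verified for all |a|,|b|,|x|,|y| <= 3")
```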
Now, how do we get from here to the conclusion that $a = 1, b = 1, x = 1, y = -1$, i.e. $(1+i)(1-i) = 2$?
I read here, in a comment by @StevenStadnicki, that an easier way is to multiply both sides of $(a + bi)(x + yi) = 2$ by the complex conjugates, but this is not clear to me, as my attempt below shows. $$ \begin{align} & ax - by = 2, \quad ay + bx = 0 \\ \implies & axy - by^2 = 2y, \quad axy + bx^2 = 0 \\ \implies & b(x^2 + y^2) = -2y \end{align} $$ which by itself does not pin down $b$.
Based on the answer given below, my revised attempt is as follows:
Taking norms on both sides gives $(a^2 + b^2)(x^2 + y^2) = |2|^2 = 4$, so two sets of values are possible: one in which $(a^2 + b^2) = 4, (x^2 + y^2) = 1$, and a second in which $(a^2 + b^2) = 2, (x^2 + y^2) = 2$ (the case $(a^2 + b^2) = 1$ is symmetric to the first).
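These cases come from the norm identity $(a^2+b^2)(x^2+y^2) = 4$, and they can be confirmed by brute force; a minimal sketch in Python (the search bound of $3$ is an arbitrary choice, sufficient since each factor has norm at most $4$):

```python
from itertools import product

# Enumerate all Gaussian-integer factorizations (a+bi)(x+yi) = 2
# with small coefficients and record the resulting norm pairs.
norm_pairs = set()
for a, b, x, y in product(range(-3, 4), repeat=4):
    if complex(a, b) * complex(x, y) == 2:
        norm_pairs.add((a*a + b*b, x*x + y*y))

print(sorted(norm_pairs))  # → [(1, 4), (2, 2), (4, 1)]
```

Every pair of norms multiplies to $4$, matching the case analysis above.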
Possible combinations for each of the two cases are stated below:
Case (i): $(a^2 + b^2) = 4, (x^2 + y^2) = 1$
$\implies a = \pm 2, b = 0$ or $a = 0, b = \pm 2$, and $x = \pm 1, y = 0$ or $x = 0, y = \pm 1$.
But here $(x^2 + y^2) = 1$ forces $x + yi$ to be one of the units $\pm 1, \pm i$, so the factorization is trivial: it just writes $2$ as a unit times an associate of $2$. Hence this case yields no genuine factors of $2$.
Case (ii) $(a^2 + b^2) = 2, (x^2 + y^2) = 2$
$\implies$ possible values are: $a = \pm 1, b = \pm 1, x = \pm 1, y = \pm 1$.
Imposing $ax - by = 2$ and $ay + bx = 0$, the valid combinations are:
1. $a=1, b=-1, x=1, y=1$;
2. $a=1, b=1, x=1, y=-1$,
3. $a=-1, b=1, x=-1, y=-1$;
4. $a=-1, b=-1, x=-1, y=1$.

All four give the same factorization up to the order of the factors and multiplication by the unit $-1$, so essentially there is a single factorization.
So, the solution has values of $a =1,b=1, x = 1, y = -1$.
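These sign patterns can be verified mechanically; a minimal check in Python, where each tuple is $(a, b, x, y)$ for a combination listed above:

```python
# Verify ax - by = 2 and ay + bx = 0 for each listed combination,
# i.e. that each gives a genuine factorization of 2.
combos = [
    (1, -1, 1, 1),    # (1 - i)(1 + i)
    (1, 1, 1, -1),    # (1 + i)(1 - i)
    (-1, 1, -1, -1),  # (-1 + i)(-1 - i)
]

for a, b, x, y in combos:
    assert a * x - b * y == 2 and a * y + b * x == 0
    assert complex(a, b) * complex(x, y) == 2
print("all combinations check out")
```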
If there are redundant or unnecessary steps, please point them out.
Let $(a+ib)(c+id)=2$.
There is obviously no solution with real integers, so $b$ and $d$ are both nonzero (if only one of them is zero, the product is non-real or $0$, so it also fails). Also, since the product is real and positive, the arguments of the two factors are opposite: they are multiples of a conjugate pair of Gaussian integers.
So it's really $k(a+ib)(a-ib)=2$, where we can take $\gcd(a,b)=1$ so that $k$ is an integer.
Hence
$$k(a^2+b^2)=2$$
So $k$ must be a positive integer, with $a, b$ integers. If $k=2$, it fails: then $b=\pm1$ and $a=0$ (remember $b\ne0$), and the factorization is $2i\times(-i)=2$ or $(-2i)\times i=2$; this proves nothing, as you have just multiplied $2$ by a unit.
Hence let $k=1$ and $a^2+b^2=2$, so $a^2=b^2=1$. The sign of $a$ does not matter, because $(-a+ib)(-a-ib)=(a+ib)(a-ib)$. Hence you can pick $a=1$, and the solution has to be
$$(1+i)(1-i)=2$$
which you can check.
Of course, another solution is
$$(-1+i)(-1-i)=2$$
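Both factorizations can be checked directly, and one can see they differ only by the unit $-1$ (and the order of the factors); a small sketch in Python:

```python
# The two factorizations of 2 as products of conjugate Gaussian integers.
f1 = (complex(1, 1), complex(1, -1))    # (1+i)(1-i)
f2 = (complex(-1, 1), complex(-1, -1))  # (-1+i)(-1-i)

assert f1[0] * f1[1] == 2
assert f2[0] * f2[1] == 2
# f2 is f1 with the factors swapped and each multiplied by the unit -1:
assert f2 == (-f1[1], -f1[0])
print("2 = (1+i)(1-i) = (-1+i)(-1-i)")
```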