I came up with a proof of this statement, and I want to see if there is anything wrong with it.
Let $d=\gcd(a,b)=\gcd(a^2,b^2)$. Since $d\mid a$ and $d\mid b$, we can write $a=d\cdot k$ and $b=d\cdot m$ for some $k,m\in\mathbb{Z}$. Multiplying, $a^2=d\cdot(ka)$ and $b^2=d\cdot(mb)$. Since $d=\gcd(a^2,b^2)$, Bézout's identity gives integers $x,y$ with $a^2x+b^2y=d$. Substituting our expressions for $a^2$ and $b^2$, we get $d\cdot kax+d\cdot mby=d$, and dividing by $d$ (which is nonzero, since $a$ and $b$ are not both zero) gives $a(kx)+b(my)=1$. Since $k,m,x,y\in\mathbb{Z}$, we know $kx,my\in\mathbb{Z}$, so $1$ is an integer combination of $a$ and $b$; as $\gcd(a,b)$ divides every such combination, it divides $1$, and therefore $\gcd(a,b)=1$.
Let me know if there's anything wrong with this proof.
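Not part of the proof, but here is a quick numerical sanity check of the statement itself (the range bound $200$ is an arbitrary choice for the check): whenever $\gcd(a,b)=\gcd(a^2,b^2)$, it should follow that $\gcd(a,b)=1$.

```python
from math import gcd

# Check: for all small positive a, b, if gcd(a, b) == gcd(a^2, b^2)
# then gcd(a, b) == 1.
for a in range(1, 200):
    for b in range(1, 200):
        if gcd(a, b) == gcd(a * a, b * b):
            assert gcd(a, b) == 1
print("claim holds for 1 <= a, b < 200")
```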
We know that $\gcd(a^2,b^2)=\gcd(a,b)^2$ (compare prime factorizations: the exponent of each prime in $\gcd(a^2,b^2)$ is $\min(2\alpha,2\beta)=2\min(\alpha,\beta)$). Writing $d=\gcd(a,b)$, the hypothesis $\gcd(a,b)=\gcd(a^2,b^2)$ becomes $d=d^2$, so $d\in\{0,1\}$.
Since a gcd of two integers that are not both zero is positive, $d\neq 0$, and hence $d=1$.
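The identity this answer relies on can also be checked numerically (the range bound $100$ is an arbitrary choice for the check):

```python
from math import gcd

# Check the identity gcd(a^2, b^2) == gcd(a, b)^2 for small positive a, b.
for a in range(1, 100):
    for b in range(1, 100):
        assert gcd(a * a, b * b) == gcd(a, b) ** 2
print("identity holds for 1 <= a, b < 100")
```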