Is there a deeper reason why, while $a^2 - b^2$ (and more generally $a^n - b^n$ for $n \ge 2$) can be factorized over the reals, $a^2 + b^2$ cannot (though $a^n + b^n$ for $n \ge 3$ can)?
I think this might somehow be connected to the fact that the distance metric in Euclidean space involves $a^2 + b^2$.
There is also the trivial explanation that being able to factor expressions of the form $a^2 + b^2$ would imply a real solution to $x^2 + k^2 = 0$, but that doesn't seem satisfying enough.
Hints: assuming WLOG $b \ne 0\,$:
$a^n \pm b^n = b^n\left(\left(\cfrac{a}{b}\right)^n \pm 1\right)\,$ so $a^n \pm b^n$ factors iff $c^n \pm 1$ factors (where $c=a/b$).
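This reduction, and the factorization it lifts, can be checked mechanically. A quick sympy sketch for the concrete exponent $n=5$ (my addition, not part of the original hints):

```python
# Symbolic check (sympy) that pulling out b^n reduces a^n + b^n
# to the factorability of c^n + 1, for the concrete exponent n = 5.
from sympy import symbols, simplify, factor

a, b = symbols('a b', positive=True)  # b != 0, as assumed above
c = symbols('c')

# b^5 * ((a/b)^5 + 1) recovers a^5 + b^5:
assert simplify(b**5 * ((a/b)**5 + 1) - (a**5 + b**5)) == 0

# c^5 + 1 factors (odd exponent: c = -1 is a real root),
# and the factorization lifts back to a^5 + b^5:
print(factor(c**5 + 1))       # (c + 1)*(c**4 - c**3 + c**2 - c + 1)
print(factor(a**5 + b**5))    # (a + b)*(a**4 - a**3*b + a**2*b**2 - a*b**3 + b**4)
```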
$c^2-1=0$ has the two real roots $\,c=\pm1\,$ giving the factorization $c^2-1=(c-1)(c+1)\,$.
$c^2+1=0$ has no real roots, so it doesn't factor over $\mathbb{R}\,$.
The latter also shows that $c^{2k}+1$ has no real roots for any $k \ge 1$, hence no linear factors over the reals; for $k \ge 2$, though, it still factors into real quadratics, e.g. $c^4+1=\left(c^2-\sqrt{2}\,c+1\right)\left(c^2+\sqrt{2}\,c+1\right)$.
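These facts are easy to confirm with sympy (an illustrative check of mine, not part of the argument). Note in particular that the absence of real roots only rules out *linear* factors:

```python
# sympy check: c^2 - 1 splits, c^2 + 1 is irreducible over the rationals/reals,
# and c^4 + 1, though rootless, still splits into real quadratics.
from sympy import symbols, factor, sqrt, real_roots, expand

c = symbols('c')

print(factor(c**2 - 1))        # (c - 1)*(c + 1)
print(factor(c**2 + 1))        # c**2 + 1  (stays irreducible)
print(real_roots(c**4 + 1))    # []  -- no real roots, so no linear factors

# Allowing sqrt(2) as a coefficient exposes the quadratic factors:
quads = factor(c**4 + 1, extension=sqrt(2))
print(quads)                   # (c**2 - sqrt(2)*c + 1)*(c**2 + sqrt(2)*c + 1)
assert expand(quads) == c**4 + 1
```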
[ EDIT ] To elaborate on what I first posted as a comment, addressing this part of OP's question:

> There is also the trivial explanation that being able to factor expressions of the form $a^2 + b^2$ would imply a real solution to $x^2 + k^2 = 0$, but that doesn't seem satisfying enough.

That's not entirely trivial, and I don't see why it wouldn't be satisfying enough. In fact, the key reason why $\,x^2+1\,$ does not factor in $\,\mathbb{R}[x]\,$ is that the real quadratic $\,x^2+1\,$ has no real roots; in other words, $\,-1\,$ is not a square in $\,\mathbb{R}\,$. Consider that factorizations do actually exist in other rings where $\,-1\,$ is a square, for example:
$x^2+1=(x-i)(x+i)$ in $\,\mathbb{C}[x]\,$ since $\,-1=i^2\,$ in $\,\mathbb{C}\,$
$x^2+1=(x+1)^2$ in $\,\mathbb{Z}_2[x]\,$ since $\,-1=1=1^2\,$ in $\,\mathbb{Z}_2\,$
$x^2+1=(x+2)(x+3)$ in $\,\mathbb{Z}_5[x]\,$ since $\,-1=2^2\,$ in $\,\mathbb{Z}_5\,$
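Each of these factorizations can be verified mechanically. A small sympy sketch (mine, purely illustrative; note that sympy reports factors mod 5 in symmetric form, so the factor $x+3$ appears as $x-2$):

```python
# Verify that x^2 + 1 factors over C (Gaussian rationals), Z_2, and Z_5.
from sympy import symbols, factor, I

x = symbols('x')

print(factor(x**2 + 1, extension=I))   # (x - I)*(x + I)
print(factor(x**2 + 1, modulus=2))     # (x + 1)**2
print(factor(x**2 + 1, modulus=5))     # (x - 2)*(x + 2), i.e. (x+3)(x+2) mod 5
```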
The point is that $\,x^2+1\,$ does not factor over the reals precisely because the equation $\,x^2+1=0\,$ has no real roots. That is not a random happenstance; it is the essential reason why not.