Find all $f:\mathbb{R} \to \mathbb{R}$ such that $$f\big(x^2+f(y)\big)=(x-y)^2f(x+y)$$ for all $x,y\in\mathbb{R}$.
Putting $x=y$ yields $$f\big(x^2+f(x)\big)=0\text.\tag1\label1$$
Putting $y=-x$ yields $$f\big(x^2+f(-x)\big)=(2x)^2f(0)\text.\tag2\label2$$
Replacing $x$ by $-x$ in \eqref{1} gives $f\big(x^2+f(-x)\big)=0$, so by \eqref{2} we get $(2x)^2f(0)=0$ ($\forall x \in \mathbb{R}$) $$\implies \; f(0)=0\text.$$ Putting $y=0$ then yields $f\big(x^2+f(0)\big)=x^2f(x)$, which from the above is $f\big(x^2\big)=x^2f(x)$.
Using this last equation, for $a>0$ we have $f(a)=af\left(a^{\frac12}\right)=a^{1+\frac12}f\left(a^{\frac14}\right)=a^{1+\frac12+\frac14}f\left(a^{\frac18}\right)= \cdots =a^2f\left(a^0\right)=a^2f(1)$.
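To make the limit hidden in the dots explicit (this is only a restatement of the step above, not extra rigor): after $n$ iterations, for $a>0$, the chain reads
$$f(a)=a^{\,2-2^{1-n}}\,f\!\left(a^{2^{-n}}\right)\text,$$
so the final equality $f(a)=a^2f(1)$ is the limit $n\to\infty$, using $a^{2^{-n}}\to 1$.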
So $f(x)=kx^2$ with $k=f(1)$, at least for $x\ge0$; for $x<0$ the same relation gives $f(x)=\frac{f(x^2)}{x^2}=\frac{(x^2)^2f(1)}{x^2}=kx^2$ as well.
Conversely, substituting $f(x)=kx^2$ into the original equation gives $k\big(x^2+ky^2\big)^2=k(x-y)^2(x+y)^2=k\big(x^2-y^2\big)^2$, and comparing coefficients this holds identically only for $k=0$ and $k=-1$ (I think).
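As a sanity check (a small script, not part of the proof; the grid of integer sample points is my own choice), one can plug $f(x)=kx^2$ into the original equation and measure the mismatch between the two sides:

```python
# Sanity check: plug f(x) = k*x^2 into f(x^2 + f(y)) = (x - y)^2 * f(x + y)
# at integer sample points (so the arithmetic is exact) and record the
# largest mismatch between the two sides.

def max_residual(k, points):
    """Largest |LHS - RHS| of the functional equation over the points, for f(x) = k*x^2."""
    f = lambda t: k * t * t
    return max(abs(f(x * x + f(y)) - (x - y) ** 2 * f(x + y)) for x, y in points)

points = [(x, y) for x in range(-5, 6) for y in range(-5, 6)]
for k in (0, -1, 1):
    print(k, max_residual(k, points))  # residual 0 means the equation holds on the grid
```

On this grid the residual vanishes for $k=0$ and $k=-1$, but not for $k=1$.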
My question, essentially: is the limiting step in the iteration above rigorous (I'd love to avoid explicit calculus), and is my final deduction correct?
Thanks for any help.