On the equation $f(x^2+y+f(y))=2y+f(x)^2.$

If $f:\mathbb{R}\to\mathbb{R}$ satisfies $f(x^2+y+f(y))=2y+f(x)^2,$ prove that $f(x)=x$ for all $x.$
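Before diving in, it's reassuring to confirm numerically that the claimed solution $f(x)=x$ really does satisfy the equation (a quick sanity sketch, not part of the proof):

```python
import random

def f(x):
    # Candidate solution f(x) = x
    return x

# Check f(x^2 + y + f(y)) == 2y + f(x)^2 on random real inputs.
# With f = id: LHS = x^2 + 2y and RHS = 2y + x^2, so they agree.
for _ in range(1000):
    x = random.uniform(-10, 10)
    y = random.uniform(-10, 10)
    lhs = f(x**2 + y + f(y))
    rhs = 2*y + f(x)**2
    assert abs(lhs - rhs) < 1e-9
```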

Here's my work so far: setting $x=0$ gives $f(y+f(y))=2y+f(0)^2,$ whose right-hand side takes every real value, so $f$ is surjective. Let $f(c)=0.$ Then $x=y=c$ gives $f(c^2+c)=2c,$ so $f(f(c^2+c)/2)=0.$ From $y=0,$ we have $f(x^2+f(0))=f(x)^2,$ so $f(x)\ge 0$ for $x\ge f(0).$ Let $y=-x^2$ to obtain $f(f(-x^2))=f(x)^2-2x^2;$ I'm not sure what to do with this observation just yet.

Since $f$ is surjective, choose $y$ with $f(y)=-x^2;$ then the left side collapses to $f(y),$ so $-x^2=2y+f(x)^2,$ i.e. $y=-\frac{1}{2}\left(x^2+f(x)^2\right),$ which becomes $f\left(-\frac{1}{2}\left(x^2+f(x)^2\right)\right)=-x^2.$ From $x=y=0,$ we have $f(f(0))=f(0)^2.$ Setting $x=0$ in the previous identity gives $f\left(-\frac{1}{2}f(0)^2\right)=0,$ that is, $f\left(-\frac{1}{2}f(f(0))\right)=0.$ Recalling $f(x^2+f(0))=f(x)^2$ from $y=0,$ the FE becomes $f(x^2+y+f(y))=2y+f(x^2+f(0)),$ and we have rid ourselves of that pesky "square of a function" term.
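The derived identities can be cross-checked against the conjectured solution $f(x)=x$ (again a numerical sketch only; it verifies the algebra is consistent, not that the identities force $f=\mathrm{id}$):

```python
import random

def f(x):
    return x  # the conjectured solution

for _ in range(1000):
    x = random.uniform(-10, 10)
    # From y = -x^2:  f(f(-x^2)) = f(x)^2 - 2x^2
    assert abs(f(f(-x**2)) - (f(x)**2 - 2*x**2)) < 1e-9
    # From the surjectivity substitution:  f(-(x^2 + f(x)^2)/2) = -x^2
    assert abs(f(-(x**2 + f(x)**2) / 2) - (-x**2)) < 1e-9

# Point identities: f(f(0)) = f(0)^2 and f(-f(f(0))/2) = 0
assert abs(f(f(0)) - f(0)**2) < 1e-9
assert abs(f(-f(f(0)) / 2)) < 1e-9
```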

Some plans of attack:

  1. Prove that $f(0)=0.$

  2. Prove that $f$ is injective.

  3. Prove that $f$ is increasing.

Any ideas on how to accomplish any of these? Any ideas on alternative approaches? What's next? Am I close to solving the problem?