Trying to prove surjectivity


We are given $f \colon \mathbb{R} \rightarrow \mathbb{R}$ such that

$$f(f(x)f(y)) = f(x) + f(y)$$

for all real $x$ and $y$.

To show that the function is surjective, we plug in $y = 0$; letting $z = f(x)$ and $c = f(0)$, we get $f(cz) = z + c$. Does this imply that the function is surjective, or am I mistaken?
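Spelled out, the substitution is

$$f(f(x)f(0)) = f(x) + f(0) \quad\Longrightarrow\quad f(cz) = z + c, \qquad \text{where } z = f(x),\; c = f(0).$$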

There are 2 best solutions below

Take the constant function $f=0$. It satisfies the hypothesis and it is not surjective.
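As a quick sanity check (a small script, not part of the original answer), one can verify numerically that the constant zero function satisfies the identity on sample points yet never attains, say, the value $1$:

```python
# Sanity check: the constant function f(x) = 0 satisfies
# f(f(x) * f(y)) == f(x) + f(y) for every x and y,
# yet it is not surjective: it never attains the value 1.

def f(x):
    return 0.0

samples = [-2.0, -0.5, 0.0, 1.0, 3.7]
for x in samples:
    for y in samples:
        assert f(f(x) * f(y)) == f(x) + f(y)

# No input is mapped to 1, so f is not onto the reals.
assert all(f(x) != 1.0 for x in samples)
print("f = 0 satisfies the equation and misses the value 1")
```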


The only solution is $f = 0$. Write $p(x,y)$ for the assertion $f(f(x)f(y)) = f(x) + f(y)$.

By $p(x,x)$ we have $f(f(x)^2) = 2f(x)$. Now $p(f(x)^2, y)$ gives $f(f(f(x)^2)f(y)) = f(f(x)^2) + f(y)$, and since $f(f(x)^2) = 2f(x)$, this is $f(2f(x)f(y)) = 2f(x) + f(y)$.

The left-hand side does not change when we swap $x$ and $y$, so $2f(x) + f(y) = 2f(y) + f(x)$, which simplifies to $f(x) = f(y)$ for all $x, y$. Hence $f$ is constant, say $f \equiv a$. Substituting back into the original equation gives $a = 2a$, so $a = 0$, and $f = 0$ is the only solution (which, as noted above, is not surjective).