It is probably a simple problem, but I'm having trouble proving the following: if $f\colon \mathbb{R}\to \mathbb{R}$ is a function such that, for all $x,y\in \mathbb{R}$ we have $$ f(x+y)=f(x)\cdot f(y),$$ then either $f\equiv 0$ or $f(x)>0$ for all $x\in \mathbb{R}$.
If there is $x_{0}\in \mathbb{R}$ such that $f(x_{0})=0$, then $f(x)=f(x-x_{0})\cdot f(x_{0})=0$, for all $x\in \mathbb{R}$. So, what I need to prove is that, if $f(x)\neq 0$ for all $x\in \mathbb{R}$ (i.e., $f$ is a group homomorphism from $(\mathbb{R},+)$ to $(\mathbb{R}^{\ast},\cdot)$), then there is no $x\in \mathbb{R}$ such that $f(x)<0$.
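The zero-propagation step above can be formalized as follows; this is only a sketch in Lean 4 assuming Mathlib (names like `mul_zero` and the `ring` tactic come from Mathlib, and I have not checked it against a specific version):

```lean
-- Sketch: if f(x+y) = f x * f y and f vanishes at some x₀, then f ≡ 0.
example (f : ℝ → ℝ) (h : ∀ x y, f (x + y) = f x * f y)
    (x₀ : ℝ) (h0 : f x₀ = 0) : ∀ x, f x = 0 := by
  intro x
  have hx : x - x₀ + x₀ = x := by ring
  calc f x = f (x - x₀ + x₀) := by rw [hx]
    _ = f (x - x₀) * f x₀ := h _ _
    _ = 0 := by rw [h0, mul_zero]
```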
Surely, this property is meant to mimic a basic property of exponential functions $a^{x}$ with $a>0$. In fact, I can prove the claim whenever I add extra hypotheses, such as continuity or monotonicity.
Thanks in advance for any help!
Assume that there exists $x \in \mathbb{R}$ such that $f(x)<0$. Then, note that
$$f(x) = f\left(\frac{x}{2} + \frac{x}{2}\right) = f\left(\frac{x}{2}\right)^{2} \geq 0,$$ since a square of a real number is nonnegative. This contradicts $f(x)<0$.
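For completeness, here is a sketch of this half-then-square argument in Lean 4, assuming Mathlib (the lemma name `mul_self_nonneg` and the tactics are from Mathlib; not checked against a specific version):

```lean
-- Sketch: the functional equation forces f x ≥ 0 for every x.
example (f : ℝ → ℝ) (h : ∀ x y, f (x + y) = f x * f y) (x : ℝ) :
    0 ≤ f x := by
  have hx : x / 2 + x / 2 = x := by ring
  have hsq : f x = f (x / 2) * f (x / 2) := by
    rw [← hx, h]
  rw [hsq]
  exact mul_self_nonneg _   -- a * a ≥ 0 for any real a
```

Combined with the zero case, this gives the full dichotomy: either $f\equiv 0$ or $f(x)>0$ for all $x$.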