Consider the function $f$ with the following properties:
$$\lim_{x\rightarrow 0} f(x) = 1,\qquad f(x+y)=f(x)\,f(y),\qquad f(x)>0\quad\text{for all } x,y\in\mathbb{R}.$$
Show that if $\,f(1) = p >1,\,$ then $f$ is increasing on $\mathbb{R}$.
How am I supposed to show this without using continuity?
Edit: Asked my professor and I have now figured it out. Here is the proof.
Suppose $f(1) = p > 1$. First, $f$ is continuous on all of $\mathbb{R}$: for any $x$, $f(x+h) = f(x)\,f(h) \to f(x)$ as $h \to 0$, since $\lim_{h\to 0} f(h) = 1$. Define $g(x) = p^x$, which is continuous on $\mathbb{R}$. From an earlier exercise we found that $f(x) = f(1)^x = p^x$ for $x \in \mathbb{Q}$, so $f(x) = g(x)$ for all $x \in \mathbb{Q}$. Since $\mathbb{Q}$ is dense in $\mathbb{R}$ and both $f$ and $g$ are continuous, it follows that $f(x) = g(x) = p^x$ for all $x \in \mathbb{R}$. Now take $x_1 < x_2$. Since $p > 1$, we have $p^{x_1} < p^{x_2}$, i.e. $f(x_1) < f(x_2)$. Therefore $f$ is increasing.
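As a quick numerical sanity check of the conclusion, here is a minimal sketch using the concrete example $f(x) = p^x$ with the assumed illustrative choice $p = 2$, verifying both the functional equation and monotonicity on a few sample points:

```python
import math

# Illustrative choice: p = 2 > 1, so f(x) = p**x should satisfy the
# hypotheses and be strictly increasing (per the proof above).
p = 2.0
f = lambda x: p ** x

# Functional equation f(x+y) = f(x) * f(y) on a few sample pairs.
for x, y in [(0.5, 1.5), (-2.0, 3.0), (0.0, 0.7)]:
    assert math.isclose(f(x + y), f(x) * f(y))

# Positivity and the limit condition f(h) -> 1 as h -> 0 (spot check).
assert all(f(x) > 0 for x in (-5.0, 0.0, 5.0))
assert math.isclose(f(1e-12), 1.0, rel_tol=1e-9)

# Monotonicity: x1 < x2 implies f(x1) < f(x2) when p > 1.
pairs = [(-1.0, 0.0), (0.0, 0.5), (1.0, 2.0)]
assert all(f(x1) < f(x2) for x1, x2 in pairs)
print("all checks passed")
```

This does not prove anything, of course; it only confirms that the claimed properties hold for the explicit exponential the proof identifies $f$ with.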