Suppose an unknown function $f:\mathbb{R}^2\rightarrow\mathbb{R}$ satisfies
$$f(a,b)=f(a,c)f(c,b),$$
for all $a,b,c\in\mathbb{R}$. Under what conditions can we rigorously conclude that $f$ must be some quotient
$$f(a,b)=\frac{g(a)}{g(b)},$$
for an unknown function $g$? What assumptions do we have to make about $f$? Does it need to be analytic, or just continuous?
A physics text I am reading states this as obvious, but I am wondering if there is a way to prove it rigorously.
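For concreteness, the easy direction is immediate: any quotient of the form $g(a)/g(b)$ with $g$ never zero satisfies the functional equation. Here is a minimal numerical sanity check of that direction (not part of any proof); the particular choice of `g` below is just an arbitrary positive function picked for illustration.

```python
import math

# Hypothetical example: pick any nonvanishing g and define f as the quotient g(a)/g(b).
def g(x):
    return math.exp(x) + 1.0   # any strictly positive function works here

def f(a, b):
    return g(a) / g(b)

# Numerically check f(a,b) = f(a,c) * f(c,b) on a few sample points.
for a, b, c in [(0.0, 1.0, 2.0), (-1.5, 3.2, 0.7), (2.0, 2.0, -4.0)]:
    assert math.isclose(f(a, b), f(a, c) * f(c, b))
print("quotient form satisfies the functional equation on the samples")
```

The question is about the converse: when does the functional equation force $f$ to have this quotient form?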
Taking $b=c=a$ in the functional equation gives $f(a,a)=f(a,a)f(a,a)$, so $f(a,a)\in\{0,1\}$.
If there exists $a_0$ such that $f(a_0,a_0)=0$, then $$f(x,y)=f(x,a_0)\underbrace{f(a_0,y)}_{\text{expand this factor}}=f(x,a_0)f(a_0,a_0)f(a_0,y)=0,$$ so $f=0$ everywhere; this case is not interesting.
So let us assume $f(a,a)=1$ for all $a$.
From this we get that $f$ never vanishes, since $f(a,a)=f(a,b)f(b,a)=1$,
and hence $f(a,b)=\dfrac{1}{f(b,a)}$.
So choose, for instance, $g(x)=f(x,0)$. Then $f(a,0)=f(a,b)f(b,0)$, and since $f(b,0)\neq 0$ this is equivalent to $f(a,b)=\dfrac{g(a)}{g(b)}$.
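As a small illustration of this construction (a sketch only, with `f(a, b) = exp(a - b)` chosen as one arbitrary solution of the functional equation), defining `g(x) = f(x, 0)` does recover $f$ as the quotient $g(a)/g(b)$:

```python
import math

# Hypothetical f satisfying f(a,b) = f(a,c) f(c,b); f(a,b) = exp(a - b) is one such choice.
def f(a, b):
    return math.exp(a - b)

# The construction above: g(x) = f(x, 0).
def g(x):
    return f(x, 0.0)

# Check that f(a,b) = g(a) / g(b) on a few sample points.
for a, b in [(0.3, -1.2), (5.0, 5.0), (-2.0, 4.5)]:
    assert math.isclose(f(a, b), g(a) / g(b))
print("f(a,b) == g(a)/g(b) on the samples")
```

Nothing is special about the base point $0$; any fixed $c$ with $g(x)=f(x,c)$ would do, by the same calculation.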
Note: one might otherwise imagine a case where $f(a,b)=0$ and $f(b,a)=\infty$, so assuming $f$ is defined (and real-valued) everywhere is a sufficient condition; the argument above uses no continuity or analyticity at all.