Continuous solutions of $f(x+y+z)=f(x)g(y)+f(y)g(z)+f(z)g(x)$


Consider the following functional equation: $$f(x+y+z)=f(x)g(y)+f(y)g(z)+f(z)g(x)$$ where the equation holds for all $x,y,z \in \mathbb{R}$. One solution is $f(x)=cx$ and $g(x)=1$. What are all the continuous solutions $f, g\colon \mathbb{R} \to \mathbb{R}$?
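A quick symbolic sanity check of this pair (a SymPy sketch, not needed for the argument):

```python
import sympy as sp

# Check that f(x) = c*x, g(x) = 1 satisfies f(x+y+z) = f(x)g(y) + f(y)g(z) + f(z)g(x).
x, y, z, c = sp.symbols('x y z c')
f = lambda t: c * t
g = lambda t: sp.Integer(1)

residual = f(x + y + z) - (f(x) * g(y) + f(y) * g(z) + f(z) * g(x))
print(sp.simplify(residual))  # prints 0
```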

(This was inspired by the question "About the addition formula $f(x+y) = f(x)g(y)+f(y)g(x)$".)



BEST ANSWER

The problem has a simpler solution than I had originally anticipated. I don't want to delete too much, so let me present the simple argument first and leave the rest as it was.

Lemma: Suppose that $f$ satisfies the relation $$ f(x+y+z) = \alpha( f(x)f(y)+f(y)f(z)+f(z)f(x) )+ \beta(f(x) + f(y) + f(z)),$$ where $\alpha,\beta$ are constants and $\alpha \neq 0$. Then $f$ is constant.

Proof: If $f$ is constant, we are done, so suppose that this is not the case. Rewrite the relation as: $$ f(x+y+z) = f(x) ( \alpha f(y)+ \alpha f(z)+ \beta)+ \alpha f(y)f(z) + \beta( f(y) + f(z)).$$ Let $y = w +t$, $z = w-t$, and define $k(w,t) := \alpha f(y)+ \alpha f(z)+ \beta$ and $l(w,t) := \alpha f(y)f(z) + \beta( f(y) + f(z))$. Then the above relation says: $$ f(x+2w) = f(x)k(w,t) + l(w,t).$$ Thus, for any $\xi \in f(\mathbb R)$ and any $t_0$ we have: $$ \xi k(w,t) + l(w,t) = f(x+2w) = \xi k(w,t_0) + l(w,t_0).$$ Because $f$ is not constant, $\xi$ can take at least two different values, so $ k(w,t) = k(w,t_0)$ and $l(w,t) = l(w,t_0)$. Thus, $k(w,t)$ and $l(w,t)$ really depend only on $w$.

Looking at $k$, we conclude that: $$ f(w+t) + f(w-t) = 2f(w),$$ and hence we may write $f(w+t) = f(w) + r_w(t)$, where $r_w$ is a continuous function with $r_w(-t) = -r_w(t)$. Next, looking at $l(w,t)$, and using that $f(w+t) + f(w-t) = 2f(w)$ is already independent of $t$, comparison with $t=0$ gives: $$(f(w) + r_w(t))(f(w) - r_w(t)) = f(w)^2.$$ Expanding the left side, this leads to $r_w(t)^2 = 0$, and hence $r_w(t) = 0$. Since $w$ and $t$ were arbitrary, $f$ is constant, contradicting our assumption; this finishes the proof.
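As an independent, much weaker sanity check of the lemma (a sketch only: it restricts $f$ to quadratic polynomials rather than arbitrary continuous functions, and the symbols $p,q,r$ are just the ansatz coefficients), one can let SymPy compare coefficients:

```python
import sympy as sp

# Restricted check of the Lemma: among f(t) = p*t**2 + q*t + r, only constants
# (p = q = 0) should satisfy the reduced relation once alpha != 0.
x, y, z, p, q, r, alpha, beta = sp.symbols('x y z p q r alpha beta')
f = lambda t: p * t**2 + q * t + r

eq = sp.expand(
    f(x + y + z)
    - alpha * (f(x) * f(y) + f(y) * f(z) + f(z) * f(x))
    - beta * (f(x) + f(y) + f(z))
)
# The relation must hold identically in x, y, z, so every coefficient vanishes.
system = [sp.Eq(coef, 0) for coef in sp.Poly(eq, x, y, z).coeffs()]
print(sp.solve(system, [p, q, r], dict=True))
# Expected: every returned solution has p = 0 and q = 0, i.e. f is constant.
```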

Having this, the argument below simplifies a lot, and we can dispense with the additional assumption that $f$ is differentiable.

Original post:

Lemma: Unless $f \equiv 0$ (in which case any $g$ works), $g$ is of the form $g(x) = \alpha f(x) + \beta$ for some constants $\alpha, \beta$.

Proof: Plug in $z = y = 0$. Then we get: $$ f(x) = f(x) g(0) + f(0)(g(0) + g(x)).$$ If $f(0) \neq 0$, we can solve the above for $g(x)$ and get an answer of the desired form. So, suppose for a moment that $f(0) = 0$; then $f(x)(1 - g(0)) = 0$ for all $x$, so either $g(0) = 1$ or $f$ is constantly $0$ (and in the latter case there is nothing to prove). Assuming $g(0) = 1$, plug $z = 0$, and then $y = 0$ (relabelling $z$ as $y$), into the functional equation. This leads to: $$ f(x+y) = f(x)g(y) + f(y) = f(y) g(x) + f(x).$$ Hence we have: $$ f(x)( g(y) - 1) = f(y)(g(x) - 1). $$ Taking any $y$ with $f(y) \neq 0$ (such a $y$ exists, since $f \not\equiv 0$), we can solve for $g(x)$ and find it of the sought form.
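To make the first step concrete, here is the $y=z=0$ substitution solved for $g(x)$ when $f(0)\neq 0$ (a SymPy sketch; the symbols $f_x, f_0, g_0, g_x$ are plain stand-ins for the function values):

```python
import sympy as sp

# Solve f(x) = f(x)*g(0) + f(0)*(g(0) + g(x)) for g(x); the result is affine in f(x).
fx, f0, g0, gx = sp.symbols('f_x f_0 g_0 g_x')  # stand-ins for f(x), f(0), g(0), g(x)

sol = sp.solve(sp.Eq(fx, fx * g0 + f0 * (g0 + gx)), gx)[0]
print(sp.collect(sp.expand(sol), fx))
# -> f_x*(1 - g_0)/f_0 - g_0 (up to how SymPy groups the terms), i.e. alpha*f(x) + beta
```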

Thus, we have reduced the problem to solving the equation: $$ f(x+y+z) = \alpha( f(x)f(y)+f(y)f(z)+f(z)f(x) )+ \beta(f(x) + f(y) + f(z)),$$ where $\alpha,\beta$ are constants. The case $\alpha = 0$ essentially reduces to Cauchy's functional equation, and by continuity $f$ is then affine: $f(x) = ax + b$ for some $a,b$ (this is spelled out below); so suppose from now on that $\alpha \neq 0$.
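For completeness, the $\alpha = 0$ case can be spelled out as follows (using the continuity assumed in the question). The equation becomes $$ f(x+y+z) = \beta\bigl(f(x)+f(y)+f(z)\bigr).$$ Setting $y=z=0$ gives $f(x) = \beta f(x) + 2\beta f(0)$, so if $\beta \neq 1$ then $f$ is constant. If $\beta = 1$, then $f(0) = 0$, and setting $z = 0$ gives $f(x+y) = f(x) + f(y)$, Cauchy's equation; with continuity, $f(x) = ax$. Either way, $f$ is affine.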

To make things slightly simpler, suppose that $f$ is also differentiable. Then, differentiating with respect to $x$ we get: $$ f'(x+y+z) = \alpha f'(x) (f(y) + f(z) + \gamma) $$ where $\gamma = \beta/\alpha$. Let $\gamma_z = \gamma + f(z)$. Using the symmetry in the above equality, we get: $$\alpha f'(x) (f(y) +\gamma_z) = \alpha f'(y) (f(x) +\gamma_z) $$ which reduces to $$ \frac{f'(x)}{f(x) +\gamma_z} = \frac{f'(y)}{f(y) +\gamma_z}$$ whenever the division is possible. If we fix $z$ and $x_0$ such that $f(x_0) + \gamma_z \neq 0$, then in a neighbourhood of $x_0$, the function $h(x) := \frac{f'(x)}{f(x) +\gamma_z} $ is well defined and constant: $h(x) = c_z$, as long as this neighbourhood (together with the boundary) has no points $x$ with $f(x) = -\gamma_z$. (This can be done, unless $f$ is constant, but that case is trivial.)

Letting $f_z(x) := f(x) +\gamma_z$ we find that $f_z'(x) = c_z f_z(x)$, and consequently $f(x) = A_z e^{c_z x} - \gamma_z$ for some constant $A_z$ (both $A_z$ and $c_z$ may depend on $x_0$ implicitly). But then $f(x)$ is bounded away from $-\gamma_z$ on the neighbourhood we are considering. Hence, we can always extend the neighbourhood, and consequently we may assume that $f(x) = A_z e^{c_z x} - \gamma_z$ for all real $x$. Fix some choice of $z$; letting $A=A_z$, $c=c_z$, $d = -\gamma_z$ we get $$f(x) = A e^{cx} + d.$$
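The ODE step can be double-checked mechanically (a SymPy sketch; $F$ stands for $f_z$ and $c_z$ is treated as a nonzero constant):

```python
import sympy as sp

# The ODE f_z'(x) = c_z * f_z(x) has only exponential solutions.
x = sp.symbols('x')
c_z = sp.symbols('c_z', nonzero=True)
F = sp.Function('F')  # stands for f_z

print(sp.dsolve(sp.Eq(F(x).diff(x), c_z * F(x)), F(x)))
# -> Eq(F(x), C1*exp(c_z*x)), i.e. f(x) = A_z*exp(c_z*x) - gamma_z
```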

Now, we are in trouble. If we plug the form of $f$ obtained above into the functional equation we started with, and take $x=y=z$, we get: $$ Ae^{3cx} + d = 3 \alpha A^2 e^{2cx} + \dots,$$ where the dots hide lower-order exponential and constant terms. This is a contradiction, because the two sides have a different order of growth (recall that $A \neq 0$ and $c \neq 0$, since otherwise $f$ would be constant).
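The mismatch can also be seen by a blunt symbolic expansion (a sketch; the surviving $e^{3cx}$ term on the left has no counterpart on the right):

```python
import sympy as sp

# Substitute f(x) = A*exp(c*x) + d into the reduced equation with x = y = z and expand.
x, A, c, d, alpha, beta = sp.symbols('x A c d alpha beta')
f = lambda t: A * sp.exp(c * t) + d

lhs = f(3 * x)                                   # f(x + x + x)
rhs = 3 * alpha * f(x)**2 + 3 * beta * f(x)      # reduced right-hand side at x = y = z
print(sp.expand(lhs - rhs))
# The exp(3*c*x) term appears only in lhs, so the difference cannot vanish
# unless A = 0 (or c = 0), i.e. unless f is constant.
```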

Thus, it remains to check the case when $f$ is affine, and $g$ is affine as well. This is simple enough. Indeed, suppose that $f(x) = ax + b$, $g(x) = cx + d$. If $f$ is constantly $0$, then any $g$ will do, so suppose that this is not the case. If $f$ is constant but non-zero, then $g(x) = 1/3$ for all $x$. Suppose now that $a \neq 0$. Then we get from the functional equation: $$ a(x+y+z) + \dots = ac( xy + yz + zx) + \dots,$$ which is only possible when $c = 0$, so $g$ is constant, $g(x) = d$. But now we get: $$ a(x+y+z) + b = d( ax + ay + az + 3b).$$ Comparing the leading terms, we find that $d = 1$, and hence $b = 0$. Thus, $f(x) = ax,\ g(x) = 1$.
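Alternatively, the whole affine case can be handed to SymPy (a sketch): treating the functional equation as a polynomial identity in $x,y,z$ and comparing coefficients recovers exactly the families in the summary below.

```python
import sympy as sp

# Affine ansatz f(x) = a*x + b, g(x) = c*x + d; the functional equation must hold
# identically in x, y, z, so all coefficients of the difference vanish.
x, y, z, a, b, c, d = sp.symbols('x y z a b c d')
f = lambda t: a * t + b
g = lambda t: c * t + d

eq = sp.expand(f(x + y + z) - (f(x) * g(y) + f(y) * g(z) + f(z) * g(x)))
system = [sp.Eq(coef, 0) for coef in sp.Poly(eq, x, y, z).coeffs()]
print(sp.solve(system, [a, b, c, d], dict=True))
# Expected families: a = b = 0 (f = 0, g arbitrary); a = c = 0, d = 1/3 (f constant, g = 1/3);
# b = c = 0, d = 1 (f = a*x, g = 1).
```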

To sum up, the solutions are: $f(x) = ax,\ g(x) = 1$; $f(x) = b,\ g(x) = 1/3$; and $f(x) = 0$ with $g$ arbitrary.

ANSWER

If $f$ is a non-zero constant, then $g(x)=\frac{1}{3}$ (and if $f \equiv 0$, any $g$ works). Another solution, noted in the question, is $f(x)=cx$ and $g(x)=1.$ Assuming $f$ and $g$ are thrice differentiable, these are the only solutions.

(Spoiler alert.) The proof just uses differentiation of the functional equation.
Differentiating the equation in $x$, then in $z$, then in $y$, and then working back up:
\begin{eqnarray*} f(x+y+z)-f(x)g(y)-f(y)g(z)-f(z)g(x)&=&0\\ f'(x+y+z)-f'(x)g(y)-f(z)g'(x)&=&0\\ f''(x+y+z)-f'(z)g'(x)&=&0\\ f'''(x+y+z)&=&0\\ f''(\cdot)&=&c_1\\ f'(z)&=&c_1z+c_2\\ c_1&=&(c_1z+c_2)g'(x)\\ 0&=&c_1g'(x) \end{eqnarray*}
(the seventh line substitutes $f''=c_1$ and $f'(z)=c_1z+c_2$ back into the third, and the last line differentiates the seventh in $z$). Suppose $g$ is not constant. Then
\begin{eqnarray*} c_1&=&0\\ f''(\cdot)&=&0\\ f(x)&=&c_2x+c_3\\ c_2&=&c_2g(y)+(c_2z+c_3)g'(x)\\ 0&=&c_2g'(y)\\ c_2&=&0\\ f(x)&=&c_3 \end{eqnarray*}
So $f$ is constant. If $f$ is a non-zero constant, then $g\equiv\frac{1}{3}$ is constant after all, a contradiction; and $f\equiv 0$ is the trivial case noted above. Therefore, outside the trivial case, $g$ must be constant. Suppose $f$ is not constant. Then
\begin{eqnarray*} g'(x)&=&0\\ c_1&=&0\\ f''(\cdot)&=&0\\ f(x)&=&c_2x+c_3\\ c_2&=&c_2g(y)+(c_2z+c_3)g'(x)\;=\;c_2g(y)\\ g(\cdot)&=&1\\ f(0+0+0)&=&3f(0)\\ f(0)&=&0\\ f(x)&=&c_2x \end{eqnarray*}

This holds wherever $f$ and $g$ are thrice differentiable. It might be interesting to construct a solution from nowhere differentiable functions.
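For readers who want to double-check the differentiation chain mechanically, here is a small symbolic computation (a SymPy sketch, not part of the argument):

```python
import sympy as sp

# F is the defect of the functional equation; differentiate in x, then z, then y.
x, y, z = sp.symbols('x y z')
f, g = sp.Function('f'), sp.Function('g')
F = f(x + y + z) - f(x) * g(y) - f(y) * g(z) - f(z) * g(x)

print(sp.diff(F, x))        # = f'(x+y+z) - f'(x)g(y) - f(z)g'(x), in SymPy's Derivative/Subs notation
print(sp.diff(F, x, z))     # = f''(x+y+z) - f'(z)g'(x)
print(sp.diff(F, x, z, y))  # = f'''(x+y+z); since F vanishes identically, so does f'''
```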

ANSWER

Here's a proof equivalent to @Feanor's but expressed a little differently. Even continuity does not seem to be a required assumption.

The solutions are understood when $f$ or $g$ is constant, so assume neither is constant and derive a contradiction.

From $x=y=z=0$, we conclude $f(0)=3f(0)g(0)$, so we have either $f(0)=0$ or $g(0)=\frac{1}{3}$.

Case I: $f(0)=0$

From $y=z=0$ we have $f(x)=f(x)g(0)$, so $g(0)=1$.

From $y=-x,\;z=0$ we have $0=f(0)=f(x)g(-x)+f(-x)$, or $f(x)g(-x)=-f(-x)$.

From $z=-y$, and using $f(y)g(-y)=-f(-y)$ from the previous step, we have \begin{eqnarray*} f(x)&=&f(x)g(y)+f(y)g(-y)+f(-y)g(x)\\ f(x)[1-g(y)]&=&f(-y)[g(x)-1]\\ \frac{f(x)}{g(x)-1}&=&\frac{-f(-y)}{g(y)-1}=d \end{eqnarray*} The constant $d\neq 0$: if $d=0$, then picking any $y_0$ with $g(y_0)\neq 1$ in the identity above would force $f\equiv 0$, contrary to assumption. Let $b=1/d$. From this we conclude $g(x)=bf(x)+1$ and $f(x)+f(-x)=0$. (These hold even when $g(x)=1$.)

Finally, from $y=0,\;z=-x$ we have \begin{eqnarray*} 0=f(0)&=&f(x)g(0)+f(0)g(-x)+f(-x)g(x)\\ &=&f(x)[1-g(x)]\\ &=&-bf(x)^2 \end{eqnarray*} This is our contradiction: since $b\neq 0$, it forces $f\equiv 0$, but $f$ is not constant.
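The last computation can be confirmed symbolically (a sketch; $f_x$ is a plain symbol standing for $f(x)$, and we feed in $f(0)=0$, $f(-x)=-f(x)$, $g = bf+1$ as derived above):

```python
import sympy as sp

# Case I check with (x, y, z) = (x, 0, -x): the right-hand side reduces to -b*f(x)**2.
b, fx = sp.symbols('b f_x')
f0, f_mx = sp.Integer(0), -fx          # f(0) = 0 and f(-x) = -f(x)
g = lambda val: b * val + 1            # g at a point, expressed through f's value there

rhs = fx * g(f0) + f0 * g(f_mx) + f_mx * g(fx)   # f(x)g(0) + f(0)g(-x) + f(-x)g(x)
print(sp.expand(rhs))                  # -> -b*f_x**2, which must equal f(0) = 0
```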

Case II: $g(0)=\frac{1}{3}$

This is similar to Case I, so I'll spare the algebra. The cases are mutually exclusive: if $f(0)=0$, then we have seen that $g(0)=1$. Denote $c=f(0)\neq 0$.

From $y=z=0$ we have $g(x)=\frac{2}{3c}f(x)-\frac{1}{3}$.

From $z=-y$ we can derive $f(y)+f(-y)=2c$.

Finally, from $y=0,\;z=-x$ we derive $0=(f(x)-c)^2$, our contradiction.
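For completeness, here is the spared algebra checked symbolically (a sketch; $f_x$ stands for $f(x)$, and we use $f(0)=c$, $f(-x)=2c-f(x)$ and $g=\frac{2}{3c}f-\frac{1}{3}$ as above):

```python
import sympy as sp

# Case II check with (x, y, z) = (x, 0, -x): the defect is a multiple of (f(x) - c)^2.
c, fx = sp.symbols('c f_x')
f0, f_mx = c, 2 * c - fx                                 # f(0) = c and f(-x) = 2c - f(x)
g = lambda val: 2 * val / (3 * c) - sp.Rational(1, 3)    # g at a point via f's value there

defect = fx * g(f0) + f0 * g(f_mx) + f_mx * g(fx) - f0   # RHS minus f(0)
print(sp.factor(defect))               # -> -2*(f_x - c)**2/(3*c), up to ordering of factors
```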