If $f(f(x)) = x^2-1$, find $f(x)$. If there is more than one solution, find the family of functions that satisfies the equation.
240 Views Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
I assume $f$ is a real function. If $a$ is a fixed point of $f$, then it is also a fixed point of $f \circ f$, so it must satisfy $a=a^2-1$, giving $a=\frac{1\pm\sqrt{5}}{2}$.
Assuming $f$ is differentiable, we can take the derivative.
$2x = \frac{d}{dx} (f(f(x))) = f'(f(x))f'(x)$
Plugging in a fixed point we get
$2a = f'(f(a))f'(a) = f'(a)f'(a) = (f'(a))^2$
$\implies f'(a) = \pm\sqrt{2a} = \pm\sqrt{1\pm\sqrt{5}}$
Note that $2a = 1-\sqrt{5} < 0$ at the negative fixed point, so a real slope can exist only at $a = \frac{1+\sqrt{5}}{2}$.
That way we can linearly approximate $f$ near that fixed point:
$f(a+h) \approx f(a)+f'(a)h = a \pm \sqrt{1+\sqrt{5}}\, h$
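As a quick numeric sanity check (a sketch of my own, not part of the original argument), we can confirm the two fixed points and the candidate slope $f'(a) = \sqrt{2a}$:

```python
from math import sqrt

# Fixed points a of x^2 - 1 solve a = a^2 - 1, i.e. a = (1 +/- sqrt(5))/2.
a_plus = (1 + sqrt(5)) / 2    # ~  1.618 (the golden ratio)
a_minus = (1 - sqrt(5)) / 2   # ~ -0.618

for a in (a_plus, a_minus):
    assert abs(a**2 - 1 - a) < 1e-12   # both really are fixed points

# (f'(a))^2 = 2a, so a real slope exists only at the positive fixed point:
slope = sqrt(2 * a_plus)               # = sqrt(1 + sqrt(5))
assert 2 * a_minus < 0                 # no real f'(a) at the negative one
```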
There is no reason to think there is an entire holomorphic solution; it is probably proved impossible in the paper by Rice et al. linked by csx.
Thursday: let me emphasize that there are at least three distinct problems here. The first is a solution holomorphic or meromorphic on the entire plane, which is rarely possible. The second is a solution real analytic on the whole real line: rare, but possible when the function has no real fixpoints and no critical points, or only one fixpoint; the famous example is H. Kneser's solution for $e^x$ on the whole real line. The third, often possible, is to solve on a semi-infinite interval whose endpoints are fixpoints or critical points of the function; derivative $0$ is always a problem.
On the other hand, there is a fixpoint of $x^2-1$ at $x = \frac{1 + \sqrt 5}{2}.$ The derivative there is neither $0$ nor of absolute value $1.$ As a result, there is a real analytic solution around the fixpoint by the method of Schröder. This will extend to an analytic solution for $x > 0.$ However, there will be no way to include $x=0.$ The most accessible source for this is D. S. Alexander, A History of Complex Dynamics. Found it, pages 46-47, section 3.5, section title Koenigs' Solution of the Schröder Equation.
THURSDAY: not difficult to indicate how the Schröder business works, to the point where one could program it, draw the graph, and so on. It is necessary to switch from $x^2-1$ to the inverse function $\sqrt {x+1}$ because we need the derivative at the fixpoint to be between $0$ and $1.$
Define $$ f(x) = \sqrt {x+1}, $$ $$ f^{[2]}(x) = f(f(x)), \; \; \; f^{[3]}(x) = f(f(f(x))), \ldots $$ Next define $$ \sigma(x) = \lim_{n \rightarrow \infty} \; \left(\sqrt 5 + 1 \right)^n \; \left( f^{[n]}(x) - \frac{1+\sqrt 5}{2} \right). $$ (The fixpoint must be subtracted before scaling, otherwise the limit diverges.) The proof that this is real analytic for $x > -1$ is due to Koenigs, 1884; in fact it is holomorphic in a sector containing that portion of the real line, on which we are able to use the principal square root. In the third edition of Dynamics in One Complex Variable, John Milnor has this on page 77 and calls it the Koenigs Linearization Theorem. Here we see the reason for using the inverse function: we need an attracting fixed point.
Where was I? Our $\sigma(x)$ satisfies the Schröder equation, $$ \sigma(f(x)) = s \sigma(x), $$ where $s$ is the derivative of $f$ at the fixpoint, in our case $s = 1 / (\sqrt 5 + 1).$
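The limit defining $\sigma$ can be approximated by simply truncating it. Here is a minimal numerical sketch in Python (the names `f`, `sigma`, `PHI`, `S` and the choice $n=16$ are mine; $n$ is kept moderate because the factor $(\sqrt5+1)^n$ also amplifies roundoff):

```python
from math import sqrt

PHI = (1 + sqrt(5)) / 2   # attracting fixed point of f(x) = sqrt(x + 1)
S = 1 / (sqrt(5) + 1)     # f'(PHI) = 1/(2*PHI), the Schroeder multiplier

def f(x):
    return sqrt(x + 1)

def sigma(x, n=16):
    # Truncated Koenigs limit: (sqrt(5)+1)^n * (f^[n](x) - PHI).
    for _ in range(n):
        x = f(x)
    return (sqrt(5) + 1) ** n * (x - PHI)

# Check the Schroeder equation sigma(f(x)) = s * sigma(x) at sample points:
for x in (0.0, 1.0, 2.5, 10.0):
    assert abs(sigma(f(x)) - S * sigma(x)) < 1e-5
```

The final loop verifies the functional equation to several digits; pushing $n$ much higher in double precision trades truncation error for amplified rounding error.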
The penultimate step is $$ g(x) = \sigma^{-1} \left( \sqrt s \; \sigma(x) \right), $$ so that $$ g(g(x)) = \sigma^{-1} \left( s \sigma(x) \right) = \sigma^{-1} \left( \sigma(f(x)) \right) = f(x). $$
Finally $$ h(x) = g^{-1}(x) $$ and $h(h(x)) = x^2 - 1.$ Here $h$ is defined for $x>0.$