This difference equation $$ f(x)^2 = 1 + x f(x + 1) $$ can pop up when looking at a famous problem posed by Ramanujan.
This equation is equivalent to the following infinitely nested radical expression $$ f(x) = \sqrt{1 + x \sqrt{1 + (x+1) \sqrt{1 + (x+2)\sqrt{\cdots}}}} $$ assuming that we take the positive branch of all of the radicals.
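As a quick numerical sanity check (a sketch of my own, not from the original problem: the function name, the truncation depth, and the choice of seeding the innermost tail with $1$ are all arbitrary), one can evaluate the truncated nested radical and watch it approach $1 + x$:

```python
import math

def nested_radical(x, depth=40):
    """Evaluate sqrt(1 + x*sqrt(1 + (x+1)*sqrt(...))) truncated at
    `depth` levels, taking the positive branch of every radical.
    The innermost tail is replaced by 1 (an arbitrary truncation)."""
    value = 1.0
    for k in reversed(range(depth)):
        value = math.sqrt(1.0 + (x + k) * value)
    return value

for x in [0.5, 2.0, 3.0]:
    print(x, nested_radical(x))  # ≈ 1 + x
```

For $x = 2$ this is exactly Ramanujan's radical, and the truncation converges very quickly because the error is roughly halved at each square root.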
Ok, what I would like to prove or disprove is whether the only analytic (or suitably weaker regularity) solution is $f(x) = 1+x$. It is easy enough to show that if $f$ is differentiable, then $f(N) = 1+N$ for every $N\in\mathbb{Z}$.
You can also show that for all integers $N$, $$ f'(N) = a N 2^N + 1 $$ for some $a\in\mathbb{R}$, by solving the recurrence relation obtained by differentiating the difference equation above. It is clear that $f(x) = 1+x$ is a solution, but I am trying to figure out if it is the only one.
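One way to see this (a sketch, using only the values $f(N) = 1+N$ already established): differentiating the difference equation and evaluating at an integer $N$ gives $$ 2f(x)f'(x) = f(x+1) + xf'(x+1) \quad\implies\quad 2(N+1)f'(N) = (N+2) + Nf'(N+1). $$ Writing $f'(N) = 1 + u_N$, this reduces to $$ N u_{N+1} = 2(N+1)u_N, $$ which is solved by $u_N = aN2^N$ (the constant $a$ being fixed by $u_1 = f'(1) - 1$), recovering the formula for $f'(N)$.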
Let $g : (0,1] \to \Bbb R$ be an arbitrary function. Define $$f(x) = \begin{cases}1&x = 0\\g(x)&x\in(0,1]\\\dfrac{f^2(x-1) - 1}{x-1} & x\in(1,\infty)\end{cases}$$ where the last case is applied inductively. If desired, the recursion can also be adapted to define $f$ on $(-\infty, 0)$. Because of the recursion, the relation $f^2(x) = 1+xf(x+1)$ holds for all $x \ge 0$.
Thus you see, there are uncountably many functions satisfying the equation. Even if we require $f$ to be infinitely differentiable, this remains true: we just need $g$ to be infinitely differentiable and to satisfy compatibility conditions relating its behavior as $x \to 0^+$ to its derivatives at $x = 1$, so that the pieces join smoothly at the integers.
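The construction above is easy to check numerically. Here is a minimal sketch: `make_f` and the particular seed $g(x) = 1 + x + x(1-x)$ are my own hypothetical choices; any seed on $(0,1]$ works for the bare functional equation.

```python
def make_f(g):
    """Build a solution of f(x)^2 = 1 + x*f(x+1) on [0, oo)
    from an arbitrary seed function g on (0, 1], following the
    piecewise definition: f(0) = 1, f = g on (0,1], and
    f(x) = (f(x-1)^2 - 1)/(x-1) for x > 1, applied inductively."""
    def f(x):
        if x == 0:
            return 1.0
        if 0 < x <= 1:
            return g(x)
        # x > 1: unwind the recursion one step at a time
        return (f(x - 1) ** 2 - 1) / (x - 1)
    return f

# hypothetical seed choice; any function on (0, 1] would do
f = make_f(lambda x: 1 + x + x * (1 - x))

# check the functional equation at a few non-integer points
for x in [0.3, 1.7, 2.4]:
    print(x, f(x) ** 2, 1 + x * f(x + 1))  # the two columns agree
```

The two printed columns agree (up to floating-point error) because $f(x+1)$ is defined precisely as $(f(x)^2 - 1)/x$, so the relation holds by construction, independently of the seed.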
It is only analyticity, specifically analyticity about $x = 1$, that is a strong enough requirement to actually tame this beast.
The nested square root expression either converges to one particular function satisfying the recursion formula, or else it diverges. It is therefore not fully equivalent to the recursion formula itself.