A couple of years ago I found myself repeatedly pressing $\sqrt{1+\text{ans}}$ into my calculator, astonished that the answers approached the golden ratio. I dug deeper into this and realized that the limit of the sequence given by the recurrence:
$$x_{n+1}=f(x_n)$$
is, if the limit exists, a solution to:
$$f(x)=x$$
Applying this, I realized I could solve a great many algebra problems numerically. Here is one that I came across:
$$x^x=c$$
Such an equation can easily be rearranged as:
$$x=\sqrt[x]{c}$$
where the problem now becomes finding the limit of the sequence:
$$x_{n+1}=\sqrt[x_n]{c}$$
This method worked for some values of $c$ and $x_1$, such as $c=2$ and $x_1=1.5$. But for other values of $c$, such as $c=100$, the sequence seems to alternate rather than converge (I tried $x_1=3.5$ and other starting values, but the choice doesn't seem to matter as long as it is positive). Can anyone help me figure out for which $c$ this sequence converges for every given $x_1>0$?
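Both behaviours are easy to reproduce with a short script. A minimal Python sketch (the helper name `iterate` is purely illustrative):

```python
def iterate(c, x1, n):
    """Run x_{k+1} = c ** (1 / x_k) and return the last two iterates."""
    prev, x = x1, x1
    for _ in range(n):
        prev, x = x, c ** (1.0 / x)
    return prev, x

# c = 2 settles down: successive iterates agree and x ** x is close to 2
p2, x2 = iterate(2, 1.5, 200)

# c = 100 keeps alternating: successive iterates remain far apart
p100, x100 = iterate(100, 3.5, 200)
```

For $c=2$ the iterates converge to about $1.5596$, while for $c=100$ they bounce between two values far apart, exactly the alternation described above.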
What you discovered is called a fixed point iteration.
A fixed point iteration is defined by the sequence \begin{align*}x_{1}&=\text{given}\\x_{n+1}&=f(x_{n})\text{ for }n>0.\end{align*} We say this iteration converges when $x_{n}\rightarrow x$ for some $x$. The Banach fixed point theorem (a.k.a. the contraction mapping principle) gives sufficient conditions for when a fixed point exists, is unique, and can be computed by a fixed point iteration (i.e. as the limit of the $x_{n}$). You should read the article on Wikipedia to familiarize yourself with the ideas.
For simplicity, let's now assume $X=\mathbb{R}$ instead of an arbitrary complete metric space, which is the usual setting in the statement of the Banach fixed point theorem. One consequence of the theorem is the following: if $Y\subset X$ is such that $f$ is continuously differentiable on $Y$, $\sup_{Y}|f^{\prime}|\leq K$ for some $K<1$, and $f(Y)\subset Y$, then $f$ has a unique fixed point on $Y$, which can be computed by a fixed point iteration initialized in $Y$ (i.e. $x_{1}\in Y$).
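The contraction hypothesis $\sup_{Y}|f^{\prime}|\leq K<1$ can be sanity-checked numerically on a bounded window by sampling the derivative. A rough sketch (it samples rather than proves; the helper name is mine):

```python
import math

def estimate_sup_deriv(f, a, b, n=10000, h=1e-6):
    """Estimate sup |f'| on [a, b] via central differences on a grid."""
    return max(
        abs(f(t + h) - f(t - h)) / (2 * h)
        for t in (a + (b - a) * k / n for k in range(n + 1))
    )

# For f(x) = sqrt(1 + x), |f'(x)| = 1 / (2 * sqrt(1 + x)) peaks at x = 0,
# so the estimate should come out close to K = 1/2 < 1.
K = estimate_sup_deriv(lambda x: math.sqrt(1 + x), 0.0, 10.0)
```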
In your first example, you have $f(x)=\sqrt{1+x}$. Let $Y=[0,\infty)$. Noting that $|f^{\prime}(x)|<1$ on $Y$ (check) and $f(Y)\subset Y$ trivially, it follows that for any $x_{1}\in Y$, $x_{n}\rightarrow x$ where $x$ is the unique fixed point of $f$ on $Y$. In fact, this fixed point must satisfy $x=f(x)=\sqrt{1+x}$, or equivalently, since $x\geq0$, $x^{2}=1+x$. This quadratic equation has two roots: $x=1/2\pm\sqrt{5}/2$. The positive root is, as you pointed out, the golden ratio $1.61803\ldots$
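As a quick numerical check of this, here is a Python sketch of the iteration (assuming nothing beyond the formulas above):

```python
import math

def fixed_point(f, x1, n=100):
    """Fixed point iteration x_{k+1} = f(x_k); returns the n-th iterate."""
    x = x1
    for _ in range(n):
        x = f(x)
    return x

phi = fixed_point(lambda x: math.sqrt(1 + x), 0.0)
golden = (1 + math.sqrt(5)) / 2  # 1.61803...
```

Since $|f'|\leq 1/2$ on $Y$, each iteration roughly halves the error (in fact the rate near the fixed point is about $0.31$), so $100$ iterations converge to machine precision.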
Your other problem involves $f(x)=\sqrt[x]{c}$. It is not so clear under which conditions this satisfies the hypotheses of the Banach fixed point theorem, and in fact, as you noticed, there are choices of $c$ for which the iteration does not converge.
Do not despair, though: there are better ways to compute solutions to your equation. One that comes to mind is Newton's method, which you should take a look at.
In fact, a solution to your problem is given by the Lambert W function: taking logarithms in $x^x=c$ gives $x\ln x=\ln c$, i.e. $\ln x\,e^{\ln x}=\ln c$, so $\ln x=W(\ln c)$ and $$x = e^{W(\ln c)} = \frac{\ln c}{W(\ln c)}.$$ Values of the Lambert W function are, in fact, often computed by Newton's method, or higher-order variants of it.
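As an illustration, $W(\ln c)$ can itself be computed with a few Newton steps on $w e^{w}=\ln c$. A minimal Python sketch (the helper names are my own; this handles $c>1$, i.e. $\ln c>0$, and makes no claim about other branches of $W$):

```python
import math

def lambert_w(y, tol=1e-14):
    """Solve w * exp(w) = y for y > 0 by Newton's method."""
    w = 1.0  # rough starting guess, fine for y > 0
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - y) / (ew * (1.0 + w))
        w -= step
        if abs(step) < tol:
            break
    return w

def solve_xx(c):
    """Solve x ** x = c for c > 1 via x = ln(c) / W(ln(c))."""
    L = math.log(c)
    return L / lambert_w(L)

x = solve_xx(100)  # the value the naive fixed point iteration failed to reach
```

Since $w\mapsto we^{w}$ is increasing and convex for $w>0$, Newton's method converges from any positive starting guess, and quadratically so near the root.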
Addendum
Newton's method for $g(x)=x^{x}-c$ is given by \begin{align*}x_1&=\text{given}\\x_{n+1}&=f(x_{n})\text{ for }n>0\end{align*} where $$f(x)=x+\frac{cx^{-x}-1}{1+\ln x}. $$ Since $g^{\prime\prime}(x)\geq0$ for $x>0$ and $f(x)>1/e$ for $x>1/e$ (check), it follows that Newton's method converges whenever $x_{1}>1/e$ and $x=f(x)$ has a solution with $x>1/e$ (see this). This is true at least when $c\geq1$, so we can conclude: for every $c\geq1$, Newton's method converges to the solution of $x^{x}=c$ from any starting point $x_{1}>1/e$.
Here is a short implementation of the iteration:
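A minimal Python sketch of this Newton iteration (the function name is mine; the update is exactly the $f$ given above):

```python
import math

def newton_xx(c, x1, tol=1e-12, max_iter=500):
    """Newton's method for g(x) = x**x - c, using the update
    x <- x + (c * x**(-x) - 1) / (1 + ln x)."""
    x = x1
    for _ in range(max_iter):
        step = (c * x ** (-x) - 1.0) / (1.0 + math.log(x))
        x += step
        if abs(step) < tol:
            break
    return x

root = newton_xx(100, 3.5)  # converges where the naive iteration alternated
```

With $c=100$ and $x_1=3.5$, the very starting point for which the fixed point iteration oscillated, this converges in a handful of steps to $x\approx3.5973$.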