I made this observation in my discrete math course a while back. I explored it further online, so not all of the ideas here are mine alone, but I am still confused about some things. Consider the equation $x^2-x-1=0$, whose positive root is the golden ratio $\phi$. Rewriting the equation we get $x^2=x+1$, and for nonzero $x$ we can divide to get $x=1+\frac1x$. Now $x$ is defined recursively in terms of itself, so plugging it back into this defining equation we get $x=1+\frac1{1+\frac1x}$, and by continuing indefinitely we produce a continued fraction that converges exactly to $\phi$. Returning to the quadratic, we see also that if $x^2=x+1$, then taking the positive square root gives $x=\sqrt{x+1}$; since $x$ is again defined recursively in terms of itself, we can plug it back into the equation to get $x=\sqrt{\sqrt{x+1}+1}$, and so on, producing an infinitely iterated root that also converges to $\phi$.
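Both recursions can be checked numerically. Here is a minimal Python sketch (the helper name `iterate` is my own) that applies each map repeatedly starting from $x_0=1$ and compares the results with $\phi=\frac{1+\sqrt5}{2}$:

```python
import math

def iterate(f, x, n):
    """Apply the map f to x a total of n times."""
    for _ in range(n):
        x = f(x)
    return x

phi = (1 + math.sqrt(5)) / 2  # positive root of x^2 - x - 1 = 0

# Continued fraction: iterate x -> 1 + 1/x
cf = iterate(lambda x: 1 + 1 / x, 1.0, 60)

# Nested radical: iterate x -> sqrt(x + 1)
nr = iterate(lambda x: math.sqrt(x + 1), 1.0, 60)

print(cf, nr, phi)  # all three agree to many decimal places
```

Both sequences settle to $1.6180339887\ldots$; near the limit the continued-fraction map contracts errors by a factor of about $1/\phi^2\approx0.38$ per step, and the root map by about $1/(2\phi)\approx0.31$.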
What is this process/idea called where you find a recursion that 'solves itself'? By that I mean that we had an expression for $x$ in terms of itself, which can be used iteratively to compute $x$. How can one recognize that iterated roots or fractions can be rewritten as polynomials and vice versa?
Also, I find this argument of 'plugging $x$ into itself' to be rather ad hoc and non-rigorous. Is there a more rigorous framework in which one could view this kind of thing? And is there a way to tweak this idea so that a recursion for the negative root of the quadratic can be found?
You are discovering fixed point iteration, which can be a powerful technique. You arrange your equation in the form $x=f(x)$, then start with some $x_0$ that you hope is close enough to the solution and iterate $x_{n+1}=f(x_n)$ to convergence. If $f$ is differentiable at the limit $L$ and $|f'(L)| \lt 1$, then the iteration converges in some neighborhood of $L$: intuitively, the distance from the limit is multiplied by something close to $f'(L)$ at each step. There are two parts that require cleverness: you need to find a way of arranging your equation so that $|f'(L)| \lt 1$, and you need to find a starting value that is close enough. If the process converges, you have $f(L)=L$. Many times this equation lets us find the limit, which is the opposite direction from your approach.

For $x^2-x-1=0$ the quadratic formula gets us there, but suppose we wanted to solve $x^{10}+x-1=0$. If $x$ is a little smaller than $1$, then $x^{10}$ will be a lot smaller than $1$, so the solution is close to $x=1$ but a little smaller. So I write the recursion as $x_{n+1}=\sqrt[10]{1-x_n}$ and start with $x_0=0.99$. It converges quickly to $x\approx 0.835079069$. A spreadsheet makes this easy; copy-down is your friend.
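The same recipe runs in a few lines of Python. This is only a sketch of the iteration described above; the function name `fixed_point`, the tolerance, and the iteration cap are my own choices:

```python
def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x_{n+1} = f(x_n) until successive iterates agree to within tol."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# Solve x^10 + x - 1 = 0, rearranged as x = (1 - x)^(1/10)
root = fixed_point(lambda x: (1 - x) ** 0.1, 0.99)
print(root)                   # approximately 0.835079069
print(root ** 10 + root - 1)  # residual, approximately 0
```

Here $f(x)=(1-x)^{1/10}$ has $|f'(L)|\approx 0.5$ at the root, so each step roughly halves the error and a few dozen iterations reach machine precision.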