Consider the polynomial equation $x^2 + y^2 = 1$. We know that the solutions to this equation form the unit circle, but for the moment assume that we are only given that $(0,1)$, $(0,-1)$, $(1,0)$, $(-1,0)$ are solutions, and pretend that solving the original equation is intractable for some reason.
Then it seems plausible that one could find new solutions by taking infinitesimal steps from known solutions. For example, implicit differentiation tells us that on the solution curve we have $2x + 2y \frac{dy}{dx} = 0$. At the solution point $(0,1)$ this gives $2 \cdot 0 + 2 \cdot 1 \frac{dy}{dx} \bigg \rvert_{(0,1)} = 0 \implies \frac{dy}{dx} \bigg \rvert_{(0,1)} = 0$. This suggests that stepping a very small distance in the $x$ direction from $(0,1)$ should land very close to a new solution.
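To make the intuition concrete, here is a minimal numerical sketch (in Python, with the step size `eps` chosen arbitrarily): step along the tangent line at $(0,1)$ and check how badly the new point violates the equation.

```python
import math

# Known solution of x^2 + y^2 = 1; implicit differentiation
# gives dy/dx = -x/y wherever y != 0.
x, y = 0.0, 1.0
eps = 1e-3  # small step in the x direction (arbitrary choice)

# Step along the tangent line; at (0, 1) the slope is 0.
dydx = -x / y
x_new, y_new = x + eps, y + eps * dydx

# Residual: how far the stepped point is from satisfying the equation.
residual = x_new**2 + y_new**2 - 1.0
print(residual)  # O(eps^2), i.e. about 1e-6 here
```

The residual shrinks quadratically in the step size, which is exactly the sense in which a small tangent step brings one "very close" to a new solution.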
Another way of phrasing this idea would be to consider a perturbation of a known solution $(x_s, y_s)$. Then $(x_s + \epsilon_x)^2 + (y_s + \epsilon_y)^2 = 1$, and since $x_s^2 + y_s^2 = 1$ the constant terms cancel, leaving $2 x_s \epsilon_x + \epsilon_x^2 + 2 y_s \epsilon_y + \epsilon_y^2 = 0$. Plugging in the solution point $(0,1)$ we get $\epsilon_x^2 + 2 \epsilon_y + \epsilon_y^2 = 0$, which can be solved for $\epsilon_y$ in terms of $\epsilon_x$, yielding new solutions of the form $(0 + \epsilon_x, 1 + \epsilon_y)$.
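In this example the perturbation equation can be solved exactly: by the quadratic formula, $\epsilon_y = -1 \pm \sqrt{1 - \epsilon_x^2}$, and the branch with $\epsilon_y$ near $0$ gives genuinely new solutions (not just approximations). A quick check in Python, with `eps_x` chosen arbitrarily:

```python
import math

# From the solution (x_s, y_s) = (0, 1), the perturbation equation is
# eps_y^2 + 2*eps_y + eps_x^2 = 0. The quadratic formula gives
# eps_y = -1 +/- sqrt(1 - eps_x^2); take the branch near eps_y = 0.
eps_x = 0.3  # arbitrary perturbation with |eps_x| <= 1
eps_y = -1.0 + math.sqrt(1.0 - eps_x**2)

# The perturbed point satisfies the original equation (up to rounding).
x_new, y_new = 0.0 + eps_x, 1.0 + eps_y
print(x_new**2 + y_new**2)  # 1.0
```

Here the perturbation is exact because the system is a single quadratic; for more complicated polynomial systems one would presumably only get the leading-order correction this way.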
It seems like these sorts of ideas could be generalized by talking about (local?) transformation groups that map solution regions into solution regions (at least locally). I'm particularly interested in applying this to systems of polynomial equations where I know a number of special-case solutions.
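For the unit circle the transformation-group picture is explicit: the rotation group $SO(2)$ maps the solution set to itself, so acting on one known solution generates a whole family. A minimal sketch in Python (just illustrating the group action, not the general theory):

```python
import math

def rotate(x, y, theta):
    """Rotate the point (x, y) by angle theta about the origin.

    Rotations preserve x^2 + y^2, so they map solutions of
    x^2 + y^2 = 1 to other solutions.
    """
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

x, y = 1.0, 0.0  # a known solution
for theta in (0.1, 1.0, 2.5):
    xr, yr = rotate(x, y, theta)
    assert abs(xr**2 + yr**2 - 1.0) < 1e-12  # still a solution
```

The infinitesimal-step idea above is then the infinitesimal version of this group action (the tangent vector at a point is the generator of the rotation applied there), which is presumably what the general theory makes precise.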
I've tried Googling around for this sort of idea but have been unsuccessful at finding the standard introductory theory. Any recommendations for resources that discuss symmetries of solution sets of polynomial systems would be appreciated. (I know a bit of analysis / topology / group theory, but anything beyond a first course in these topics is over my head, so introductory-level resources would work best for me.)