For example, given x(t)=3t−t^3 and y(t)=3t^2, the curve intersects itself at (0, 9), with t=sqrt(3) and t=-sqrt(3). This solution is easy to find by graphing: there is visibly one intersection at (0, 9), and since |x(t)| grows without bound as t approaches infinity or negative infinity, no other intersection can exist.
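As a quick numerical sanity check (a minimal sketch; the function names are mine), evaluating the parametrization at t = ±sqrt(3) confirms both parameter values land on (0, 9):

```python
import math

def x(t):
    # x(t) = 3t - t^3
    return 3 * t - t ** 3

def y(t):
    # y(t) = 3t^2
    return 3 * t ** 2

s = math.sqrt(3)
# Both parameter values map to the same point, up to floating-point error.
print((x(s), y(s)))    # near (0.0, 9.0)
print((x(-s), y(-s)))  # near (0.0, 9.0)
```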
Another example would be x(t)=cos(t) and y(t)=sin(t), which traces a circle. There are uncountably many (x, y) points where self-intersections exist, and at each such point there are countably many t values (differing by multiples of 2*pi) that land on it.
My question is whether there exists a general method for finding all self-intersections of a parametric curve of any possible form. This method should find the single coordinate of intersection and the corresponding pair of t values in the first example, and it should also find the uncountably infinite set of coordinates (each with its countably infinite set of t values) in the second example.
I'm intentionally not giving a restriction on number of dimensions (I used 2 for the examples, but a solution for 1d or kd would be interesting too) and also no restriction on the domain of t (being able to go from -infinity to infinity would be amazing, but an analysis on just a bounded range would be nice too). It's probably reasonable to assume the curve is continuous and differentiable at all points though.
Intuitively, I feel like a robust general solution probably does not exist, because the space of possible equations, with both finite and infinite solution sets, is too large. But I cannot justify this intuition much beyond that.
Another angle to look at the problem is from a computational perspective. Could there perhaps exist some (stochastic?) algorithm that is able to search the (probably bounded) domain of t to find said intersections (while maybe allowing some degree of error)?
Any ideas on how to approach proving or disproving the existence of such a general solution, pointers to areas of mathematics or papers that may provide insight, or suggestions for designing an algorithm to find intersections would be very much appreciated.
I know that this question is a bit vague, but I hope I was able to capture the essence of what I'm asking. If there are ways to make my question more specific and precise, please let me know.
It boils down to trying to solve the following two equations for $t \neq t'$:
$$x(t) = x(t'), \qquad y(t) = y(t')$$
Or, equivalently, solving $x(t) - x(t') = y(t) - y(t') = 0$.
In general, this is as easy or as difficult as solving any pair of simultaneous equations. That said, one approach that works (at least in theory) is to consider the function
$$f(t, t') = \left(x(t) - x(t')\right)^2 + \left(y(t) - y(t')\right)^2$$
$f$ is always non-negative, and it equals 0 exactly when both of our equations are satisfied simultaneously. So the problem of finding the intersection points becomes one of finding the zeros of $f$, for which there are any number of methods (e.g. multivariate versions of Newton-Raphson), as long as $f$ is sufficiently well behaved.
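As an illustration (a minimal sketch under my own naming; `newton_self_intersection` is a hypothetical helper, not a library function), here is a plain two-variable Newton iteration applied directly to the system $x(t) = x(t')$, $y(t) = y(t')$, tried on the cubic example from the question:

```python
def newton_self_intersection(x, y, dx, dy, t0, s0, tol=1e-12, max_iter=50):
    """Newton's method on F(t, s) = (x(t) - x(s), y(t) - y(s)) = 0.

    Needs a starting guess with t0 != s0, since the Jacobian is
    singular along the diagonal t = s."""
    t, s = t0, s0
    for _ in range(max_iter):
        f1 = x(t) - x(s)
        f2 = y(t) - y(s)
        # 2x2 Jacobian: rows are gradients of f1, f2 w.r.t. (t, s).
        a, b = dx(t), -dx(s)
        c, d = dy(t), -dy(s)
        det = a * d - b * c
        # Solve J * (dt, ds) = (f1, f2) by Cramer's rule and step.
        dt = (d * f1 - b * f2) / det
        ds = (a * f2 - c * f1) / det
        t, s = t - dt, s - ds
        if abs(dt) + abs(ds) < tol:
            break
    return t, s

# Example curve x(t) = 3t - t^3, y(t) = 3t^2, with its derivatives.
t, s = newton_self_intersection(
    lambda t: 3 * t - t ** 3, lambda t: 3 * t ** 2,
    lambda t: 3 - 3 * t ** 2, lambda t: 6 * t,
    t0=1.5, s0=-1.2)
print(t, s)  # converges to sqrt(3), -sqrt(3)
```

As with any local method, this only finds one root per starting guess; a global search would have to seed it from many points of the $(t, t')$ domain.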
Of course, there's one glaring problem - we already know that $f$ is zero all the way along the line $t = t'$. Again, depending on how well behaved $f$ is, you might be able to deal with this by instead finding the zeros of
$$\frac{f(t, t')}{(t - t')^n}$$
for a suitable value of $n$ ($n = 2$ is the natural choice here, since each squared term vanishes to at least second order along the diagonal), since this should remove the spurious zero set $t = t'$.
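For instance, with the cubic example above both differences factor cleanly:
$$x(t) - x(t') = (t - t')\left(3 - (t^2 + tt' + t'^2)\right), \qquad y(t) - y(t') = 3(t - t')(t + t'),$$
so dividing $f$ by $(t - t')^2$ leaves
$$\frac{f(t, t')}{(t - t')^2} = \left(3 - (t^2 + tt' + t'^2)\right)^2 + 9(t + t')^2,$$
which vanishes exactly when $t + t' = 0$ and $t^2 + tt' + t'^2 = 3$, i.e. at $t = -t' = \pm\sqrt{3}$, recovering the self-intersection from the question.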
Generally, though, it will probably be easiest to solve the specific parametric equations directly, making use of their particular quirks.