Consider a general surface and a line in $\mathbb{R}^3$. Given equations for both the surface and line, is there a way to analytically determine the number of times the line intersects the surface?
I am only interested in the number of intersections, not the precise locations of intersection. I know that there are methods to calculate this numerically by first finding where the intersections are, but I want to know if there is a simple method to just calculate the number of intersections.
The surface might be given implicitly as $F(x, y, z)=0$ or parametrically as $\vec{r}(u, v)=x(u, v) \hat{\imath}+y(u, v)\hat{\jmath}+z(u, v)\hat{k}$, and may or may not be closed. The line might be given as $\vec{r}(t)=\vec{r}_0+t \vec{d}$ where the line passes through the point $\vec{r}_0$ and $\vec{d}$ is the direction vector. The line extends infinitely in both directions, but I can also work with the case in which the line has finite endpoints.
EDIT: As pointed out in comments below, I am specifically working with the surface given by an implicit equation of the form
$$ F(x, y, z)=x^k+y^k+z^k-1 $$
where the terms involving $x$, $y$, and $z$ can have constant coefficients. Specifically, I am interested in the region $x, y \geq 0$. The exponent $k$ can be a general rational number greater than $2$. One solution suggested below, involving the Sturm sequence, works simply enough for integer $k$: substituting the equation of the line yields a polynomial in just the parametric variable.
This surface is part of a shape called a superellipsoid, but taken only for $x, y \geq 0$ to simplify $F(x, y, z)$. In my application, I can make some geometric and symmetry arguments so that I need only consider this portion.
Substituting a parametrization of the line into an implicit equation for the surface, the problem becomes one of counting the number of zeros of a function of one variable. If the resulting function is a polynomial, or rational (clear denominators first), you can do this using Sturm's theorem. For more general classes of functions, the problem can be unsolvable: see Richardson's theorem. For example, Laczkovich showed there is no algorithm that, given a function $A(x)$ in the ring generated by the integers, $x$, $\sin(x^n)$ and $\sin(\sin(x^n))$, decides whether there is a real solution of $A(x)=0$.
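For the integer-$k$ case, the whole procedure can be sketched in a few lines of sympy (my own illustration, not part of the original answer): substitute $\vec{r}(t)=\vec{r}_0+t\vec{d}$ into $F(x,y,z)=x^k+y^k+z^k-1$ and count the distinct real roots of the resulting polynomial in $t$, which sympy does with Sturm-sequence-based root counting.

```python
# Sketch, assuming an integer exponent k: count real intersections of a
# line with the surface x^k + y^k + z^k = 1 via Sturm-based root counting.
import sympy as sp

t = sp.symbols('t', real=True)

def count_intersections(r0, d, k):
    """Number of distinct real t with F(r0 + t*d) = 0; r0, d are 3-tuples."""
    x, y, z = (r0[i] + t * d[i] for i in range(3))
    poly = sp.Poly(sp.expand(x**k + y**k + z**k - 1), t)
    # count_roots() counts all distinct real roots (Sturm-type counting)
    return poly.count_roots()

# The line through the origin along (1, 1, 1) meets the sphere
# x^2 + y^2 + z^2 = 1 at two points.
print(count_intersections((0, 0, 0), (1, 1, 1), 2))  # 2
```

A tangent line gives a multiple root of the polynomial but is counted as a single intersection point here, since Sturm counting ignores multiplicity.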
EDIT: If your function involves fractional exponents, it should still be possible to convert the equation to one involving a polynomial. For example, consider the equation $$ \sqrt{x^2+1} - (x^4+1)^{1/4} - x + 2 = 0 $$ Write this as $f(\alpha,\beta,x) = \alpha - \beta - x + 2 = 0$ where $\alpha^2 - (x^2+1) = 0$ and $\beta^4 - (x^4+1) = 0$. First take the resultant of $f(\alpha,\beta,x)$ and $\alpha^2 - (x^2+1)$ with respect to $\alpha$, then the resultant of that and $\beta^4 - (x^4+1)$ with respect to $\beta$, and we get the polynomial $-15 x^8 + 64 x^7 - 112 x^6 + 112 x^5 + 160 x^4 - 704 x^3 + 752 x^2 - 320 x$. Sturm tells us this has four real roots. However, we must then look at each root (approximately) to check whether the corresponding $\alpha$ and $\beta$ are the correct (i.e. positive) square root of $x^2+1$ and fourth root of $x^4+1$: it turns out that only one of the four roots works.
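The elimination above can be carried out mechanically; here is a sketch with sympy (my own illustration, not part of the original answer), including the final step of checking which real roots of the eliminated polynomial satisfy the original radical equation:

```python
# Sketch: eliminate the radicals by iterated resultants, count real roots,
# then keep only roots where alpha and beta take the positive branches.
import sympy as sp

x, a, b = sp.symbols('x alpha beta', real=True)

f = a - b - x + 2          # alpha - beta - x + 2 = 0
g1 = a**2 - (x**2 + 1)     # alpha should be +sqrt(x^2 + 1)
g2 = b**4 - (x**4 + 1)     # beta should be +(x^4 + 1)^(1/4)

r1 = sp.resultant(f, g1, a)              # eliminate alpha
r2 = sp.expand(sp.resultant(r1, g2, b))  # eliminate beta
# r2 is the degree-8 polynomial quoted above, with leading term -15*x**8

roots = sp.Poly(r2, x).real_roots()      # four real roots

# A root is genuine only if the original radical equation vanishes there.
orig = sp.sqrt(x**2 + 1) - (x**4 + 1)**sp.Rational(1, 4) - x + 2
genuine = [r for r in roots if abs(orig.subs(x, r).evalf()) < 1e-9]
print(len(genuine))  # only one of the four roots survives
```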