I am developing an algorithm that, given a set of surfaces in 3D and a topology description, describes an object. The edges of a surface are found by tracing the intersection of two surfaces and are stored as a list of points. For a surface to be considered valid, a few conditions have to be met; in particular, the origin of the local coordinate system should lie on the surface (cf. image below). Sometimes the surface is traced wrongly, and the origin ends up outside the surface. Is there some way to detect this? Had it been in 2D, I would have used the winding number, but I cannot find a generalization of the winding number that handles a curve on a surface.
The functions describing the surfaces are implicit, i.e. I do not have their equations, but I can call a function $f(x,y,z)$ which gives me the signed distance from $(x,y,z)$ to the surface. Hence, I need to do this computationally.
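(For illustration, a toy $f$ of this kind — here for a unit sphere, purely as an example, not one of my actual surfaces — would look like:)

```python
import numpy as np

def f(x, y, z):
    """Toy example: signed distance to the unit sphere.
    Negative inside, positive outside, zero on the surface."""
    return np.sqrt(x * x + y * y + z * z) - 1.0
```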
So, to clarify, the question should probably be stated as: given a polygon $P$ on a surface described by the function $f$, can I determine whether a point $a$ (also on the surface) is contained within $P$?
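(For reference, the 2D test I have in mind is along these lines — a crossing-number test, which agrees with the winding-number test for simple polygons; a minimal sketch, not my production code:)

```python
def point_in_polygon_2d(pt, poly):
    """Even-odd (crossing number) test: cast a ray in the +x direction
    from pt and count edge crossings. Odd count means pt is inside poly."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Does this edge straddle the horizontal line through pt?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside
```

It is exactly this that I cannot see how to generalize to a closed curve on a curved surface.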
The code I am working on is large and well established; hence, changing the way the geometry is described or the way the coordinate systems work is not really possible.

I think it's an ill-defined problem in general. Here are a couple of pictures to think about.
However, there are a couple of things you can try if your surfaces are “good enough”:
Start from the point $a$ and walk along the intersection curve of $f=0$ with the plane $x=0$. When you arrive at the polygon boundary, check that you arrived from the right side. (You may select a different plane, e.g. $y=0$, or take the plane passing through $a$ and two opposite points of the boundary.)
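The walk itself can be sketched as a predictor-corrector tracer (a minimal sketch, assuming `f` takes a point array and returns the signed distance; `grad_f`, the step size, and the stopping test against the boundary polyline are left to you):

```python
import numpy as np

def grad_f(f, p, h=1e-6):
    """Central-difference gradient of the signed-distance function f."""
    g = np.empty(3)
    for i in range(3):
        e = np.zeros(3); e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2.0 * h)
    return g

def trace_step(f, p, n, c, step=1e-2, iters=5):
    """One predictor-corrector step along the curve {f = 0} ∩ {n·x = c},
    where n is the (unit) normal of the cutting plane."""
    # Predictor: the curve tangent is orthogonal to both surface normals.
    t = np.cross(grad_f(f, p), n)
    t /= np.linalg.norm(t)
    q = p + step * t
    # Corrector: alternately project back onto both level sets.
    for _ in range(iters):
        g = grad_f(f, q)
        q = q - f(q) * g / np.dot(g, g)   # Newton step back onto f = 0
        q = q - (np.dot(n, q) - c) * n    # orthogonal projection onto the plane
    return q
```

Repeat `trace_step` until the segment from the previous point to the current one crosses the boundary polyline, then compare the crossing direction with the boundary orientation.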
If you know that your surface doesn't have quirks smaller than $d$, then you can use marching cubes (with a cell size below $d$) to build a mesh and then check whether your point belongs to a certain face of the mesh.
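Once you have a triangle mesh (e.g. from `skimage.measure.marching_cubes` applied to samples of $f$ — the mesh construction itself is omitted here), the per-face membership test can be a barycentric check along these lines (a sketch under those assumptions):

```python
import numpy as np

def point_in_face(p, v0, v1, v2, tol=1e-9):
    """Check whether point p lies in triangle (v0, v1, v2) by projecting p
    onto the triangle's plane and testing its barycentric coordinates."""
    u, v = v1 - v0, v2 - v0
    n = np.cross(u, v)
    n /= np.linalg.norm(n)
    # Project p onto the triangle's plane.
    w = (p - v0) - np.dot(p - v0, n) * n
    # Solve w = s*u + t*v for the barycentric coordinates (s, t).
    A = np.array([[np.dot(u, u), np.dot(u, v)],
                  [np.dot(u, v), np.dot(v, v)]])
    b = np.array([np.dot(w, u), np.dot(w, v)])
    s, t = np.linalg.solve(A, b)
    return s >= -tol and t >= -tol and s + t <= 1.0 + tol
```

Run this against the faces that were marked as belonging to the polygon's patch; if the point's nearest face is among them, the origin is inside.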