I'm currently using the following function to check whether or not a point lies within a triangle:
bool calc_barycentric(const point& A, const point& B, const point& C, const point& P)
{
    point v0 = B-A, v1 = C-A, v2 = P-A;
    double d00 = v0*v0;
    double d01 = v0*v1;
    double d11 = v1*v1;
    double d20 = v2*v0;
    double d21 = v2*v1;
    double denom = d00*d11 - d01*d01;
    // compute barycentric coordinates v (weight of B) and w (weight of C)
    double v = (d11 * d20 - d01 * d21) / denom;
    double w = (d00 * d21 - d01 * d20) / denom;
    // P is inside (or on an edge) iff v, w and u = 1 - v - w are all non-negative
    return v >= 0. && w >= 0. && v + w <= 1.;
}
The point class has an X and Y location defined, and the * operator computes the dot product. This function works fine for most of my use cases where the X and Y coordinates of A, B, and C are greater than 0, but it fails when any of the X or Y coordinates are less than 0.
I've tried a few different variations of performing this calculation, including finding alpha, beta, and gamma and checking if those are between 0 and 1, but that also failed.
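Roughly, that variation looked like the following sketch (simplified from what I actually ran, and again using x/y for the point members): alpha, beta, and gamma are the signed sub-triangle areas divided by the whole triangle's area, and the point is inside when all three are non-negative.

bool calc_areas(const point& A, const point& B, const point& C, const point& P)
{
    // twice the signed area of the whole triangle (assumed non-degenerate)
    double area  = (B.x - A.x) * (C.y - A.y) - (C.x - A.x) * (B.y - A.y);
    // barycentric weights from the sub-triangles opposite A and B
    double alpha = ((B.x - P.x) * (C.y - P.y) - (C.x - P.x) * (B.y - P.y)) / area;
    double beta  = ((C.x - P.x) * (A.y - P.y) - (A.x - P.x) * (C.y - P.y)) / area;
    double gamma = 1. - alpha - beta;
    return alpha >= 0. && beta >= 0. && gamma >= 0.;
}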
How could I modify this function to work with negative coordinates? Or is that not possible when using barycentric coordinates in this way?
This looks like an XY problem to me. You're asking about where you think the problem lies, but there's likely some other problem in your code or in your tests.
Assuming that
point
is appropriately defined, this function should work regardless of whether the coordinates are negative. You can see this by subtracting the minimal coordinate from all coordinates. Then the coordinates are all non-negative, and the differences computed at the beginning of the function are the same.Perhaps more could be said if you show us your test cases (how you called the function, with which negative coordinates, and what the result was).
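For what it's worth, here is a minimal, self-contained check. It assumes a bare-bones point with x/y members, operator- and a dot-product operator*, as described in the question, and copies calc_barycentric verbatim. It tests a triangle that straddles both axes, then the same triangle translated into the positive quadrant; both sets of calls agree, which is what I'd expect from the translation-invariance argument above:

#include <cassert>

struct point
{
    double x, y;
    point operator-(const point& rhs) const { return {x - rhs.x, y - rhs.y}; }
    double operator*(const point& rhs) const { return x * rhs.x + y * rhs.y; } // dot product
};

// exactly the function from the question
bool calc_barycentric(const point& A, const point& B, const point& C, const point& P)
{
    point v0 = B-A, v1 = C-A, v2 = P-A;
    double d00 = v0*v0, d01 = v0*v1, d11 = v1*v1;
    double d20 = v2*v0, d21 = v2*v1;
    double denom = d00*d11 - d01*d01;
    double v = (d11 * d20 - d01 * d21) / denom;
    double w = (d00 * d21 - d01 * d20) / denom;
    return v >= 0. && w >= 0. && v + w <= 1.;
}

int main()
{
    // triangle with negative coordinates
    point A{-3., -1.}, B{2., -4.}, C{-1., 5.};
    point in{-1., 0.}, out{4., 4.};

    assert(calc_barycentric(A, B, C, in));
    assert(!calc_barycentric(A, B, C, out));

    // translate everything into the positive quadrant; only differences are
    // used inside the function, so the results must not change
    point t{10., 10.};
    auto shift = [&](const point& p) { return point{p.x + t.x, p.y + t.y}; };
    assert(calc_barycentric(shift(A), shift(B), shift(C), shift(in)));
    assert(!calc_barycentric(shift(A), shift(B), shift(C), shift(out)));
}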