I am currently implementing the HH (Havel-Herout) algorithm to detect the intersection of a limited ray and a triangle.
My problem is that my implementation does not seem to work.
My understanding of this algorithm:
Triangles are defined with three planes:
$ \overrightarrow{n_{0}} = \overrightarrow{AB} \times \overrightarrow{AC},\ d_{0} = -A \cdot{} \overrightarrow{n_{0}} $
$ \overrightarrow{n_{1}} = \frac{\overrightarrow{AC} \times \overrightarrow{n_{0}}}{|\overrightarrow{n_{0}}|^{2}},\ d_{1} = -A \cdot{} \overrightarrow{n_{1}} $
$ \overrightarrow{n_{2}} = \frac{\overrightarrow{n_{0}} \times \overrightarrow{AB}}{|\overrightarrow{n_{0}}|^{2}},\ d_{2} = -A \cdot{} \overrightarrow{n_{2}} $
The ray has this definition:
$ P(t) = \overrightarrow{o} + t \cdot{} \overrightarrow{d} $
The algorithm uses additional variables:
$ det = \overrightarrow{d} \cdot{} \overrightarrow{n_{0}} $
$ t' = d_{0} - (\overrightarrow{o} \cdot{} \overrightarrow{n_{0}}) $
$ P(t)' = det \cdot{} \overrightarrow{o} + t' \cdot{} \overrightarrow{d} $
$ u' = P(t)' \cdot{} \overrightarrow{n_{1}} + det \cdot{} d_{1} $
$ v' = P(t)' \cdot{} \overrightarrow{n_{2}} + det \cdot{} d_{2} $
$ \begin{pmatrix}t \\ u \\ v \end{pmatrix} = \frac{1}{det} \cdot{} \begin{pmatrix}t' \\ u' \\ v' \end{pmatrix} $
An intersection is detected when these conditions hold:
$ sign(t') = sign(det \cdot{} t_{max} - t') $
$ sign(u') = sign(det - u') $
$ sign(v') = sign(det - u' - v') $
which are equivalent to:
$ 0 \le t \le t_{max} $
$ u \ge 0 $
$ v \ge 0 $
$ (u + v) \le 1 $
My problem: I must be misunderstanding something.
Let's calculate the planes for the following triangle:
$ A(2, 1, 1),\ B(1, 2, 2),\ C(3, 2, 1) $
This results in these values for $ n_{0} $ and $ d_{0} $:
$ n_{0} = \begin{pmatrix}-1 \\ 1 \\ -2\end{pmatrix},\ d_{0} = 3 $
And this is my ray that is aiming to hit point B:
$ \overrightarrow{o} = \begin{pmatrix}1 \\ 4 \\ 2\end{pmatrix},\ \overrightarrow{d} = \begin{pmatrix}0 \\ -1 \\ 0\end{pmatrix},\ t_{max} = 10 $
But these values fail the first condition:
$ det = -1,\ t' = 4 $
$ sign(t') = sign(det \cdot{} t_{max} - t') $
$ sign(4) \not= sign(-14) $
If I calculate all of the values, the other conditions fail as well. What am I misunderstanding?
I think you are making two mistakes. First, if you compute by hand, you'll see that the intersection occurs at the barycentric coordinates $(1, 0)$, which is exactly at one of the corners of the triangle. Fast ray-triangle intersection algorithms have numerical issues and may give incorrect results in such (literal) edge cases.
Second, you have switched the sign of the $d_0$ parameter: it should be $d_0 = A \cdot{} \overrightarrow{n_{0}}$ (no minus), while $d_1$ and $d_2$ do keep their minus signs. With the triangle $(2.5, 1, 1)$, $(1, 2, 2)$, $(3, 2.1, 1)$ and the ray $(1, 1, 1)$, $(3, 2, 2)$ the calculations work out as follows:
$$ n_0 = (v_1 - v_0) \times (v_2 - v_0) = (-1.1, 0.5, -2.15) $$
$$ det = n_0 \cdot d = (-1.1, 0.5, -2.15) \cdot (3, 2, 2) = -6.6 $$
$$ dett = d_0 - o \cdot n_0 = -4.4 - (1, 1, 1) \cdot (-1.1, 0.5, -2.15) = -1.65 $$
$$ r = det \cdot o + dett \cdot d = -6.6 \cdot (1, 1, 1) + (-1.65) \cdot (3, 2, 2) = (-11.55, -9.9, -9.9) $$
$$ x = r \cdot n_1 + det \cdot d_1 = (-11.55, -9.9, -9.9) \cdot (-0.39, 0.18, 0.24) + (-6.6) \cdot 0.56 = -3.3 $$
$$ y = r \cdot n_2 + det \cdot d_2 = -0.00000057 $$
$$ det_0 = det - x - y = -3.3 $$
$x$, $y$, and $det_0$ all have the same sign, so the ray intersects the triangle.