Suppose we have a matrix $H$ and we want to find out whether there exists a vector $r\ne 0$ such that $H^{ij}r_i r_j=0$. To this end we can simply check whether the determinant $\det(H)$ is zero, which is computationally inexpensive and explicit (we don't have to find such an $r$ explicitly).
Is there an analogous test to decide whether there exists a vector $r\ne0$ such that $G^{ijk}r_i r_j r_k=0$ for a $3$-tensor $G$?
I didn't specify the dimension of the vector space, but in case it's relevant, $r\in \mathbb R^3$.
This question could be related to Determinant of a tensor.
There's always such a vector. To see this, let $$ f(r) = G^{ijk}r_i r_j r_k, \qquad r \neq 0 . $$ If $f$ is identically zero, any $r \neq 0$ will do. Otherwise, $f$ attains both positive and negative values, since $f$ is homogeneous of odd degree and hence $f(-r)=-f(r)$. Then, by continuity of $f$ and connectedness of the space $\mathbb{R}^3\setminus \{ 0 \}$ (which holds because the dimension is at least $2$), we see that $f$ must also take the value zero for some $r \neq 0$.
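The intermediate-value argument above can also be checked numerically. A minimal sketch, assuming a randomly generated tensor $G$ (the seed and starting vectors below are illustrative choices, not part of the argument): since $f(-r_0)=-f(r_0)$, the restriction of $f$ to a great circle on the unit sphere from $r_0$ to $-r_0$ changes sign, so bisection locates a unit vector $r$ with $f(r)\approx 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.standard_normal((3, 3, 3))  # an arbitrary 3-tensor on R^3

def f(r):
    # f(r) = G^{ijk} r_i r_j r_k
    return np.einsum("ijk,i,j,k->", G, r, r, r)

# Walk along a great circle on the unit sphere from r0 to -r0.
r0 = np.array([1.0, 0.0, 0.0])          # assume f(r0) != 0 (generic for random G)
u = np.array([0.0, 1.0, 0.0])           # unit vector orthogonal to r0
path = lambda t: np.cos(t) * r0 + np.sin(t) * u  # path(0) = r0, path(pi) = -r0

# f(path(0)) = f(r0) and f(path(pi)) = f(-r0) = -f(r0) have opposite signs,
# so bisection on [0, pi] finds a sign change of the continuous function f.
a, b = 0.0, np.pi
fa = f(path(a))
for _ in range(60):
    m = 0.5 * (a + b)
    fm = f(path(m))
    if fa * fm <= 0:
        b = m
    else:
        a, fa = m, fm

r = path(0.5 * (a + b))
print(r, f(r))  # r is a unit vector (so r != 0) with f(r) close to zero
```

Note that the bisection never leaves the unit sphere, so the vector it returns is automatically nonzero.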