Let $A$ be an arbitrary $3\times3$ complex matrix. We want to find the set $S:=\{\vec{v}\in\mathbb C^3: \det(A^2\vec{v},A\vec{v},\vec{v})=0\}$. Put another way, $S$ is the set of all $\vec{v}\in\mathbb C^3$ for which the vectors $\vec{v}, A\vec{v}, A^2\vec{v}$ are linearly dependent, i.e. for which there exists a nonzero polynomial $Q$ of degree at most $2$ such that $Q(A)\vec{v}=\vec{0}$.
If $A$ is diagonalizable, finding $S$ is relatively straightforward, since any $\vec{v}\in\mathbb C^3$ can be expanded in the eigenvectors of $A$: $$\vec{v}=\sum_i c_i\vec{w}_i\quad\text{where}\quad A\vec{w}_i=\lambda_i\vec{w}_i$$ Then: $$Q(A)\vec{v}= \sum_i c_iQ(\lambda_i)\vec{w}_i$$ and since the eigenvectors are linearly independent, this vanishes iff $$\forall i: c_i=0\vee Q(\lambda_i)=0$$ Now we can choose the (at most two) roots of $Q$ to coincide with eigenvalues of $A$. Thus, if $A$ has a repeated eigenvalue, it has at most two distinct eigenvalues, which $Q$ can annihilate simultaneously, so $S=\mathbb C^3$; whereas if $A$ has three distinct eigenvalues, $Q$ can annihilate at most two of them, so $S=\bigcup_i\mathrm{Span}\{\vec{w}_j: j\neq i\}$.
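As a quick numeric sanity check of the diagonalizable case (the diagonal matrices below are my own arbitrary examples), one can evaluate the determinant test directly with exact integer arithmetic:

```python
# Sanity check of the diagonalizable case with hand-picked diagonal matrices.
# Membership in S is detected via det(A^2 v, A v, v) = 0.

def matvec(A, v):
    """Multiply a 3x3 matrix (list of rows) by a vector."""
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

def det3(c1, c2, c3):
    """Determinant of the 3x3 matrix with columns c1, c2, c3."""
    return (c1[0] * (c2[1] * c3[2] - c3[1] * c2[2])
          - c2[0] * (c1[1] * c3[2] - c3[1] * c1[2])
          + c3[0] * (c1[1] * c2[2] - c2[1] * c1[2]))

def s_det(A, v):
    """det(A^2 v, A v, v): vanishes exactly when v is in S."""
    Av = matvec(A, v)
    return det3(matvec(A, Av), Av, v)

repeated = [[2, 0, 0], [0, 2, 0], [0, 0, 5]]   # eigenvalues 2, 2, 5
distinct = [[1, 0, 0], [0, 2, 0], [0, 0, 3]]   # eigenvalues 1, 2, 3

print(s_det(repeated, [1, 2, 3]))   # repeated eigenvalue: 0 for every v
print(s_det(distinct, [1, 1, 1]))   # all three c_i nonzero: nonzero
print(s_det(distinct, [1, 1, 0]))   # c_3 = 0, v in span{w_1, w_2}: 0
```

The three prints match the claimed description of $S$: everything vanishes for the repeated eigenvalue, and only vectors supported on at most two eigenvectors vanish in the distinct case.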
Since a repeated eigenvalue is a necessary condition for a matrix to be non-diagonalizable, it would be neat if $S=\mathbb C^3$ for all non-diagonalizable matrices. However, the above proof breaks down, since the eigenvectors of a non-diagonalizable matrix no longer span $\mathbb C^3$. Does anyone have suggestions as to how to tackle this more general case?
You can do something similar when the matrix is not diagonalizable. Consider $A \in \mathbb{C}^{n\times n}$ and $S = \{v \in \mathbb{C}^n : \exists\, Q \in \mathbb{C}[x],\ Q \ne 0,\ \deg Q \le n-1 \text{ such that } Q(A)v = 0\}$.
Let $m_A$ be the minimal polynomial of $A$. If $\deg m_A \le n-1$, then $m_A$ itself is a nonzero polynomial of degree at most $n-1$ with $m_A(A)v = 0$ for every $v$, so $S = \mathbb{C}^n$. $\DeclareMathOperator{\pl}{\dot+\,}$
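For instance (my own toy example), take $A = \operatorname{diag}(2,2,5)$ in $\mathbb{C}^{3\times3}$: here $m_A(x) = (x-2)(x-5) = x^2 - 7x + 10$ has degree $2 = n-1$, so $Q = m_A$ annihilates every vector:

```python
# Degenerate case deg m_A <= n-1: A = diag(2,2,5) has m_A(x) = (x-2)(x-5),
# so Q(A) = A^2 - 7A + 10I is the zero matrix and Q kills every vector.

def matmul(A, B):
    """Multiply two 3x3 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A = [[2, 0, 0], [0, 2, 0], [0, 0, 5]]
A2 = matmul(A, A)
I = [[1 if i == j else 0 for j in range(3)] for i in range(3)]

# Q(A) = A^2 - 7A + 10I, evaluated entrywise.
QA = [[A2[i][j] - 7 * A[i][j] + 10 * I[i][j] for j in range(3)]
      for i in range(3)]
print(QA)  # [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```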
If $\deg m_A = n$, then let $m_A(x) = (x-\lambda_1)^{p_1}\cdots (x-\lambda_s)^{p_s}$ and we have the generalized eigenspace decomposition
$$\mathbb{C}^n = \ker(A-\lambda_1I)^{p_1}\pl \cdots \pl \ker(A-\lambda_sI)^{p_s}$$
with $p_1+\cdots+p_s = n$ (since $\deg m_A = n$, the minimal and characteristic polynomials coincide).
We claim that $$S = \mathbb{C}^n \setminus \left\{\sum_{i=1}^s c_iw_i : c_i \ne 0, w_i \in \ker (A - \lambda_i I)^{p_i} \setminus \ker (A - \lambda_i I)^{p_i-1}, 1 \le i \le s\right\}$$
Indeed, suppose $v = \sum_{i=1}^s c_iw_i$ where, for some $j$, either $c_j = 0$ or $w_j \in \ker (A - \lambda_j I)^{p_j-1}$. Then the polynomial $$Q(x) = \frac{m_A(x)}{x-\lambda_j}=(x-\lambda_1)^{p_1}\cdots(x-\lambda_{j-1})^{p_{j-1}}(x-\lambda_j)^{p_j-1}(x-\lambda_{j+1})^{p_{j+1}}\cdots (x-\lambda_s)^{p_s}$$ has degree $n-1$ and satisfies $Q(A)v = 0$, so $v \in S$.
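To illustrate this direction on a concrete non-diagonalizable matrix (my own example): the matrix $A$ below has $m_A(x) = (x-2)^2(x-5)$, and for a vector $v$ in the generalized eigenspace of $\lambda_1 = 2$ (so $c_2 = 0$), the polynomial $Q(x) = m_A(x)/(x-5) = (x-2)^2$ of degree $n-1 = 2$ already annihilates $v$:

```python
# Forward direction on a non-diagonalizable example:
# A has m_A(x) = (x-2)^2 (x-5); v lies in ker(A - 2I)^2 (so c_2 = 0),
# and Q(x) = m_A(x)/(x-5) = (x-2)^2 satisfies Q(A)v = 0.

def matvec(A, v):
    """Multiply a 3x3 matrix (list of rows) by a vector."""
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

A = [[2, 1, 0], [0, 2, 0], [0, 0, 5]]  # Jordan block for 2, plus eigenvalue 5
v = [1, 1, 0]                          # generalized eigenvector for lambda = 2

# Apply (A - 2I) twice to v.
shifted = [[A[i][j] - 2 * (i == j) for j in range(3)] for i in range(3)]
Qv = matvec(shifted, matvec(shifted, v))
print(Qv)  # [0, 0, 0]
```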
Conversely, let $v = \sum_{i=1}^s c_iw_i$ be such that $c_i \ne 0$ and $w_i \in \ker (A - \lambda_i I)^{p_i} \setminus \ker (A - \lambda_i I)^{p_i-1}$ for all $1 \le i \le s$. Assume $Q(A)v = 0$ for some nonzero $Q \in \mathbb{C}[x]$.
$$0 = Q(A)v = \sum_{i=1}^s c_iQ(A)w_i \implies c_iQ(A)w_i = 0, \quad 1\le i \le s$$
since each generalized eigenspace is $A$-invariant (so $Q(A)w_i \in \ker(A-\lambda_iI)^{p_i}$) and their sum is direct. Because $c_i \ne 0$, we conclude $Q(A)w_i = 0$ for all $1\le i \le s$.
Now, it is known that $(A - \mu I)|_{\ker (A - \lambda_i I)^{p_i}}$ is invertible for $\mu \ne \lambda_i$, so if we write $Q(x) = c(x-\mu_1)\cdots(x-\mu_k)(x-\lambda_i)^{m_i}$ with $c \ne 0$ and $\mu_1, \ldots, \mu_k \ne \lambda_i$, we have
$$0 = Q(A)w_i = c(A-\mu_1 I)\cdots(A-\mu_k I)\underbrace{(A-\lambda_i I)^{m_i}w_i}_{\in \ker (A - \lambda_i I)^{p_i}} \implies (A-\lambda_i I)^{m_i}w_i = 0\implies m_i \ge p_i$$
since $w_i \notin \ker (A - \lambda_i I)^{p_i-1}$,
for all $1 \le i \le s$. Hence $m_A(x) = (x-\lambda_1)^{p_1}\cdots (x-\lambda_s)^{p_s}$ divides $Q$, i.e. $Q(x) = m_A(x)g(x)$ for some nonzero $g \in \mathbb{C}[x]$, so $\deg Q \ge n$. Therefore $v \notin S$.
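As a numeric check of the claim for $n = 3$ (a single Jordan block, my own example): for the matrix $J$ below, $s = 1$ and $p_1 = 3$, so the claim says $S = \ker(J-2I)^2 = \operatorname{span}\{e_1, e_2\}$, and the OP's determinant test agrees:

```python
# Check the claim on a single 3x3 Jordan block with eigenvalue 2:
# here s = 1, p_1 = 3, so S should be ker(J - 2I)^2 = span{e1, e2}.

def matvec(A, v):
    """Multiply a 3x3 matrix (list of rows) by a vector."""
    return [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

def det3(c1, c2, c3):
    """Determinant of the 3x3 matrix with columns c1, c2, c3."""
    return (c1[0] * (c2[1] * c3[2] - c3[1] * c2[2])
          - c2[0] * (c1[1] * c3[2] - c3[1] * c1[2])
          + c3[0] * (c1[1] * c2[2] - c2[1] * c1[2]))

def s_det(A, v):
    """det(A^2 v, A v, v): vanishes exactly when v is in S."""
    Av = matvec(A, v)
    return det3(matvec(A, Av), Av, v)

J = [[2, 1, 0], [0, 2, 1], [0, 0, 2]]  # single Jordan block

print(s_det(J, [1, 1, 0]))  # in ker(J - 2I)^2: 0
print(s_det(J, [0, 0, 1]))  # third component nonzero, not in S: nonzero
```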