I have what I think might be a solution, but I'm not sure it's formal enough. I begin by noting that $u,v,w$ are linearly dependent if and only if they lie in a common plane through the origin. Then I construct the following chain of equivalences:
\begin{align*} u\times v+v\times w+w\times u=\textbf{0}&\Longleftrightarrow u\times v+v\times w=-w\times u\\&\Longleftrightarrow v\times (w-u)=u\times w \end{align*}
Then my reasoning is that since $w-u$ obviously lies in the plane spanned by $u$ and $w$, for $v\times (w-u)$ to produce the same normal vector as $u\times w$, $v$ must lie in that same plane. I think the reasoning is correct, but how do I formalize this last part using mathematical notation? Is it necessary to do so?
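As a numerical sanity check (not a formalization), the algebraic identity underlying your chain of equivalences, $u\times v+v\times w+w\times u=v\times(w-u)-u\times w$, can be verified for random vectors with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))  # three random vectors in R^3

# The chain of equivalences rests on the rearrangement
# u×v + v×w + w×u = v×(w−u) − u×w; check it numerically.
lhs = np.cross(u, v) + np.cross(v, w) + np.cross(w, u)
rhs = np.cross(v, w - u) - np.cross(u, w)
assert np.allclose(lhs, rhs)
```

Of course, this only spot-checks the identity; it says nothing about the geometric step you are asking how to formalize.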
I assume that we are working in $\mathbb{R}^3$. First, you have to establish that $u$, $v$, and $w$ are linearly independent if and only if $u\cdot(v\times w)\neq 0$. Now, if $u$, $v$, and $w$ are linearly independent, then $$ u\cdot\big(v\times w+w\times u+u\times v\big)=u\cdot (v\times w) \neq 0\,,$$ since $u\cdot(w\times u)=u\cdot(u\times v)=0$, whence $v\times w+w\times u+u\times v\neq 0$. This shows that, if $v\times w+w\times u+u\times v=0$, then $u$, $v$, and $w$ are linearly dependent.
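A quick numerical illustration of this step: dotting the sum with $u$ kills the two terms containing $u$ and leaves exactly the scalar triple product.

```python
import numpy as np

rng = np.random.default_rng(1)
u, v, w = rng.standard_normal((3, 3))  # generic vectors, independent a.s.

triple = np.dot(u, np.cross(v, w))     # scalar triple product u·(v×w)
s = np.cross(v, w) + np.cross(w, u) + np.cross(u, v)

# u·(w×u) = u·(u×v) = 0, so u·s collapses to the triple product.
assert np.isclose(np.dot(u, s), triple)
assert abs(triple) > 1e-8  # random vectors are independent, so s ≠ 0
```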
Conversely, if $u$, $v$, and $w$ are linearly dependent, then there are three scenarios. First, $u=v=w=0$, which obviously means $v\times w+w\times u+u\times v=0$. Second, $u$, $v$, and $w$ are scalar multiples of a single vector $x\neq 0$; then, again, we clearly have $v\times w+w\times u+u\times v=0$. Finally, suppose that $u$, $v$, and $w$ span a $2$-dimensional subspace of $\mathbb{R}^3$. Then we may assume without loss of generality that $u$ and $v$ are linearly independent and $w=au+bv$ for some $a,b\in\mathbb{R}$. In that case, $$v\times w+w\times u+u\times v=-a\,(u\times v)-b\,(u\times v)+(u\times v)=(1-a-b)\,(u\times v)\,,$$ which is nonzero whenever $a+b\neq 1$, so the full converse fails. Therefore, we only have a partial converse, namely: if $$v\times w+w\times u+u\times v\neq 0\,,$$ then the subspace of $\mathbb{R}^3$ spanned by $u$, $v$, and $w$ is at least $2$-dimensional.
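A concrete counterexample to the full converse, checked numerically with the formula $(1-a-b)\,(u\times v)$ derived above (here $a=2$, $b=3$ are just illustrative choices):

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = 2.0 * u + 3.0 * v          # dependent, with a + b = 5 ≠ 1

s = np.cross(v, w) + np.cross(w, u) + np.cross(u, v)
# Matches (1 − a − b)(u×v) = −4 · (0, 0, 1)
assert np.allclose(s, (1 - 2 - 3) * np.cross(u, v))
assert np.linalg.norm(s) > 0   # dependence alone does not force s = 0
```

So the vanishing of the sum implies dependence, but dependent triples need not make the sum vanish.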