Suppose that $u,v,w,t$ are vectors in space such that $u\times v = w \times t$ and $u \times w = u \times t$. Show that $u-t$ and $v-w$ form a pair of linearly dependent vectors.
I'm supposed to calculate $(u-t) \times (v-w)$ and use the fact that $x \times y = 0$ iff $x$ and $y$ are linearly dependent, but I'm stuck:
$$\begin{aligned}(u-t) \times (v-w) &= u \times v - u \times w - t \times v + t \times w \\ &= -t \times w - u \times t - t \times v + t \times w \\ &= t \times u - t \times v \\ &= t \times (u - v).\end{aligned}$$
The statement, as currently written, is not true. Take
$$\begin{align*} u &= (1, 0, 0) \\ w &= (1, 1, 0) \\ t &= (0, 1, 0) \\ v &= (-1, 1, 0). \end{align*}$$
It follows that
$$u\times w = u\times t = u\times v = w\times t = (0, 0, 1),$$
but
$$\begin{align*} u - t &= (1, -1, 0) \\ v - w &= (-2, 0, 0), \end{align*}$$
which are not linearly dependent.
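As a quick numerical sanity check of this counterexample (using `numpy`, not part of the original answer), we can verify that all four cross products coincide while $(u-t) \times (v-w)$ is nonzero:

```python
import numpy as np

u = np.array([1, 0, 0])
w = np.array([1, 1, 0])
t = np.array([0, 1, 0])
v = np.array([-1, 1, 0])

# All four cross products from the counterexample equal (0, 0, 1).
for a, b in [(u, w), (u, t), (u, v), (w, t)]:
    assert np.array_equal(np.cross(a, b), [0, 0, 1])

# (u - t) x (v - w) is nonzero, so u - t and v - w are linearly independent.
print(np.cross(u - t, v - w))  # nonzero vector
```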
However, if we change the second equality in the statement to $u \times w = v \times t$, then the statement is true, and your calculation goes through with only a small modification: using $u \times v = w \times t$ and $-t \times v = v \times t = u \times w$,
$$(u-t) \times (v-w) = u \times v - u \times w - t \times v + t \times w = w \times t - v \times t + v \times t + t \times w = w \times t + t \times w = 0,$$
so $u - t$ and $v - w$ are linearly dependent.
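To illustrate the corrected statement numerically (my own example, not from the answer above): choosing $w = u$ and $t = v$ satisfies both hypotheses $u \times v = w \times t$ and $u \times w = v \times t$ for arbitrary $u, v$, and the conclusion then holds because $(u-t) \times (v-w) = (u-v) \times (v-u) = 0$. This is only a sanity check on one family of instances, not a proof:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v = rng.random(3), rng.random(3)

# w = u, t = v satisfies both hypotheses of the corrected statement:
# u x v = w x t, and u x w = v x t (both sides of the latter are zero).
w, t = u, v

assert np.allclose(np.cross(u, v), np.cross(w, t))
assert np.allclose(np.cross(u, w), np.cross(v, t))

# (u - t) x (v - w) = (u - v) x (v - u) = 0: the differences are dependent.
print(np.cross(u - t, v - w))  # zero vector
```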