Why does $e_1 \cdot (e_2 \wedge e_3) = 0$?


I am a little embarrassed to admit that this problem from Alan Macdonald's *Linear and Geometric Algebra* has me stumped. Problem 5.3.5 on page 83 asks:

Show that $e_1 \cdot (e_2 \wedge e_3)=0$.

I tried applying the equation:

$$u \cdot v = \frac{1}{2} (uv + vu)$$

This gave me:

$$\begin{aligned}
e_1 \cdot (e_2 \wedge e_3) &= e_1 \cdot (e_2 e_3) \\
&= \tfrac{1}{2} \left( e_1 (e_2 e_3) + (e_2 e_3) e_1 \right) \\
&= \tfrac{1}{2} (e_1 e_2 e_3 + e_2 e_3 e_1) \\
&= \tfrac{1}{2} (e_1 e_2 e_3 - e_2 e_1 e_3) \\
&= \tfrac{1}{2} (e_1 e_2 e_3 + e_1 e_2 e_3) \\
&= e_1 e_2 e_3
\end{aligned}$$

This is not the answer I expected, either intuitively or based on the problem statement. I must be making an incorrect assumption, but I can't figure out what it is.
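To make sure the algebra itself is right, I also ran a small numerical check: a Python sketch of the geometric product over an orthonormal basis of $\mathbb{R}^3$. The representation and the helper names (`simplify`, `gp`, `combine`) are my own invention for this check, not anything from the book.

```python
from itertools import product

# A minimal sketch of the geometric algebra of R^3 over an orthonormal
# basis, just to double-check the hand computation above.  A multivector
# is a dict mapping a sorted tuple of basis indices to a coefficient:
# {(): 1} is the scalar 1, {(1,): 1} is e1, {(2, 3): 1} is e2 e3.

def simplify(indices):
    """Sort basis indices, flipping sign per swap (e_i e_j = -e_j e_i for
    i != j) and cancelling adjacent repeats (e_i e_i = 1)."""
    idx, sign = list(indices), 1
    changed = True
    while changed:                       # bubble sort, counting transpositions
        changed = False
        for i in range(len(idx) - 1):
            if idx[i] > idx[i + 1]:
                idx[i], idx[i + 1] = idx[i + 1], idx[i]
                sign, changed = -sign, True
    out = []
    for k in idx:                        # e_i e_i = 1, so drop repeated pairs
        if out and out[-1] == k:
            out.pop()
        else:
            out.append(k)
    return sign, tuple(out)

def gp(a, b):
    """Geometric product of two multivectors."""
    result = {}
    for (ia, ca), (ib, cb) in product(a.items(), b.items()):
        sign, blade = simplify(ia + ib)
        result[blade] = result.get(blade, 0) + sign * ca * cb
    return {k: v for k, v in result.items() if v}

def combine(a, b, s=1):
    """a + s*b, dropping zero terms."""
    result = dict(a)
    for k, v in b.items():
        result[k] = result.get(k, 0) + s * v
    return {k: v for k, v in result.items() if v}

e1  = {(1,): 1}
e23 = {(2, 3): 1}          # e2 wedge e3 equals e2 e3 for orthogonal vectors

# The symmetric combination (1/2)(u B + B u) that I applied above:
sym = {k: v / 2 for k, v in combine(gp(e1, e23), gp(e23, e1)).items()}
print(sym)                 # {(1, 2, 3): 1.0}, i.e. the trivector e1 e2 e3
```

So the symmetric combination really does produce the trivector $e_1 e_2 e_3$, which makes me suspect the problem is with the formula I chose rather than with the arithmetic.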