I need help proving the following:
If $u \cdot (v \times w) = 0$ but $v \times w \neq 0$, show that there are constants $\lambda$ and $\mu$ such that $u = \lambda v + \mu w$.
This is for a college multivariable calculus class. I have tried expanding the expressions, but it hasn't gotten me anywhere. Thank you.
Because there is a cross product $\times$, you must be working with vectors in $\mathbb R^3$. Let $\mathbf u = (u_1, u_2, u_3)$, $\mathbf v = (v_1, v_2, v_3)$ and $\mathbf w = (w_1, w_2, w_3)$. Then
$$ \left| \begin{array}{ccc} u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \\ w_1 & w_2 & w_3 \\ \end{array} \right| = \mathbf u \cdot (\mathbf v \times \mathbf w) = 0 $$
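(To see why this identity holds, expand the determinant along its first row; this is the standard cofactor expansion, written out here only to connect the two expressions:)
$$ \left| \begin{array}{ccc} u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \\ w_1 & w_2 & w_3 \\ \end{array} \right| = u_1(v_2 w_3 - v_3 w_2) - u_2(v_1 w_3 - v_3 w_1) + u_3(v_1 w_2 - v_2 w_1) = \mathbf u \cdot (\mathbf v \times \mathbf w). $$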
This implies that $\mathbf u, \mathbf v$ and $\mathbf w$ are linearly dependent. Since $\mathbf v \times \mathbf w \ne \mathbf 0$, the vectors $\mathbf v$ and $\mathbf w$ are linearly independent. Hence we must have $\mathbf u = \lambda \mathbf v + \mu \mathbf w$ for some scalars $\lambda$ and $\mu$, as spelled out below.
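Here is the last step written out (a sketch of the standard argument; the symbols $a, b, c$ are just names for the coefficients of the dependence relation). Linear dependence gives a nontrivial relation
$$ a\,\mathbf u + b\,\mathbf v + c\,\mathbf w = \mathbf 0, \qquad (a, b, c) \ne (0, 0, 0). $$
If $a = 0$, then $b\,\mathbf v + c\,\mathbf w = \mathbf 0$ with $(b, c) \ne (0, 0)$, contradicting the linear independence of $\mathbf v$ and $\mathbf w$. So $a \ne 0$, and dividing through by $a$ gives
$$ \mathbf u = -\frac{b}{a}\,\mathbf v - \frac{c}{a}\,\mathbf w, \qquad \text{so } \lambda = -\frac{b}{a}, \quad \mu = -\frac{c}{a}. $$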