How can I solve for an unknown vector given an otherwise known cross product?


Given known vectors $\vec a$ and $\vec b$, is it possible to solve for $\vec u$, given the following equation?

$\vec a \times \vec u = \vec b$

So far I have found that the following must be true

$\vec u = \vec c + \lambda \hat a$

where

  • $\vec c = \frac{b}{a} (\hat b \times \hat a) = \frac{\vec b \times \vec a}{a^2}$
  • $\lambda \in \Bbb R $

but I'm unsure whether it is possible to determine $\vec u$ any further, since any vector $\vec u$ satisfying

$(\vec u \cdot \hat c)\,\hat c = \vec c$

should do the trick, which means that $\lambda$ can take any value.

However, when expanding the cross product $\vec a \times \vec u = \vec b$ by hand, one can arrive at a matrix equation of the form

$ A\,\vec u = \vec b $

where $A$ is a matrix whose entries are the components of $\vec a$, and which would seemingly be solvable by at least one method, for example

$ \vec u = A^{-1}\,\vec b $

Am I doing something wrong? Or is this equation truly not solvable using vector algebra?
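The solution family described above can be checked numerically. Below is a minimal sketch using NumPy, with hypothetical example vectors chosen so that $\vec b \perp \vec a$ (which, as the answers explain, is required for a solution to exist):

```python
import numpy as np

# Example vectors (made-up values); b is built as a cross product
# with a, so b is automatically perpendicular to a.
a = np.array([1.0, 2.0, 3.0])
b = np.cross(a, np.array([0.5, -1.0, 2.0]))

# Particular solution c = (b x a) / |a|^2, as derived in the question.
c = np.cross(b, a) / np.dot(a, a)

# Every vector c + lambda * a_hat solves a x u = b:
a_hat = a / np.linalg.norm(a)
for lam in (0.0, 1.0, -2.5):
    u = c + lam * a_hat
    assert np.allclose(np.cross(a, u), b)
print("all candidate u satisfy a x u = b")
```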


3 Answers

On BEST ANSWER

The operator $L_{\vec{a}} \colon \mathbb{R}^3 \rightarrow \mathbb{R}^3$ given by

$$ L_{\vec{a}}(\vec{u}) = \vec{a} \times \vec{u} $$

is a linear operator. There are two cases:

  1. If $\vec{a} = 0$ then $L_{\vec{a}}$ is the zero operator, and so your equation is solvable if and only if $\vec{b} = 0$, in which case any vector $\vec{u} \in \mathbb{R}^3$ is a solution of your equation.
  2. If $\vec{a} \neq 0$ then the operator $L_{\vec{a}}$ has a one-dimensional kernel (spanned by $\vec{a}$) and a two-dimensional image (namely $\operatorname{span} \{ \vec{a} \}^{\perp}$). In this case, your equation is solvable if and only if $\vec{b} \perp \vec{a}$, and then there won't be a unique solution but a one-dimensional family of solutions $\vec{u} = \vec{u}_0 + t\vec{a}$, where $\vec{u}_0$ is (any) particular solution. Indeed, we have $$ \vec{a} \times (\vec{b} \times \vec{a}) = \vec{a} \times ((b_2 a_3 - b_3 a_2) \vec{e}_1 + (b_3 a_1 - b_1 a_3) \vec{e}_2 + (b_1 a_2 - b_2 a_1) \vec{e}_3) = \\ (a_2 (b_1 a_2 - b_2 a_1) - a_3 (b_3 a_1 - b_1 a_3)) \vec{e}_1 + \\ (a_3 (b_2 a_3 - b_3 a_2) - a_1(b_1 a_2 - b_2 a_1)) \vec{e}_2 +\\ (a_1(b_3 a_1 - b_1 a_3) - a_2 (b_2 a_3 - b_3 a_2)) \vec{e}_3 = \\ (b_1(a_2^2 + a_3^2) - a_1(a_2 b_2 + a_3 b_3))\vec{e}_1 + \\ (b_2(a_1^2 + a_3^2) - a_2(a_1 b_1 + a_3 b_3))\vec{e}_2 + \\ (b_3(a_1^2 + a_2^2) - a_3(a_1 b_1 + a_2 b_2))\vec{e}_3 = \\ \vec{b} \|\vec{a}\|^2 - (\vec{a} \cdot \vec{b}) \vec{a} $$ so if $\vec{b} \perp \vec{a}$ then $\vec{u} := \frac{\vec{b} \times \vec{a}}{\| \vec{a}\|^2}$ solves the equation, and all the solutions have the form $\frac{\vec{b} \times \vec{a}}{\| \vec{a}\|^2} + t\vec{a}$ for some $t \in \mathbb{R}$.

In any case, the operator $L_{\vec{a}}$ is not invertible, so if you represent the equation $L_{\vec{a}}(\vec{u}) = \vec{b}$ as a matrix equation $A\vec{u} = \vec{b}$, the matrix $A$ won't be of full rank and you can't invert it.
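Both facts above, the rank-deficient matrix representation and the one-parameter solution family, can be illustrated numerically. A sketch in NumPy with made-up example vectors:

```python
import numpy as np

a = np.array([1.0, -2.0, 0.5])               # example a != 0
b = np.cross(a, np.array([2.0, 0.0, 1.0]))   # some b with b . a == 0

# Matrix of the linear operator L_a(u) = a x u:
A = np.array([[0, -a[2], a[1]],
              [a[2], 0, -a[0]],
              [-a[1], a[0], 0]])
assert np.linalg.matrix_rank(A) == 2   # rank 2: one-dimensional kernel

# Particular solution u0 = (b x a) / |a|^2, and the family u0 + t a:
u0 = np.cross(b, a) / np.dot(a, a)
for t in (0.0, 1.0, -3.0):
    assert np.allclose(A @ (u0 + t * a), b)
```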


We will find a vector $\vec u$ that satisfies $$\vec a \times \vec u = \vec b.$$ Assuming $\vec a,\vec b\ne \vec 0$ and $\vec b\perp \vec a$ (which is necessary for a solution to exist), we have that $$\vec u=\frac{|\vec b|}{|\vec a||\vec b\times \vec a|}\vec b\times \vec a=\frac{\vec b\times \vec a}{|\vec a|^2}.$$

If $\vec b=\vec 0$ then $\vec u=\vec a$ is a solution.

Of course, if $\vec u$ is a solution then $\vec u+\lambda \vec a$ is a solution. And there are no more solutions. Why? Because if $\vec u'$ is another solution, then $\vec a \times (\vec u' - \vec u) = \vec 0$, so $\vec u' - \vec u$ must be parallel to $\vec a$.
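This closed form can be checked with a quick NumPy sketch (example vectors only; $\vec b$ is constructed to be perpendicular to $\vec a$, and note the order $\vec b \times \vec a$, which gives $\vec a \times \vec u = +\vec b$):

```python
import numpy as np

a = np.array([2.0, 0.0, 1.0])
b = np.cross(a, np.array([1.0, 1.0, 1.0]))   # ensures b . a == 0

# Closed form u = (|b| / (|a| |b x a|)) (b x a):
u = np.linalg.norm(b) / (np.linalg.norm(a) * np.linalg.norm(np.cross(b, a))) * np.cross(b, a)
assert np.allclose(np.cross(a, u), b)

# Shifting by any multiple of a yields the other solutions:
assert np.allclose(np.cross(a, u + 3.7 * a), b)
```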


The cross product can be written as a matrix multiplication:

$$\pmatrix{b_1 \\ b_2 \\ b_3} = \vec{b} = \vec{a} \times \vec{u} = \pmatrix{a_1 \\ a_2 \\ a_3} \times \pmatrix{u_1 \\ u_2 \\ u_3} = \pmatrix{a_2u_3 - a_3u_2 \\ a_3u_1 - a_1u_3 \\ a_1u_2 - a_2u_1} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0\end{bmatrix}\pmatrix{u_1 \\ u_2 \\ u_3} $$

Now this can be solved for $\vec{u}$ like any linear system, e.g. with row reduction. WLOG assume $a_3 \ne 0$.

$$\left[ \begin{array}{ccc|c} 0 & -a_3 & a_2 & b_1 \\ a_3 & 0 & -a_1 & b_2 \\ -a_2 & a_1 & 0 & b_3 \end{array} \right] \sim \left[ \begin{array}{ccc|c} 1 & 0 & -\frac{a_1}{a_3} & \frac{b_2}{a_3} \\ 0 & 1 & -\frac{a_2}{a_3} & -\frac{b_1}{a_3}\\ 0 & 0 & 0 & \frac{a_1b_1 + a_2b_2}{a_3} + b_3 \end{array} \right] $$

We see that a solution exists if and only if $b_3 = -\frac{a_1b_1 + a_2b_2}{a_3}$, and in that case it is given by:

$$\vec{u} = \pmatrix{u_1 \\ u_2 \\ u_3} = \pmatrix{\frac{b_2}{a_3} \\ -\frac{b_1}{a_3} \\ 0} + \lambda \pmatrix{a_1 \\ a_2 \\ a_3} = \pmatrix{\frac{b_2}{a_3} \\ -\frac{b_1}{a_3} \\ 0} +\lambda \vec{a}$$ where $\lambda \in \mathbb{R}$ is a free parameter.
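The same computation can be done programmatically. A sketch using NumPy's least-squares solver on the singular system, with example vectors chosen so that $\vec a \cdot \vec b = 0$ and $a_3 \ne 0$:

```python
import numpy as np

# Example vectors satisfying the solvability condition a . b = 0:
a = np.array([1.0, 2.0, 2.0])
b = np.array([2.0, -1.0, 0.0])   # 1*2 + 2*(-1) + 2*0 = 0
assert abs(np.dot(a, b)) < 1e-12

A = np.array([[0, -a[2], a[1]],
              [a[2], 0, -a[0]],
              [-a[1], a[0], 0]])

# lstsq handles the rank-deficient system and returns the
# minimum-norm solution, which differs from the particular
# solution (b2/a3, -b1/a3, 0) by a multiple of a:
u, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(A @ u, b)

u0 = np.array([b[1] / a[2], -b[0] / a[2], 0.0])
diff = u - u0
assert np.allclose(np.cross(diff, a), np.zeros(3))  # diff is parallel to a
```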