Proving $(\bf x\times y\cdot N)\ z+(y\times z\cdot N)\ x+(z\times x \cdot N)\ y= 0$ when $\bf x,y,z$ are coplanar and $\bf N$ is a unit normal vector


Prove that if $\mathbf{x},\mathbf{y},\mathbf{z} \in \mathbb{R}^3$ are coplanar vectors and $\mathbf{N}$ is a unit normal vector to the plane then $$(\mathbf{x}\times\mathbf{y} \cdot \mathbf{N})\ \mathbf{z} + (\mathbf{y}\times\mathbf{z} \cdot \mathbf{N})\ \mathbf{x} + (\mathbf{z}\times\mathbf{x} \cdot \mathbf{N})\ \mathbf{y}=\mathbf{0}.$$

This is an elementary identity involving cross products which is used in the proof of the Gauss-Bonnet Theorem and whose proof was left as an exercise. I've tried it unsuccessfully. Initially I tried writing $\mathbf{N}=\frac{\mathbf{x}\times\mathbf{y}}{\| \mathbf{x}\times\mathbf{y}\|}=\frac{\mathbf{y}\times\mathbf{z}}{\| \mathbf{y}\times\mathbf{z}\|}=\frac{\mathbf{z}\times\mathbf{x}}{\| \mathbf{z}\times\mathbf{x}\|}$ and substituting into the equation to get $\| \mathbf{x}\times\mathbf{y}\|\,\mathbf{z} +\| \mathbf{y}\times\mathbf{z}\|\,\mathbf{x}+\| \mathbf{z}\times\mathbf{x}\|\,\mathbf{y}=\mathbf{0}$, but then I realised these terms are only correct up to $\pm$ signs. You could write the norms in terms of sines of angles and divide by norms to get unit vectors with coefficients $\sin\theta,\sin\psi,\sin(\theta+\psi)$ (or $2\pi -(\theta+\psi)$, I suppose), but I don't know what to do from there, especially when the terms are only correct up to sign. Any hints on how to prove this identity? Perhaps there is a clever trick to it, but I can't see it.

Edit: Maybe writing $\mathbf{z}=\lambda\mathbf{x}+\mu\mathbf{y}$ will help.

7 Answers

BEST ANSWER

Here's an observation: If $Q$ is a rotation matrix, then $$ (Qx) \times (Qy) = Q(x \times y) $$

You have to prove that, of course, but it's not too tough. Similarly, $$ (Qx) \cdot (Qy) = x \cdot y $$ and, for a scalar $\alpha$, we have $$ Q (\alpha x) = \alpha (Q x) $$

Now suppose that for some vector $v$, we have $$ (\mathbf{x}\times\mathbf{y} \cdot \mathbf{N})\ \mathbf{z} + (\mathbf{y}\times\mathbf{z} \cdot \mathbf{N})\ \mathbf{x} + (\mathbf{z}\times\mathbf{x} \cdot \mathbf{N})\ \mathbf{y}=\mathbf{v}. $$

Key idea 1: Using the rules above, show that for any rotation matrix $Q$, applying $Q$ to every vector on the left-hand side produces $Qv$ on the right.

Key idea 2: You can choose $Q$ so that it takes $N$ to the vector $(0,0,1)$, and puts $x, y,$ and $z$ into the plane consisting of vectors of the form $(a, b, 0)$. And in that plane, it's easy to see that you get $0$, so $Qv = 0$. Hence $v = 0$, and you're done.

In short: by a change of basis, you can assume that $N$ is the vector $(0,0,1)$ and that the other vectors all lie in the $(a, b, 0)$ plane, and things get easy.
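Both ingredients of this argument, the rotation equivariance of the cross product and the identity itself, are easy to sanity-check numerically. Here is a small NumPy sketch (not part of the original answer; the variable names mirror the statement):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random plane: unit normal N, and orthonormal u, w spanning the plane.
N = rng.normal(size=3)
N /= np.linalg.norm(N)
u = np.cross(N, rng.normal(size=3))
u /= np.linalg.norm(u)
w = np.cross(N, u)

# Three coplanar vectors in span{u, w}.
x, y, z = (rng.normal() * u + rng.normal() * w for _ in range(3))

# Rotation equivariance of the cross product: (Qx) x (Qy) = Q(x x y).
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] = -Q[:, 0]          # force det(Q) = +1 so Q is a rotation

print(np.allclose(np.cross(Q @ x, Q @ y), Q @ np.cross(x, y)))  # True

# The identity itself: v should be the zero vector.
v = (np.dot(np.cross(x, y), N) * z
     + np.dot(np.cross(y, z), N) * x
     + np.dot(np.cross(z, x), N) * y)
print(np.allclose(v, 0))  # True
```

Note that equivariance needs $\det Q = +1$: for a reflection ($\det Q = -1$) the cross product picks up a sign flip.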

ANSWER

Writing $x=a\hat{i}+b\hat{j},\,y=c\hat{i}+d\hat{j},\,z=e\hat{i}+f\hat{j},\,N=\hat{k}$ reduces the sum to $$((ad-bc)(e\hat{i}+f\hat{j})+(cf-de)(a\hat{i}+b\hat{j})+(be-af)(c\hat{i}+d\hat{j})).$$The $\hat{i}$ coefficient is $ade-bce+acf-ade+bce-acf=0$. The $\hat{j}$ coefficient can be handled similarly.
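The cancellation of both coefficients can be checked symbolically; a short SymPy sketch using the same variable names as the answer:

```python
import sympy as sp

a, b, c, d, e, f = sp.symbols('a b c d e f')

# With N = k-hat, each scalar triple product reduces to a 2x2 determinant:
xy = a*d - b*c   # (x × y) · N
yz = c*f - d*e   # (y × z) · N
zx = b*e - a*f   # (z × x) · N

i_coeff = sp.expand(xy*e + yz*a + zx*c)  # coefficient of i-hat
j_coeff = sp.expand(xy*f + yz*b + zx*d)  # coefficient of j-hat
print(i_coeff, j_coeff)  # 0 0
```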

ANSWER

By the cyclic-shift property of the scalar triple product, we can rearrange the formula:

$\ \ \ \ ((\mathbf{x}\times\mathbf{y}) \cdot \mathbf{N})\ \mathbf{z} + ((\mathbf{y}\times\mathbf{z}) \cdot \mathbf{N})\ \mathbf{x} + ((\mathbf{z}\times\mathbf{x}) \cdot \mathbf{N})\ \mathbf{y} \\ =((\mathbf{N}\times\mathbf{x}) \cdot \mathbf{y})\ \mathbf{z} + ((\mathbf{N}\times\mathbf{y}) \cdot \mathbf{z})\ \mathbf{x} + ((\mathbf{N}\times\mathbf{z}) \cdot \mathbf{x})\ \mathbf{y} $

All the cross-product vectors $$v_1=(\mathbf{N}\times\mathbf{x}),\ v_2=(\mathbf{N}\times\mathbf{y}),\ v_3=(\mathbf{N}\times\mathbf{z})$$
lie in the plane of the coplanar vectors $\mathbf{x},\mathbf{y},\mathbf{z}$; they are simply $\mathbf{x},\mathbf{y},\mathbf{z}$ rotated by $\pi/2$ within this plane.

So we can restrict ourselves to this plane and take arbitrary vectors with components $\mathbf{x}=[ x_1 \ \ x_2]^T,\mathbf{y}=[ y_1 \ \ y_2]^T,\mathbf{z} =[ z_1 \ \ z_2]^T$.

Transform them with the rotation matrix $R=\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$, calculate the appropriate dot products, and finally check the formula with these general components.

Namely we need to calculate: $$(y^TRx)z+(z^TRy)x+(x^TRz)y$$
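This final planar expression can be spot-checked numerically for random 2D vectors; a NumPy sketch (not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(1)
R = np.array([[0, -1],
              [1,  0]])            # rotation by pi/2 in the plane

x, y, z = rng.normal(size=(3, 2))  # arbitrary 2D vectors

# (y^T R x) z + (z^T R y) x + (x^T R z) y should vanish identically.
v = (y @ R @ x) * z + (z @ R @ y) * x + (x @ R @ z) * y
print(np.allclose(v, 0))  # True
```

Note $y^T R x = x_1 y_2 - x_2 y_1 = \det[\mathbf{x}\ \mathbf{y}]$, so this is the classical planar identity $\det[\mathbf{x}\,\mathbf{y}]\,\mathbf{z}+\det[\mathbf{y}\,\mathbf{z}]\,\mathbf{x}+\det[\mathbf{z}\,\mathbf{x}]\,\mathbf{y}=\mathbf{0}$.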

ANSWER

If what is required is only to prove the validity of the given identity, there is another approach. Observe that if $x$ and $y$ are linearly dependent, i.e. $x=c y$ or $y=c x$ for some $c$, then the identity holds trivially because $w\times v =-(v\times w)$ and $v \times v=0$ for all $v,w$. Thus, we may assume $x$ and $y$ are linearly independent, and hence $z$ is a linear combination of $x$ and $y$, that is, $z=ax+by$ for some $a,b$. Now, since the given identity is linear in each variable and it holds for both $z=x$ and $z=y$, it is also true for $z=ax+by$. This proves the identity. It can also be noted that $N$ being perpendicular to the plane containing $x,y,z$ plays no role in this proof.
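This argument can be replayed symbolically: substitute $z = a\,x + b\,y$ with fully general components and expand. A SymPy sketch (the variable names are mine, and, matching the remark above, $N$ is an arbitrary vector, not necessarily a unit normal):

```python
import sympy as sp

x = sp.Matrix(sp.symbols('x1 x2 x3'))
y = sp.Matrix(sp.symbols('y1 y2 y3'))
n = sp.Matrix(sp.symbols('n1 n2 n3'))   # N need not even be normal to the plane
a, b = sp.symbols('a b')
z = a*x + b*y                           # z in the span of x and y

expr = (x.cross(y).dot(n))*z + (y.cross(z).dot(n))*x + (z.cross(x).dot(n))*y
print(sp.expand(expr) == sp.zeros(3, 1))  # True
```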

ANSWER

Since $\bf x, \bf y, \bf z$ are coplanar, they are linearly dependent. Since the result to be proved is symmetric in $\bf x, \bf y, \bf z$, without loss of generality we can write $\bf z = \lambda \bf x + \mu \bf y$ for some scalars $\lambda, \mu$.

Now, $$\begin{align} & (\bf y \times \bf z \cdot \bf N)\; \bf x \\ =\ & (\bf y \times (\lambda \bf x + \mu \bf y) \cdot \bf N)\; \bf x \\ =\ & (\bf y \times \lambda \bf x \cdot \bf N)\; \bf x \\ =\ & (\bf y \times \bf x \cdot \bf N)\, (\lambda \bf x) \end{align}$$ and similarly $$\begin{align} & (\bf z \times \bf x \cdot \bf N)\; \bf y \\ =\ & (\bf y \times \bf x \cdot \bf N)\, (\mu \bf y) \end{align}$$ So $$\begin{align} & (\bf y \times \bf z \cdot \bf N)\; \bf x + (\bf z \times \bf x \cdot \bf N)\; \bf y \\ =\ & (\bf y \times \bf x \cdot \bf N)\, (\lambda \bf x + \mu \bf y)\\ =\ & -(\bf x \times \bf y \cdot \bf N)\; \bf z \end{align} $$ and the result follows.
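The key intermediate equality, that the $\mathbf{y},\mathbf{z}$ and $\mathbf{z},\mathbf{x}$ terms collapse onto $-(\mathbf{x}\times\mathbf{y}\cdot\mathbf{N})\,\mathbf{z}$, can be confirmed symbolically with SymPy (a sketch with my own variable names):

```python
import sympy as sp

x = sp.Matrix(sp.symbols('x1 x2 x3'))
y = sp.Matrix(sp.symbols('y1 y2 y3'))
n = sp.Matrix(sp.symbols('n1 n2 n3'))
lam, mu = sp.symbols('lambda mu')
z = lam*x + mu*y                     # the substitution from the answer

# (y x z . N) x + (z x x . N) y should equal -(x x y . N) z.
left = (y.cross(z).dot(n))*x + (z.cross(x).dot(n))*y
right = -(x.cross(y).dot(n))*z
print(sp.expand(left - right) == sp.zeros(3, 1))  # True
```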

ANSWER

If you know a little exterior algebra, you can see this almost immediately, and in a way that generalizes substantially.

Pick any plane $\Pi$ containing ${\bf x}, {\bf y}, {\bf z}$. The map on $\Pi$ defined by $$({\bf a}, {\bf b}, {\bf c}) \mapsto [({\bf a} \times {\bf b}) \cdot {\bf N}] {\bf c} + [({\bf b} \times {\bf c}) \cdot {\bf N}] {\bf a} + [({\bf c} \times {\bf a}) \cdot {\bf N}] {\bf b}$$ is visibly trilinear and totally skew in its arguments, so it is a (vector-valued) $3$-form on a $2$-dimensional vector space and hence is the zero map.

NB this argument doesn't use any properties of $\bf N$.
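The two properties invoked here, trilinearity and total skewness, can be confirmed mechanically. A SymPy sketch (the helper `T` is my own name for the map in the answer; swapping any two arguments flips the sign, and trilinearity is evident from the formula):

```python
import sympy as sp

def T(a, b, c, n):
    """The trilinear map from the answer; n plays the role of N."""
    return (a.cross(b).dot(n))*c + (b.cross(c).dot(n))*a + (c.cross(a).dot(n))*b

a = sp.Matrix(sp.symbols('a1 a2 a3'))
b = sp.Matrix(sp.symbols('b1 b2 b3'))
c = sp.Matrix(sp.symbols('c1 c2 c3'))
n = sp.Matrix(sp.symbols('n1 n2 n3'))
zero = sp.zeros(3, 1)

# Skew under swapping adjacent arguments (these two swaps generate all of S_3):
print(sp.expand(T(a, b, c, n) + T(b, a, c, n)) == zero)  # True
print(sp.expand(T(a, b, c, n) + T(a, c, b, n)) == zero)  # True
```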

ANSWER

Another approach to the problem uses the determinant formula for the scalar triple product.

$ \mathbf{a}\cdot(\mathbf{b}\times \mathbf{c}) = \det \begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \\ \end{bmatrix} $

Then consider the determinant

$\begin{vmatrix} n_1 & x_1 & y_1 & z_1 \\ n_2 & x_2 & y_2 & z_2 \\ n_3 & x_3 & y_3 & z_3 \\ n_1 & x_1 & y_1 & z_1 \end{vmatrix} $

where columns consist of vectors $ \mathbf{N} ,\mathbf{x},\mathbf{y},\mathbf{z}$ components (the fourth row repeats the first one).

Of course, such a determinant equals $0$, since two of its rows are equal.
Expanding the determinant along the fourth row, we obtain:

$-n_1\begin{vmatrix} x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \\ x_3 & y_3 & z_3 \\ \end{vmatrix} +x_1\begin{vmatrix} n_1 & y_1 & z_1 \\ n_2 & y_2 & z_2 \\ n_3 & y_3 & z_3 \\ \end{vmatrix} -y_1\begin{vmatrix} n_1 & x_1 & z_1 \\ n_2 & x_2 & z_2 \\ n_3 & x_3 & z_3 \\ \end{vmatrix} +z_1\begin{vmatrix} n_1 & x_1 & y_1 \\ n_2 & x_2 & y_2 \\ n_3 & x_3 & y_3 \\ \end{vmatrix}=0$

from which the formula for the first component of the vector given in the question follows

(the first summand equals $0$ because the vectors $\mathbf{x},\mathbf{y},\mathbf{z}$ are coplanar; the columns can be permuted where needed, as for the third summand, to produce the appropriate signs)

Similarly the determinants

$\begin{vmatrix} n_1 & x_1 & y_1 & z_1 \\ n_2 & x_2 & y_2 & z_2 \\ n_3 & x_3 & y_3 & z_3 \\ n_2 & x_2 & y_2 & z_2 \end{vmatrix} $ and $\begin{vmatrix} n_1 & x_1 & y_1 & z_1 \\ n_2 & x_2 & y_2 & z_2 \\ n_3 & x_3 & y_3 & z_3 \\ n_3 & x_3 & y_3 & z_3 \end{vmatrix} $

give the second and third components of the vector in question, each equal to $0$.
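Note that the cofactor expansion above is an identity for completely arbitrary vectors; coplanarity is needed only to kill the first summand. A NumPy spot-check of the expansion itself (no coplanarity assumed; the helper `det3` is mine):

```python
import numpy as np

rng = np.random.default_rng(2)
n, x, y, z = rng.normal(size=(4, 3))   # four arbitrary vectors

def det3(a, b, c):
    """Determinant with a, b, c as columns, i.e. the triple product a . (b x c)... 
    here used as det[a b c]."""
    return np.linalg.det(np.column_stack([a, b, c]))

# Expansion along the repeated fourth row of the singular 4x4 determinant:
lhs = (-n[0]*det3(x, y, z) + x[0]*det3(n, y, z)
       - y[0]*det3(n, x, z) + z[0]*det3(n, x, y))
print(abs(lhs) < 1e-10)  # True
```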