I am reading Richard P. Feynman's book.
In this book, Feynman states the following identity without proof.
$$\mathbf{A}\times (\mathbf{B}\times\mathbf{C})=\mathbf{B}(\mathbf{A}\cdot\mathbf{C})-\mathbf{C}(\mathbf{A}\cdot\mathbf{B}).$$
I am not interested in a rigorous proof.
I am interested in a visual or intuitive proof.
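(A numerical check is of course not a proof, but here is a quick pure-Python sanity check of the identity, using arbitrary test vectors:)

```python
# Numerical sanity check of A x (B x C) = B(A.C) - C(A.B)
# using plain Python lists as 3-vectors (test vectors chosen arbitrarily).

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

A = [1.0, 2.0, 3.0]
B = [-2.0, 0.5, 1.0]
C = [4.0, -1.0, 2.0]

lhs = cross(A, cross(B, C))
# RHS: B(A.C) - C(A.B), componentwise
rhs = [B[i] * dot(A, C) - C[i] * dot(A, B) for i in range(3)]

assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
```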
My attempt:
Visually, it is obvious that $\mathbf{A}\times (\mathbf{B}\times\mathbf{C})$ is a linear combination of $\mathbf{B}$ and $\mathbf{C}$: it is perpendicular to $\mathbf{B}\times\mathbf{C}$, so it lies in the plane spanned by $\mathbf{B}$ and $\mathbf{C}$.
So, we can write $\mathbf{A}\times (\mathbf{B}\times\mathbf{C})=\beta\mathbf{B}+\gamma\mathbf{C}$ for some real numbers $\beta,\gamma$.
Taking the dot product with $\mathbf{A}$ gives
$0=\mathbf{A}\cdot (\mathbf{A}\times (\mathbf{B}\times\mathbf{C}))=\beta\,\mathbf{A}\cdot\mathbf{B}+\gamma\,\mathbf{A}\cdot\mathbf{C}.$
We assume that $\mathbf{A}\cdot\mathbf{B}\neq 0$ or $\mathbf{A}\cdot\mathbf{C}\neq 0$.
Then, we can write $\beta=t(\mathbf{A}\cdot\mathbf{C})$ and $\gamma=-t(\mathbf{A}\cdot\mathbf{B})$ for some real number $t$.
I want to show visually or intuitively that $t=1$.
Another attempt (I have since noticed this attempt is wrong: $\beta\neq\frac{(\mathbf{B}\times\mathbf{A})\cdot (\mathbf{B}\times\mathbf{C})}{|\mathbf{B}|^2}$. I think $\beta=\frac{(\mathbf{B}'\times\mathbf{A})\cdot (\mathbf{B}'\times\mathbf{C})}{|\mathbf{B}'|^2}$ is right; please see my answer below.):
$\frac{\mathbf{B}\cdot (\mathbf{A}\times (\mathbf{B}\times\mathbf{C}))}{|\mathbf{B}|}=|\mathbf{A}\times (\mathbf{B}\times\mathbf{C})|\cos\theta,$ where $\theta$ is the angle between $\mathbf{B}$ and $\mathbf{A}\times (\mathbf{B}\times\mathbf{C})$.
So, $\beta=\frac{\mathbf{B}\cdot (\mathbf{A}\times (\mathbf{B}\times\mathbf{C}))}{|\mathbf{B}|^2}=\frac{(\mathbf{B}\times\mathbf{A})\cdot (\mathbf{B}\times\mathbf{C})}{|\mathbf{B}|^2}$. (I used the equality $\mathbf{A}\cdot (\mathbf{B}\times\mathbf{C})=(\mathbf{A}\times\mathbf{B})\cdot\mathbf{C}$ which is visually or intuitively obvious.)
I want to show visually or intuitively that $\frac{(\mathbf{B}\times\mathbf{A})\cdot (\mathbf{B}\times\mathbf{C})}{|\mathbf{B}|^2}=\mathbf{A}\cdot\mathbf{C}$.
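(The swap identity $\mathbf{A}\cdot (\mathbf{B}\times\mathbf{C})=(\mathbf{A}\times\mathbf{B})\cdot\mathbf{C}$ used above can at least be confirmed numerically; a pure-Python check with arbitrary test vectors:)

```python
# Numerical check of the scalar triple product identity
# A . (B x C) = (A x B) . C  (both equal the signed volume of the
# parallelepiped spanned by A, B, C).

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

A = [1.0, 2.0, 3.0]
B = [-1.0, 0.5, 2.0]
C = [4.0, -2.0, 1.0]

assert abs(dot(A, cross(B, C)) - dot(cross(A, B), C)) < 1e-12
```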


If $B$ and $C$ are linearly dependent, then $B \times C = 0,$ so the LHS of the identity to be proved is $0$.
Also, if $B$ and $C$ are linearly dependent, the RHS of the identity is $0.$ For example, if $C = \lambda B,$ then $$ B(A \cdot C) - C(A \cdot B) = B(A \cdot (\lambda B)) - (\lambda B)(A \cdot B) = \lambda B (A \cdot B) - \lambda B (A \cdot B) = 0. $$
Suppose now that $B$ and $C$ are linearly independent. Then $A$ is a linear combination of $B, C,$ and $B \times C.$
Both sides of the identity are linear in $A,$ so it is enough to prove it when $A$ is equal to $B, C,$ or $B \times C.$
If $A = B \times C,$ then $A \times (B \times C) = 0$ and $A \cdot B = 0$ and $A \cdot C = 0,$ so the identity holds.
When $A = B,$ the identity reduces to $$ \label{4726365:eq:1}\tag{1} B \times (B \times C) = B(B \cdot C) - \|B\|^2C. $$ If \eqref{4726365:eq:1} holds, then swapping the roles of $B$ and $C$ in it gives $$ C \times (B \times C) = - C \times (C \times B) = - C(C \cdot B) + \|C\|^2B, $$ which is the identity for the case $A = C.$
So it is enough to prove \eqref{4726365:eq:1}.
Let $\theta$ be the angle between $B$ and $C.$ Then $B \times (B \times C)$ is in the plane of $B$ and $C,$ is orthogonal to $B,$ and has magnitude $\|B\|^2\|C\||\sin\theta|.$ Therefore, it is equal to $\pm\|B\|^2$ times the component of $C$ orthogonal to $B.$
To resolve the ambiguity of sign, we need to be more precise about the angle $\theta.$ We may assume without loss of generality that $0 < \theta < \pi,$ and the vectors $B, C, B \times C$ in that order form a right-handed system. Regarding $C$ as lying at angle $\theta$ anticlockwise from $B,$ and $B \times C$ as pointing vertically upwards, we see that $B \times (B \times C)$ points towards the opposite side of $B$ from $C.$
Therefore $B \times (B \times C)$ is equal to $-\|B\|^2$ times the component of $C$ orthogonal to $B.$
But the RHS of \eqref{4726365:eq:1} is $-\|B\|^2C$ plus $\|B\|^2$ times the component of $C$ parallel to $B.$ Therefore \eqref{4726365:eq:1} holds. $\ \square$
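(Both equation (1) above and the geometric description of $B \times (B \times C)$ as $-\|B\|^2$ times the component of $C$ orthogonal to $B$ can be sanity-checked numerically; a pure-Python sketch with arbitrary test vectors:)

```python
# Numerical check of  B x (B x C) = B(B.C) - |B|^2 C,  and of the
# geometric claim: B x (B x C) = -|B|^2 * (component of C orthogonal to B).

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

B = [1.0, -2.0, 0.5]
C = [3.0, 1.0, -1.0]

lhs = cross(B, cross(B, C))

# RHS of (1): B(B.C) - |B|^2 C
b2 = dot(B, B)
rhs = [B[i] * dot(B, C) - b2 * C[i] for i in range(3)]
assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))

# Component of C orthogonal to B: C - proj_B(C)
proj = [B[i] * dot(B, C) / b2 for i in range(3)]
c_perp = [C[i] - proj[i] for i in range(3)]
geom = [-b2 * cp for cp in c_perp]
assert all(abs(l - g) < 1e-9 for l, g in zip(lhs, geom))
```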