I'm trying to prove the following identity using Einstein's Summation Convention:
$$\vec{\nabla}\times(\vec{\nabla}\times\vec{F})=\vec{\nabla}(\vec{\nabla}\cdot\vec{F})-\vec{\nabla}^2\vec{F}$$
where, in Cartesian coordinates, $\vec{\nabla}^2\vec{F}=(\vec{\nabla}^2F_x,\vec{\nabla}^2F_y,\vec{\nabla}^2F_z)$.
My Approach: To make calculations easier, I chose to use the following notation:
$$\partial_{x_i}\equiv\frac{\partial}{\partial x_i},\quad (x_1,x_2,x_3)\equiv(x,y,z)$$
$\delta$ is the Kronecker delta and $\varepsilon$ is the Levi-Civita symbol. I will denote vectors by capital letters with an arrow, and scalars by lowercase letters. Thus, using Einstein's notation:
$$\vec{G}=\vec{\nabla}\times\vec{F}=\varepsilon_{ijk}\partial_{x_j}F_k\vec{e}_i\implies G_c=\varepsilon_{cjk}\partial_{x_j}F_k\\ (LHS)_a=(\vec{\nabla}\times\vec{G})_a=\varepsilon_{abc}\partial_{x_b}G_c=\varepsilon_{abc}\varepsilon_{cjk}\partial_{x_b}\partial_{x_j}F_k$$
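As a sanity check on this index expression, one can verify with SymPy that $G_c=\varepsilon_{cjk}\partial_{x_j}F_k$ reproduces the familiar curl for a sample field (the field and helper names below are my own, chosen for illustration):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)
# A sample vector field, chosen arbitrarily
F = [x * y, sp.sin(z), x * z**2]

# Levi-Civita symbol for indices 0, 1, 2: +1/-1 on even/odd permutations, 0 otherwise
def eps(i, j, k):
    return (j - i) * (k - i) * (k - j) // 2

# G_c = eps_{cjk} d_j F_k, with the sums over j and k written out explicitly
G = [sum(eps(c, j, k) * sp.diff(F[k], coords[j])
         for j in range(3) for k in range(3)) for c in range(3)]

# The textbook curl, for comparison
curl = [sp.diff(F[2], y) - sp.diff(F[1], z),
        sp.diff(F[0], z) - sp.diff(F[2], x),
        sp.diff(F[1], x) - sp.diff(F[0], y)]

print([sp.simplify(a - b) for a, b in zip(G, curl)])  # [0, 0, 0]
```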
Since $\varepsilon_{abc}=\varepsilon_{cab}$, and by the contracted epsilon identity $\varepsilon_{cab}\varepsilon_{cjk}\equiv\delta_{aj}\delta_{bk}-\delta_{ak}\delta_{bj}$, we conclude that:
$$(LHS)_a=\delta_{aj}\delta_{bk}\partial_{x_b}\partial_{x_j}F_k-\delta_{ak}\delta_{bj}\partial_{x_b}\partial_{x_j}F_k$$
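The contracted epsilon identity used here is easy to sanity-check numerically; a minimal NumPy sketch (the array names are mine):

```python
import numpy as np
from itertools import permutations

# Build the rank-3 Levi-Civita tensor eps[i, j, k]
eps = np.zeros((3, 3, 3))
for i, j, k in permutations(range(3)):
    # sign of the permutation (i, j, k) of (0, 1, 2)
    eps[i, j, k] = np.sign(j - i) * np.sign(k - i) * np.sign(k - j)

delta = np.eye(3)

# Left side: sum over c of eps[c, a, b] * eps[c, j, k]
lhs = np.einsum('cab,cjk->abjk', eps, eps)

# Right side: delta_{aj} delta_{bk} - delta_{ak} delta_{bj}
rhs = np.einsum('aj,bk->abjk', delta, delta) - np.einsum('ak,bj->abjk', delta, delta)

print(np.allclose(lhs, rhs))  # True
```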
As for the RHS:
$$g=\vec{\nabla}\cdot\vec{F}=\partial_{x_j}F_j\\\vec{H}=\vec{\nabla}g=\partial_{x_a}g\vec{e}_a=\partial_{x_a}\partial_{x_j}F_j\vec{e}_a\implies H_a=\partial_{x_a}\partial_{x_j}F_j\\\vec{P}=\vec{\nabla}F_a=\partial_{x_b}F_a\vec{e}_b\implies P_b=\partial_{x_b}F_a\\\vec{R}=\vec{\nabla}^2\vec{F}\implies R_a=\vec{\nabla}^2F_a=\vec{\nabla}\cdot(\vec{\nabla}F_a)=\vec{\nabla}\cdot\vec{P}=\partial_{x_b}P_b=\partial_{x_b}^2F_a$$
Since $RHS=\vec{H}-\vec{R}$:
$$(RHS)_a=H_a-R_a=\partial_{x_a}\partial_{x_j}F_j-\partial_{x_b}^2F_a$$
In conclusion, I need to show that:
$$\delta_{aj}\delta_{bk}\partial_{x_b}\partial_{x_j}F_k-\delta_{ak}\delta_{bj}\partial_{x_b}\partial_{x_j}F_k=\partial_{x_a}\partial_{x_j}F_j-\partial_{x_b}^2F_a$$
where $j,k,b\in\left\{1,2,3\right\}$ are summation indices, and $a$ is the free index of each side (meaning $a$ is not summed over). The two sides look similar, but perhaps I made a mistake somewhere; in any case, I couldn't simplify the LHS to show that it equals the RHS.
Thank You!
Well, since this took ages to type, I feel bad deleting it, so I'll just complete the proof; maybe someone will need it someday.
Well, let's simplify the LHS. In the first (positive) term, $\delta_{aj}\delta_{bk}$ is nonzero only when $j=a$ and $k=b$, so the sums over $j$ and $k$ collapse:
$$\delta_{aj}\delta_{bk}\partial_{x_b}\partial_{x_j}F_k=\partial_{x_b}\partial_{x_a}F_b=\partial_{x_a}\partial_{x_b}F_b=\partial_{x_a}\partial_{x_j}F_j$$
where the mixed partials commute, and the summation index $b$ was relabeled $j$ in the last step.
Similarly, $\delta_{ak}\delta_{bj}$ is nonzero only when $k=a$ and $j=b$, so the second (negative) term collapses:
$$-\delta_{ak}\delta_{bj}\partial_{x_b}\partial_{x_j}F_k=-\partial_{x_b}\partial_{x_b}F_a=-\partial_{x_b}^2F_a$$
Summing them up we get:
$$(LHS)_a=\partial_{x_a}\partial_{x_j}F_j-\partial_{x_b}^2F_a$$
And this completes the proof.
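For good measure, the full vector identity can also be verified symbolically; a minimal SymPy sketch using an arbitrary sample field (all names below are mine):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)
# An arbitrary smooth vector field F, chosen for illustration
F = sp.Matrix([x**2 * sp.sin(y), y * z**3, sp.exp(x) * z])

def curl(V):
    return sp.Matrix([
        sp.diff(V[2], y) - sp.diff(V[1], z),
        sp.diff(V[0], z) - sp.diff(V[2], x),
        sp.diff(V[1], x) - sp.diff(V[0], y),
    ])

# grad(div F)
div_F = sum(sp.diff(F[i], coords[i]) for i in range(3))
grad_div = sp.Matrix([sp.diff(div_F, c) for c in coords])

# Componentwise (vector) Laplacian of F
vec_lap = sp.Matrix([sum(sp.diff(F[i], c, 2) for c in coords) for i in range(3)])

lhs = curl(curl(F))
rhs = grad_div - vec_lap
print(sp.simplify(lhs - rhs))  # zero 3x1 matrix
```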