I'd like to know basic rules of inference in proofs of vector identities based on geometric algebra that avoid bases and coordinates.
For example how to prove the identity:
$$ \operatorname{Rot} \left(fA\right)=\left(\operatorname{Grad} f \right)\times A+ f \operatorname{Rot} A $$
And what are some general tips, rules and tricks that come up in other such proofs?
This has very little to do with geometric algebra; the main tools are the overdot notation for marking what $\nabla$ differentiates, together with the rule (explained below) that $\nabla$ may be substituted into any generic, linear vector slot.
As one more example, we derive an expression for $\nabla\times(A\times B)$. Recall that we have the vector identity $$ u\times(v\times w) = (u\cdot w)v - (u\cdot v)w. $$ Using this we see $$\begin{aligned} \dot\nabla\times(\dot A\times\dot B) &= (\dot\nabla\cdot\dot B)\dot A - (\dot\nabla\cdot\dot A)\dot B \\ &= (\dot\nabla\cdot\dot B)A + (\dot\nabla\cdot B)\dot A - (\dot\nabla\cdot\dot A)B - (\dot\nabla\cdot A)\dot B \\ &= (\nabla\cdot B)A + (B\cdot\nabla)A - (\nabla\cdot A)B - (A\cdot\nabla)B. \end{aligned}$$ In the last step, we have dropped the dots and used the standard convention of "differentiate directly to the right".
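The final identity can be spot-checked numerically. Below is a small sympy sketch (the fields $A$ and $B$ are arbitrary choices made up for the test, not anything canonical):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
R = (x, y, z)

# Arbitrary test fields, chosen only for this spot check
A = sp.Matrix([x*y, z**2, x + z])
B = sp.Matrix([sp.sin(x), y*z, x**2])

def div(F):
    # divergence: sum_i dF_i/dx_i
    return sum(sp.diff(F[i], R[i]) for i in range(3))

def curl(F):
    return sp.Matrix([
        sp.diff(F[2], y) - sp.diff(F[1], z),
        sp.diff(F[0], z) - sp.diff(F[2], x),
        sp.diff(F[1], x) - sp.diff(F[0], y),
    ])

def dir_deriv(F, G):
    # (F . nabla) G, componentwise: sum_i F_i dG_j/dx_i
    return sp.Matrix([sum(F[i]*sp.diff(G[j], R[i]) for i in range(3))
                      for j in range(3)])

lhs = curl(A.cross(B))
rhs = div(B)*A + dir_deriv(B, A) - div(A)*B - dir_deriv(A, B)
print(sp.simplify(lhs - rhs))  # zero vector
```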
Elaboration and Well-Definedness
The operator $\nabla$ together with overdot notation satisfies all generic and linear properties that vectors satisfy. Generic means the property has to be true for all vectors; linear means both sides are linear functions of the vector being replaced. You can justify this with index notation: we simply start with a tensor identity and then contract each side with $\partial_i$. Alternatively, see the third section of my answer here. Note that that answer also includes a chain rule which I have not covered here.
Let's use the $u\times(v\times w)$ identity in my answer as an example. It is generic in $u$ since it is true for any $u$, and it is linear since both sides of the identity are linear functions of $u$. Thus we may replace $u$ with $\dot\nabla$ and arrive at a valid identity; in fact we get four identities: $$ \dot\nabla\times(v\times w) = (\dot\nabla\cdot w)v - (\dot\nabla\cdot v)w, \tag1 $$$$ \dot\nabla\times(\dot v\times w) = (\dot\nabla\cdot w)\dot v - (\dot\nabla\cdot\dot v)w, \tag2 $$$$ \dot\nabla\times(v \times\dot w) = (\dot\nabla\cdot\dot w)v - (\dot\nabla\cdot v)\dot w, \tag3 $$$$ \dot\nabla\times(\dot v\times\dot w) = (\dot\nabla\cdot\dot w)\dot v - (\dot\nabla\cdot\dot v)\dot w. \tag4 $$ We may consistently interpret $\dot\nabla$ with no other dotted variables as differentiating the constant $1$, or symbolically $\dot\nabla = \dot\nabla\dot 1$; thus equation (1) is simply saying $0 = 0$.
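Once the dots are resolved, equation (2) with $w$ held constant reads $\nabla\times(v\times w) = (w\cdot\nabla)v - (\nabla\cdot v)w$. Here is a quick sympy spot-check of that reading (the field $v$ and the constant $w$ are arbitrary choices for the test):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
R = (x, y, z)

v = sp.Matrix([x**2 * y, y*z, sp.exp(x)])  # arbitrary test field
w = sp.Matrix([2, -1, 3])                  # constant vector

def curl(F):
    return sp.Matrix([
        sp.diff(F[2], y) - sp.diff(F[1], z),
        sp.diff(F[0], z) - sp.diff(F[2], x),
        sp.diff(F[1], x) - sp.diff(F[0], y),
    ])

div_v = sum(sp.diff(v[i], R[i]) for i in range(3))
# (w . nabla) v, componentwise
w_dot_grad_v = sp.Matrix([sum(w[i]*sp.diff(v[j], R[i]) for i in range(3))
                          for j in range(3)])

lhs = curl(v.cross(w))
rhs = w_dot_grad_v - div_v * w
print(sp.simplify(lhs - rhs))  # zero vector
```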
In index notation the vector identity is $$ \epsilon^i_{jk}\epsilon^k_{lm}u^jv^lw^m = u^jw_jv^i - u^jv_jw^i. $$ Each of the equations above are then saying $$ \epsilon^i_{jk}\epsilon^k_{lm}(\partial^j1)v^lw^m = (\partial^j1)w_jv^i - (\partial^j1)v_jw^i, \tag1 $$$$ \epsilon^i_{jk}\epsilon^k_{lm}(\partial^jv^l)w^m = w_j(\partial^jv^i) - (\partial^jv_j)w^i, \tag2 $$$$ \epsilon^i_{jk}\epsilon^k_{lm}v^l(\partial^jw^m) = (\partial^jw_j)v^i - v_j(\partial^jw^i), \tag3 $$$$ \epsilon^i_{jk}\epsilon^k_{lm}\partial^j(v^lw^m) = \partial^j(w_jv^i) - \partial^j(v_jw^i). \tag4 $$
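The underlying index identity $\epsilon^i_{jk}\epsilon^k_{lm}u^jv^lw^m = (u\cdot w)v^i - (u\cdot v)w^i$ can itself be verified numerically with an einsum contraction (random vectors, purely as a sanity check):

```python
import numpy as np

rng = np.random.default_rng(0)

# Levi-Civita symbol eps[i, j, k]
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

u, v, w = rng.standard_normal((3, 3))

# epsilon^i_{jk} epsilon^k_{lm} u^j v^l w^m
lhs = np.einsum('ijk,klm,j,l,m->i', eps, eps, u, v, w)
rhs = np.dot(u, w) * v - np.dot(u, v) * w

print(np.allclose(lhs, rhs))  # True
```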
Higher derivatives are much the same, but each derivative needs to separately track what it differentiates. An expression like $\dot\nabla\cdot\dot\nabla(\dot u\cdot\dot v)$ makes no sense, since there is no way to tell which $\nabla$ differentiates which factor. We will use a check $\check\nabla$ exactly like an overdot, but having both symbols allows us to write all the possible (non-trivial) second derivatives: $$ \check\nabla\cdot\dot\nabla(\check{\dot u}\cdot v),\quad \check\nabla\cdot\dot\nabla(\dot u\cdot\check v),\quad \check\nabla\cdot\dot\nabla(\check u\cdot\dot v),\quad \check\nabla\cdot\dot\nabla(u\cdot\check{\dot v}),\quad \check\nabla\cdot\dot\nabla(\dot u\cdot\check{\dot v}),\quad \check\nabla\cdot\dot\nabla(\check u\cdot\check{\dot v}),\quad \check\nabla\cdot\dot\nabla(\check{\dot u}\cdot\dot v),\quad \check\nabla\cdot\dot\nabla(\check{\dot u}\cdot\check v),\quad \check\nabla\cdot\dot\nabla(\check{\dot u}\cdot\check{\dot v}). $$ The last expression is the Laplacian $\nabla^2(u\cdot v)$. The fact that partial derivatives commute means that e.g. $\check{\dot u} = \dot{\check u}$.
The rule for higher derivatives is the same as first derivatives: each $\nabla$ must appear in a generic, linear slot. Continuing with our example $$ u\times(v\times w) = (u\cdot w)v - (u\cdot v)w, $$ we see $$ \check\nabla\times(\dot\nabla\times\check{\dot w}) = (\check\nabla\cdot\check{\dot w})\dot\nabla - (\check\nabla\cdot\dot\nabla)(\check{\dot w}) = \nabla(\nabla\cdot w) - \nabla^2w. $$
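The resulting curl-of-curl identity $\nabla\times(\nabla\times w) = \nabla(\nabla\cdot w) - \nabla^2 w$ admits the same kind of numerical spot-check (the test field $w$ is again an arbitrary choice):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
R = (x, y, z)

w = sp.Matrix([x*y*z, sp.sin(y)*z, x**2 + z**3])  # arbitrary test field

def curl(F):
    return sp.Matrix([
        sp.diff(F[2], y) - sp.diff(F[1], z),
        sp.diff(F[0], z) - sp.diff(F[2], x),
        sp.diff(F[1], x) - sp.diff(F[0], y),
    ])

div_w = sum(sp.diff(w[i], R[i]) for i in range(3))
grad_div_w = sp.Matrix([sp.diff(div_w, c) for c in R])
# componentwise Laplacian: sum of second derivatives of each component
laplacian_w = sp.Matrix([sum(sp.diff(w[j], c, 2) for c in R) for j in range(3)])

lhs = curl(curl(w))
rhs = grad_div_w - laplacian_w
print(sp.simplify(lhs - rhs))  # zero vector
```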