Using geometric algebra, one may define the multivector derivative $∂_X$ with respect to a general multivector $X$ as $$ ∂_X ≔ \sum_J \mathbf e^J (\mathbf e_J * ∂_X) $$ where each “component” $\mathbf e_J * ∂_X$ is defined by $$ (\mathbf e_J * ∂_X)f(X) ≔ \frac{\mathrm{d}}{\mathrm{d}\tau}f(X + \tau\mathbf e_J)\big|_{\tau=0} .$$
Notation
- $A * B \equiv ⟨AB⟩_0$ denotes the scalar product. For any multivector $A$, we have $\mathbf e^J(\mathbf e_J * A) = A$. In the literature, parentheses are often dropped with the understanding that $A * B\, C ≡ (A * B)C$.
- We employ multi-index notation, $\mathbf e_J = \mathbf e_{j_1}∧\cdots∧\mathbf e_{j_k}$. (If $k = 0$ then $\mathbf e_J = 1$.) Reciprocal bases are reversed, $\mathbf e^J = \mathbf e^{j_k}∧\cdots∧\mathbf e^{j_1}$, so that $\mathbf e^I * \mathbf e_J = δ^I_J$ is always satisfied.
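As a concrete sanity check of the reciprocal-basis convention above, here is a throwaway script assuming an orthonormal *Euclidean* basis, in which case $\mathbf e^J$ is simply the reverse of $\mathbf e_J$. Basis blades are encoded as bitmasks (bit $i$ set $\Leftrightarrow$ $\mathbf e_i$ is a factor); the encoding and the names `gp_blades`, `rev_sign` are my own, not from the references.

```python
# Check e^I * e_J = δ^I_J for an orthonormal Euclidean basis,
# where the reciprocal blade e^J is just the reverse of e_J.
N = 3  # dimension of the underlying vector space

def gp_blades(a, b):
    """(sign, bitmask) of the geometric product of two basis blades
    given as bitmasks; repeated indices square to +1 (Euclidean)."""
    s, t = 0, a >> 1
    while t:                          # count transpositions to merge the factors
        s += bin(t & b).count("1")
        t >>= 1
    return (-1 if s & 1 else 1), a ^ b

def rev_sign(a):
    """Sign (-1)^(k(k-1)/2) picked up by reversing a grade-k basis blade."""
    k = bin(a).count("1")
    return -1 if (k * (k - 1) // 2) % 2 else 1

for I in range(2**N):
    for J in range(2**N):
        s, r = gp_blades(I, J)
        scalar = rev_sign(I) * s if r == 0 else 0   # <e^I e_J>_0
        assert scalar == (1 if I == J else 0)
print("e^I * e_J = delta^I_J verified for all", 4**N, "pairs")
```

The assertion passes for every pair of multi-indices, so the bitmask encoding reproduces the stated duality.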
Problem
I’m having a humiliating time trying to sanity-check this definition by verifying, e.g., $∂_X X = n$, as stated in eq. (2.29) of [2] or eq. (7.8) of [3]. My computation begins as follows. $$ ∂_X X = \sum_J \mathbf e^J (\mathbf e_J * ∂_X)X = \sum_J \mathbf e^J \frac{\mathrm{d}}{\mathrm{d}\tau} (X + \tau\mathbf e_J)\big|_{\tau=0} = \sum_J \mathbf e^J \mathbf e_J .$$ There seems to be no room for confusion here. But this is not $n$. Indeed, \begin{align} \sum_J \mathbf e^J \mathbf e_J &= \sum_{k=0}^n \sum_{j_1 < \cdots < j_k} \underbrace{\mathbf e^{j_1\cdots j_k}\,\mathbf e_{j_k\cdots j_1}}_1 = \sum_{k=0}^n \binom{n}{k} = 2^n .\end{align} This contradicts Proof 46 of [3], which includes the step “$\sum_{J_d} δ^J{}_J = d$” (a sum over multi-indices in $d$ dimensions) — which I can’t see to be true!
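I also checked this count numerically. The sketch below assumes an orthonormal Euclidean basis (so $\mathbf e^J$ is the reverse of $\mathbf e_J$) and encodes basis blades as bitmasks; the names are my own.

```python
# Count Σ_J e^J e_J over ALL multi-indices J: every term e^J e_J is +1,
# one per basis blade of the algebra, so the sum is 2^n rather than n.
N = 4  # expect 2^4 = 16

def gp_blades(a, b):
    """(sign, bitmask) of the geometric product of two basis blades."""
    s, t = 0, a >> 1
    while t:
        s += bin(t & b).count("1")
        t >>= 1
    return (-1 if s & 1 else 1), a ^ b

def rev_sign(a):
    """Sign (-1)^(k(k-1)/2) of reversing a grade-k basis blade."""
    k = bin(a).count("1")
    return -1 if (k * (k - 1) // 2) % 2 else 1

total = sum(rev_sign(J) * gp_blades(J, J)[0] for J in range(2**N))
print(total)  # 16, i.e. 2^n: one +1 per basis blade, not per basis vector
assert total == 2**N
```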
My failure is easily generalised: in trying to show $∂_X X^2 = 2X$, we have \begin{align} ∂_X X^2 &= \mathbf e^J(\mathbf e_J * ∂_X)X^2 = \mathbf e^J \frac{\mathrm{d}}{\mathrm{d}\tau} (X + \tau\mathbf e_J)^2\big|_{\tau=0} \\ &= \mathbf e^J(\mathbf e_J X + X \mathbf e_J) = 2^n X + \mathbf e^J X \mathbf e_J .\end{align}
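The last line can be cross-checked numerically too. The script below (same assumptions as before: orthonormal Euclidean basis, bitmask blades, my own names) evaluates both sides for a random multivector $X$ in $n = 3$ dimensions.

```python
import random

# Verify e^J (e_J X + X e_J) = 2^n X + e^J X e_J (sum over J implied)
# for a random multivector X, with an orthonormal Euclidean basis.
N = 3

def gp_blades(a, b):
    """(sign, bitmask) of the geometric product of two basis blades."""
    s, t = 0, a >> 1
    while t:
        s += bin(t & b).count("1")
        t >>= 1
    return (-1 if s & 1 else 1), a ^ b

def rev_sign(a):
    """Sign (-1)^(k(k-1)/2) of reversing a grade-k basis blade."""
    k = bin(a).count("1")
    return -1 if (k * (k - 1) // 2) % 2 else 1

def gp(A, B):
    """Geometric product of multivectors stored as {bitmask: coefficient}."""
    out = {}
    for a, ca in A.items():
        for b, cb in B.items():
            s, r = gp_blades(a, b)
            out[r] = out.get(r, 0) + s * ca * cb
    return out

def madd(*Ms):
    """Sum of multivectors."""
    out = {}
    for M in Ms:
        for k, v in M.items():
            out[k] = out.get(k, 0) + v
    return out

random.seed(0)
X = {J: random.uniform(-1, 1) for J in range(2**N)}   # random multivector
blade = lambda J: {J: 1.0}                            # e_J
recip = lambda J: {J: float(rev_sign(J))}             # e^J (Euclidean)

# LHS: Σ_J e^J (e_J X + X e_J), i.e. d/dτ of (X + τ e_J)² summed against e^J
lhs = madd(*(gp(recip(J), madd(gp(blade(J), X), gp(X, blade(J))))
             for J in range(2**N)))
# RHS: 2^n X + Σ_J e^J X e_J
rhs = madd({J: (2**N) * c for J, c in X.items()},
           *(gp(gp(recip(J), X), blade(J)) for J in range(2**N)))
assert all(abs(lhs.get(k, 0) - rhs.get(k, 0)) < 1e-9 for k in range(2**N))
print("e^J(e_J X + X e_J) = 2^n X + e^J X e_J confirmed")
```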
Note that it is easy to verify these results with the less general vector derivative, $\vec ∂ ≔ \mathbf e^i ∂_i$ where $∂_i = \mathbf e_i * ∂_X$ in the notation above. Then, if $X = X^i \mathbf e_i$ is the position vector, we have $∂_i X = \mathbf e_i$ and thus $ \vec ∂ X = \mathbf e^i ∂_i X = \mathbf e^i \mathbf e_i = n $ and $$ \vec ∂ X^2 = \mathbf e^i \frac{\mathrm{d}}{\mathrm{d}\tau} (X + \tau\mathbf e_i)^2\big|_{\tau=0} = \mathbf e^i(\mathbf e_i X + X \mathbf e_i) = 2\mathbf e^i \,(\mathbf e_i * X) = 2X .$$ Clearly I am misinterpreting the way in which the vector derivative $\vec ∂ = \mathbf e^i(\mathbf e_i * ∂_X)$ is ‘generalised’ to have components at all grades, $∂_X = \mathbf e^J (\mathbf e_J * ∂_X)$. Could someone with fresh eyes help me out?
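For completeness, these vector-derivative results also check out numerically under the same assumptions (orthonormal Euclidean basis, so $\mathbf e^i = \mathbf e_i$; bitmask blades; my own names):

```python
import random

# Check vec∂ X = n and vec∂ X² = 2X for a random position vector X.
N = 3

def gp_blades(a, b):
    """(sign, bitmask) of the geometric product of two basis blades."""
    s, t = 0, a >> 1
    while t:
        s += bin(t & b).count("1")
        t >>= 1
    return (-1 if s & 1 else 1), a ^ b

def gp(A, B):
    """Geometric product of multivectors stored as {bitmask: coefficient}."""
    out = {}
    for a, ca in A.items():
        for b, cb in B.items():
            s, r = gp_blades(a, b)
            out[r] = out.get(r, 0) + s * ca * cb
    return out

def madd(*Ms):
    """Sum of multivectors."""
    out = {}
    for M in Ms:
        for k, v in M.items():
            out[k] = out.get(k, 0) + v
    return out

e = [{1 << i: 1.0} for i in range(N)]                  # basis vectors e_i (= e^i)
random.seed(1)
X = {1 << i: random.uniform(-1, 1) for i in range(N)}  # position vector X = X^i e_i

# vec∂ X = e^i e_i = n
dX = madd(*(gp(e[i], e[i]) for i in range(N)))
assert dX == {0: float(N)}

# vec∂ X² = 2 e^i (e_i * X) = 2X, with A * B = <AB>_0
star = lambda A, B: gp(A, B).get(0, 0.0)
dX2 = madd(*({1 << i: 2 * star(e[i], X)} for i in range(N)))
assert all(abs(dX2[k] - 2 * X[k]) < 1e-12 for k in X)
print("vec∂ X = n and vec∂ X² = 2X confirmed for n =", N)
```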
Your computation is correct, but your understanding of “$d$-dimensional subspace” and of the function meant by $X$ in $\partial_X X$ is not. To avoid confusion, let $f(X)$ be the function whose derivative you're trying to compute.
On page 57 of Hestenes and Sobczyk, just above the list of identities (2.28a)-(2.35), it says $X$ is the identity function on a linear subspace of dimension $d$. By this they mean the orthogonal projection onto a certain $d$-dimensional subspace of the algebra. (E.g., $X ↦ ⟨X⟩_1$.) Explicitly, if the subspace is denoted by $Y$, and we pick an orthonormal basis of multivectors $Y_1, \ldots, Y_d$, then $X ↦ \sum_{k=1}^d(X*Y_k)Y_k$ is the projection function meant by $X$. Note that the $Y_i$ are basis multivectors of the algebra $G(V)$, not basis vectors $\mathbf e_i$ of a vector subspace of $V$.
In your computation, you take $Y$ to be the whole geometric algebra (on an $n$-dimensional space), so your $X$ is simply the identity function $f(X)=X$, and $Y$ has dimension $d=2^n$. This is the sense in which your computation is correct.
For the more general $X$ corresponding to a projection operator, the computation is $(\mathbf e_J * ∂_X)f(X) ≔ \frac{\mathrm{d}}{\mathrm{d}\tau}f(X + \tau\mathbf e_J)\big|_{\tau=0}=f(\mathbf e_J)$ for linear $f$, which for the above-described projection operator is explicitly $\sum_{k=1}^d(\mathbf e_J * Y_k)Y_k$.
In summary, if $G(V)$ is a geometric algebra over a vector space of dimension $\dim V = n$, then $\dim G(V) = 2^n$ and the multivector derivative of the identity is indeed $$ ∂_X X = 2^n .$$
To make contact with the vector derivative, you must include the projection onto the grade-$1$ subspace: $$ ∂_X ⟨X⟩_1 = n .$$
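This is easy to confirm numerically: the component $(\mathbf e_J * ∂_X)⟨X⟩_1 = ⟨\mathbf e_J⟩_1$ vanishes unless $\mathbf e_J$ has grade $1$, so only the $n$ terms $\mathbf e^i \mathbf e_i$ survive. A sketch, assuming an orthonormal Euclidean basis (so $\mathbf e^J$ is the reverse of $\mathbf e_J$) with my own bitmask encoding:

```python
# ∂_X <X>_1 = Σ_J e^J <e_J>_1: only grade-1 blades survive the projection,
# leaving Σ_i e^i e_i = n.
N = 5  # expect n = 5, not 2^5 = 32

def gp_blades(a, b):
    """(sign, bitmask) of the geometric product of two basis blades."""
    s, t = 0, a >> 1
    while t:
        s += bin(t & b).count("1")
        t >>= 1
    return (-1 if s & 1 else 1), a ^ b

def rev_sign(a):
    """Sign (-1)^(k(k-1)/2) of reversing a grade-k basis blade."""
    k = bin(a).count("1")
    return -1 if (k * (k - 1) // 2) % 2 else 1

total = sum(rev_sign(J) * gp_blades(J, J)[0]          # e^J e_J term
            for J in range(2**N) if bin(J).count("1") == 1)
print(total)  # 5
assert total == N
```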
Written differently, the vector and multivector derivatives are related by $\vec ∂ ≔ ⟨∂⟩_1$; $$ \vec ∂ = ⟨∂⟩_1 = ⟨\mathbf e^J (\mathbf e_J * ∂)⟩_1 = ⟨\mathbf e^J⟩_1 (\mathbf e_J * ∂) = \mathbf e^i (\mathbf e_i * ∂) ,$$ where $J$ is a multi-index and $i$ is a single index. Then we have $\vec ∂_X X = n$.
Similarly, your ‘unexpected’ result $$ ∂_X X^2 = 2^n X + \mathbf e^J X \mathbf e_J $$ is indeed correct. But by including a projection, you can easily verify the familiar results $$ \vec ∂_X X^2 = ⟨∂_X⟩_1 X^2 = 2X \quad\text{or}\quad ∂_X (⟨X⟩_1)^2 = ∂_X ⟨X^2⟩_0 = 2X $$ which you expected for the vector derivative. (In verifying these, note that $\mathbf e^i X \mathbf e_i = (2 - n)X$.)
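Both the parenthetical identity and the projected result can be spot-checked numerically. A sketch assuming an orthonormal Euclidean basis (so $\mathbf e^i = \mathbf e_i$), with my own bitmask blade encoding:

```python
import random

# Check e^i X e_i = (2 - n) X and <∂_X>_1 X² = e^i (e_i X + X e_i) = 2X
# for a random vector X.
N = 3

def gp_blades(a, b):
    """(sign, bitmask) of the geometric product of two basis blades."""
    s, t = 0, a >> 1
    while t:
        s += bin(t & b).count("1")
        t >>= 1
    return (-1 if s & 1 else 1), a ^ b

def gp(A, B):
    """Geometric product of multivectors stored as {bitmask: coefficient}."""
    out = {}
    for a, ca in A.items():
        for b, cb in B.items():
            s, r = gp_blades(a, b)
            out[r] = out.get(r, 0) + s * ca * cb
    return out

def madd(*Ms):
    """Sum of multivectors."""
    out = {}
    for M in Ms:
        for k, v in M.items():
            out[k] = out.get(k, 0) + v
    return out

e = [{1 << i: 1.0} for i in range(N)]                  # basis vectors e_i (= e^i)
random.seed(2)
X = {1 << i: random.uniform(-1, 1) for i in range(N)}  # random vector

# the hint: Σ_i e^i X e_i = (2 - n) X
hint = madd(*(gp(gp(e[i], X), e[i]) for i in range(N)))
assert all(abs(hint.get(k, 0) - (2 - N) * v) < 1e-12 for k, v in X.items())

# <∂_X>_1 X² = Σ_i e^i (e_i X + X e_i) = 2X
vec = madd(*(gp(e[i], madd(gp(e[i], X), gp(X, e[i]))) for i in range(N)))
assert all(abs(vec.get(k, 0) - 2 * v) < 1e-12 for k, v in X.items())
print("e^i X e_i = (2-n)X and <∂>_1 X² = 2X confirmed for n =", N)
```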