I'm trying to understand the exterior derivative in the simplest context that I can. I feel like I understand how an exterior derivative should behave for a function. For example, let's take a simple function:
$$ f(x^1, x^2) = 3x^1 + \sin(x^2) $$
Here we can find the exterior derivative as shown:
$$ df = {\partial f \over \partial x^1 } dx^1 + {\partial f \over \partial x^2 } dx^2 = 3\,dx^1 + \cos(x^2)\, dx^2 $$
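As a quick sanity check, the components of $df$ are just the partial derivatives, so they can be verified symbolically (a minimal sketch using sympy; the variable names are my own):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = 3*x1 + sp.sin(x2)

# df = (∂f/∂x1) dx1 + (∂f/∂x2) dx2, so compute each coefficient
df_dx1 = sp.diff(f, x1)  # 3
df_dx2 = sp.diff(f, x2)  # cos(x2)
print(df_dx1, df_dx2)
```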
Now we have a differential form, which we generated from the function.
I was wondering if we could also use the exterior derivative on a vector field in $\mathbb R^2 $ in the same way.
Question
So if I have another function defined so that:
$$ f(x^1, x^2) = (x^1 + x^2) \mathbf{e_1} + ( x^1 x^2 ) \mathbf{e_2} \quad \text{where} \quad \mathbf{e_i} = {\partial \over \partial x^i } $$
Could I still take its exterior derivative in the same way? I figured I could treat each component as a separate function, so that:
$$ f(x^1, x^2) = f^1(x^1, x^2) \mathbf{e_1} + f^2(x^1, x^2) \mathbf{e_2} $$
I can treat each component as a function and find a differential form for each, but then I realised that the basis vectors would have to come into play here.
Using the linearity of the $d$ operator (and leaving out the arguments to the functions), it should be true that:
$$ df = d( f^1 \mathbf{e_1} + f^2 \mathbf{e_2} ) = d(f^1 \mathbf{e_1}) + d(f^2 \mathbf{e_2} ) $$
Then using the Leibniz property, I'd get:
$$ d(f^1 \mathbf{e_1}) = d(f^1)\, \mathbf{e_1} + f^1\, d(\mathbf{e_1}) $$
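The $d(f^1)$ and $d(f^2)$ pieces at least are unproblematic, since each is an ordinary exterior derivative of a scalar function (a sketch using sympy, with my own variable names, just listing the coefficients of $dx^1$ and $dx^2$):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f1 = x1 + x2   # first component of the vector field
f2 = x1 * x2   # second component

# d(f^i) = (∂f^i/∂x1) dx1 + (∂f^i/∂x2) dx2; store the two coefficients
df1 = (sp.diff(f1, x1), sp.diff(f1, x2))  # (1, 1)
df2 = (sp.diff(f2, x1), sp.diff(f2, x2))  # (x2, x1)
print(df1, df2)
```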
So then the question becomes:
Does the exterior derivative of a basis vector make sense? That is, is $$ d(\mathbf{e_1}) $$ well defined? If so, could I expand it in terms of partial derivatives with respect to the $x^j$ to write: $$ d(\mathbf{e_i}) = {\partial \mathbf{e_i} \over \partial x^j } dx^j \;? $$