I have read through multiple books and watched Dr. Theodore Shifrin's video series. However, the questions that I had when going through much of the material have remained largely unanswered. They are:
(1) Notational meaning: From my understanding, a differential form is a covector that takes a vector as input and returns a number. For instance, take the 1-form $\omega = dx + 4\,dy$. Is the "mapping" to some vector $v$ implicit in the definition? That is, could we more rigorously write $\omega$ as $\omega(v) = dx(v) + 4\cdot dy(v)$? Let $v = \mathbf{i} + 2\,\mathbf{j}$. Can we intuitively put the mapping as: $$ \omega(\mathbf{i} + 2\,\mathbf{j}) = dx(\mathbf{i} + 2\,\mathbf{j}) + 4\cdot dy(\mathbf{i} + 2\,\mathbf{j}) = 1 + 4 \cdot 2 = 9$$
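To sanity-check this numerically, here is a toy encoding (entirely my own, not any standard library): a tangent vector is a tuple of components, and the coordinate 1-forms $dx$, $dy$ are the functions that read off those components.

```python
def dx(v):
    # dx reads off the first (x-) component of the vector
    return v[0]

def dy(v):
    # dy reads off the second (y-) component of the vector
    return v[1]

def omega(v):
    # omega = dx + 4*dy, evaluated on a vector v
    return dx(v) + 4 * dy(v)

v = (1, 2)  # the vector i + 2j
print(omega(v))  # 1 + 4*2 = 9
```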
(2) Differential forms in fractions: In calculus we often run into terms of the form $\left({dy \over dx}\right)$. Given the same vector $v$, can this be written rigorously as: $$ {dy(v) \over dx(v)} = {dy(\mathbf{i} + 2 \cdot \mathbf{j}) \over dx(\mathbf{i} + 2 \cdot \mathbf{j})} = {2 \over 1} = 2$$
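With the same toy encoding as above (my own, purely illustrative), this ratio of evaluations is just ordinary division of two numbers:

```python
def dx(v):
    # coordinate 1-form dx: picks off the x-component
    return v[0]

def dy(v):
    # coordinate 1-form dy: picks off the y-component
    return v[1]

v = (1, 2)  # the vector i + 2j
print(dy(v) / dx(v))  # 2 / 1 = 2.0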
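With the same toy encoding as above (my own, purely illustrative), this ratio of evaluations is just ordinary division of two numbers:

```python
def dx(v):
    # coordinate 1-form dx: picks off the x-component
    return v[0]

def dy(v):
    # coordinate 1-form dy: picks off the y-component
    return v[1]

v = (1, 2)  # the vector i + 2j
print(dy(v) / dx(v))  # 2 / 1 = 2.0
```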
(3) Ordinary and partial derivatives: In one dimension, we commonly write $dA = \frac{\partial A}{\partial x}\, dx$, where $\frac{\partial A}{\partial x}$ denotes the partial derivative of $A$ with respect to the variable $x$. Let $A = x^2$. In beginner calculus, we learn that $\frac{dA}{dx} = 2x$. If we take my assumptions from (1) and (2) to be gospel and treat $dx(v)$ as a scalar, then we can put: $$ \frac{dA}{dx} = \frac{(dA)(v)}{dx(v)} = \frac{\frac{\partial A}{\partial x} \cdot dx(v)}{dx(v)} = \frac{\partial A}{\partial x} = \frac{\partial (x^2)}{\partial x} = 2x$$ I am confused, however. Some answers on pages like this one and this one seem to suggest that $dx$ and $\partial x$ are cosmetic and interchangeable. Yet the $dx$ here seems to behave very differently from $\partial x$, and it almost seems like $\partial x$ acts as some kind of fundamental underlying operation. Are they completely divorced? Is there some underlying structure I should know about?
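The scalar claim $\frac{dA}{dx} = \frac{\partial A}{\partial x} = 2x$ can at least be checked numerically with a symmetric difference quotient (a standard approximation, my own example values):

```python
def A(x):
    # the example function A(x) = x**2
    return x**2

def dA_dx(x, h=1e-6):
    # symmetric difference quotient approximating the derivative of A at x
    return (A(x + h) - A(x - h)) / (2 * h)

x0 = 3.0
print(dA_dx(x0))  # approximately 2*x0 = 6.0
```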
It is hard for me to understand this, especially since I struggle to disentangle my thinking from the way these objects are manipulated in beginner calculus.
I greatly appreciate any help with this.
(1) Yes.
(2) Linear forms, as covectors, should not be divided. But the formula makes sense as an instance of the chain rule, in the sense that $\frac{\dot y(t)}{\dot x(t)}=\frac{dy}{dx}=\phi'(x)$ when $y(t)=\phi(x(t))$.
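This chain-rule reading can be checked numerically. Here is a sketch with my own example choices $x(t) = t^3$ and $\phi(u) = u^2$, so the ratio $\dot y/\dot x$ should equal $\phi'(x) = 2x$:

```python
def x_of_t(t):
    # an assumed parametrization x(t) = t**3 (my choice)
    return t**3

def phi(u):
    # an assumed outer function phi(u) = u**2 (my choice)
    return u**2

def y_of_t(t):
    # y(t) = phi(x(t))
    return phi(x_of_t(t))

def deriv(f, t, h=1e-6):
    # symmetric difference quotient for the t-derivative
    return (f(t + h) - f(t - h)) / (2 * h)

t0 = 1.5
ratio = deriv(y_of_t, t0) / deriv(x_of_t, t0)
print(ratio)           # approximately phi'(x(t0)) = 2 * x_of_t(t0)
print(2 * x_of_t(t0))  # 6.75
```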
(3) What you see in the links is one extreme view. Indeed, in the scalar case the ordinary and the partial differential quotient amount to the same thing. In the multivariate case there is a marked difference between total and partial derivatives. Your example is circular: it relies on the ambiguity of the scalar case.
A partial derivative always denotes the Jacobian of a multivariate function, or part of it. In contrast, a differential quotient may also be used as a symbol for the slope of some composite function, which leads to misunderstandings about which functions are involved and about what is a function versus what is a coordinate variable.
Otherwise you get curiosities like $$\frac{\partial z}{\partial y}\frac{\partial y}{\partial x}\frac{\partial x}{\partial z}=-1$$ on a surface $z=\phi(x,y)$ at points where the surface is not tangent to any coordinate direction. This is resolved easily by inserting the correct functions: take $y=\gamma(x)$, a solution of $z_0=\phi(x,\gamma(x))$ with $y_0=\gamma(x_0)$. The chain rule then gives $$0=\frac{∂\phi}{∂x}(x_0,y_0)+\frac{∂\phi}{∂y}(x_0,y_0)\gamma'(x_0),$$ which can be rewritten as the claim using $\gamma'=\frac{∂y}{∂x}$ and $\frac{∂x}{∂z}=\left(\frac{∂z}{∂x}\right)^{-1}$.
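The triple-product curiosity can be verified numerically. A sketch with my own example surface $\phi(x,y) = x^2 + y^2$, using difference quotients for the partials and implicit differentiation for $\frac{\partial y}{\partial x}$ along a level set:

```python
def phi(x, y):
    # an assumed example surface z = phi(x, y) = x**2 + y**2
    return x**2 + y**2

h = 1e-6
x0, y0 = 1.0, 2.0  # an example point (my choice)

# partial z / partial x and partial z / partial y via symmetric differences
dphi_dx = (phi(x0 + h, y0) - phi(x0 - h, y0)) / (2 * h)
dphi_dy = (phi(x0, y0 + h) - phi(x0, y0 - h)) / (2 * h)

dz_dy = dphi_dy            # partial z / partial y, x held fixed
dy_dx = -dphi_dx / dphi_dy # slope of the level set z = const (implicit differentiation)
dx_dz = 1.0 / dphi_dx      # partial x / partial z, y held fixed

print(dz_dy * dy_dx * dx_dz)  # approximately -1
```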