Are there contravariant and covariant vectors, or are there only contravariant and covariant bases for vectors? Do these bases transform into other contravariant and covariant bases according to certain coordinate transformation laws, while the vector itself, which can be viewed as an arrow, stays the same?
2026-04-11 21:54:43

Contravariant and Covariant Vectors

440 views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

1 Answer
A good analogy is that vectors behave like $n \times 1$ column matrices and covectors behave like $1 \times n$ row matrices. The analogy becomes even stronger when you realize that the transpose operation is exactly the metric transpose, in the specific case where the metric is the dot product. (In particular, if you don't have a metric, then you really, really shouldn't be thinking of "column vectors" and "row vectors" as being the same thing.)
The transformation laws can be seen easily in this picture too; a change of basis matrix $B$ transforms vectors like $v \mapsto Bv$ and covectors like $\omega \mapsto \omega B^{-1}$ (so that their product remains the same: $\omega v \mapsto \omega B^{-1} B v = \omega v$).
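This transformation law is easy to check numerically. A minimal sketch with NumPy, using a hypothetical change-of-basis matrix $B$, vector $v$, and covector $\omega$ (all values made up for illustration):

```python
import numpy as np

B = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # invertible change-of-basis matrix
v = np.array([[1.0],
              [4.0]])        # vector: a 2x1 column matrix
w = np.array([[5.0, -2.0]])  # covector: a 1x2 row matrix

v_new = B @ v                 # vectors transform as v -> B v
w_new = w @ np.linalg.inv(B)  # covectors transform as w -> w B^{-1}

# The pairing w v is invariant under the change of basis.
print((w @ v).item())          # original pairing
print((w_new @ v_new).item())  # same number, up to rounding
```

The point of the sketch is that the inverse on the covector side is precisely what keeps the row-times-column product $\omega v$ unchanged.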
Going back to multivariable calculus, a vector function of one variable is a function whose values are $m \times 1$ matrices. However, the derivative of a scalar-valued function of $n$ variables is a $1 \times n$ matrix. Similarly, a vector-valued function of $n$ variables has a derivative that is an $m \times n$ matrix.
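The shapes line up exactly as described. A small illustrative example, with a made-up map $g\colon \mathbb{R}^2 \to \mathbb{R}^3$ (so $n = 2$, $m = 3$) and its Jacobian written out by hand:

```python
import numpy as np

def g(x, y):
    # A vector-valued function of 2 variables, with values in R^3.
    return np.array([x * y, x + y, x ** 2])

def jacobian_g(x, y):
    # Rows index the output components, columns index the inputs,
    # so the derivative is an m x n = 3 x 2 matrix.
    return np.array([[y,     x  ],
                     [1.0,   1.0],
                     [2 * x, 0.0]])

J = jacobian_g(1.0, 2.0)
print(J.shape)  # (3, 2)
```

A finite-difference check of one column against $(g(x+h, y) - g(x, y))/h$ confirms the matrix is the derivative.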
Note that the dot product isn't involved at all when computing a directional derivative of a scalar function: you can simply multiply the row vector $\mathrm{d}f$ by the column vector $v$ to get
$$ \nabla_v f = (\mathrm{d}f) v$$
The usual formula for the directional derivative in terms of the dot product is an artifact of mistakenly treating $\mathrm{d}f$ as another vector (rather than a covector), and so you have to use the dot product to undo that mistake.
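A quick sketch of the row-times-column formula $\nabla_v f = (\mathrm{d}f)\, v$, using the example $f(x, y) = x^2 y$ (whose derivative, as computed later in this answer, is $2xy\,\mathrm{d}x + x^2\,\mathrm{d}y$) and a made-up direction $v$:

```python
import numpy as np

def df(x, y):
    # The derivative of f(x, y) = x^2 * y as a 1x2 row matrix (a covector).
    return np.array([[2 * x * y, x ** 2]])

v = np.array([[3.0],
              [-1.0]])  # direction, as a 2x1 column matrix

# No dot product needed: just multiply the row matrix by the column matrix.
nabla_v_f = (df(1.0, 2.0) @ v).item()
print(nabla_v_f)  # 4*3 + 1*(-1) = 11
```

Transposing $\mathrm{d}f$ into a "gradient vector" and dotting it with $v$ gives the same number, but only because the dot product undoes the transpose.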
Okay, it's not really a mistake per se: I imagine many would find multivariable calculus even more complicated if they were trying to learn the difference between vector and covector at the same time they're trying to learn calculus, so calculus texts just transpose all the covectors into vectors to avoid the issue.
Of course, that comes at the cost of obscuring the issue when it comes time to actually learn it. I was lucky and figured out the row vector/column vector distinction on my own when I took calculus. Personally, I found it easier to keep everything straight in my mind when I did so.
Incidentally, if you're looking for a graphical depiction of a covector, you need to imagine it in relation to how you think of functions, not points. A vector, in some sense, is something that tells you how to move from one point to another. A covector, however, is something that tells you how a function varies.
I, personally, like to think of functions algebraically rather than geometrically. e.g. the covector field that is the derivative of $x^2 y$ I like to see as the algebraic formula $2xy \mathrm{d}x + x^2 \mathrm{d} y$ rather than as some sort of geometric picture.
I've seen people who like to visualize functions via their level curves (i.e. the curves defined by $f(\mathbf{x}) = a$ for various $a$) extend this visualization to covectors, by imagining a covector as an infinitesimal element of a level curve, in the same way one might imagine a vector as an infinitesimal element of a curve. Of course, in $n$ dimensions we need the $(n-1)$-dimensional analogs (level sets) rather than curves.