Proving vector identities in general

There are a lot of vector identities that have been proved by writing out components in one particular coordinate system, but how can we be sure that such an identity is true in other coordinate systems? Should we prove it in each coordinate system separately? If not, what allows us to believe it is true in all coordinate systems?
In fact, when we're talking about $\mathbb R^n$ or $\mathbb C^n$, we can use "coordinate-free" language. Let me explain what that actually means. First of all, we can define an abstract $\mathbb R$-vector space. We say that $V$ is an $\mathbb R$-vector space if it is:
1) an abelian group, i.e. a set with an "addition" map $+: V \times V \to V$ satisfying some very intuitive conditions. First, it must be associative and commutative: for any $a, b, c \in V$ we have $a + (b + c) = (a + b) + c$, and for any $a, b \in V$ we have $a + b = b + a$ (here and in what follows I'll write $a + b$ instead of $+(a, b)$). Secondly, we want $V$ to have a "neutral element" denoted by $0$; "neutrality" means that $\forall v \in V$ we have $0 + v = v$. And the third condition: every $v \in V$ has an opposite element, i.e. $\forall v \in V \ \exists w \in V$ such that $v + w = 0$. The opposite of $v$ is usually denoted by $-v$. An example of an abelian group is $\mathbb R$ itself (with the usual notions of addition, zero element, and opposite elements);
2) another property of an $\mathbb R$-vector space $V$ is the existence of a "scalar multiplication" map $\circ: \mathbb R \times V \to V$. Instead of $\circ(\lambda, v)$, here and in what follows I'll write just $\lambda v$;
3) and the last property: we want some "harmony" between $\circ$ and $+$. More explicitly, $\forall a, b \in V$ and $\forall \lambda, \mu \in \mathbb R$ the following must be true: $\lambda(a + b) = \lambda a + \lambda b$, $(\lambda + \mu)a = \lambda a + \mu a$, $(\lambda \mu)a = \lambda(\mu a)$, and $1a = a$.
A little remark: elements of an $\mathbb R$-vector space are often called "vectors", not just "elements".
That's all!
An important exercise: you can easily check that taking $\circ$ to be ordinary multiplication (namely, $\circ(\lambda, v) = \lambda \cdot v$) gives $\mathbb R$ the structure of an $\mathbb R$-vector space.
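To make the axioms concrete, here is a quick numerical spot-check in Python (my own illustration, not part of the argument): it tests the group and compatibility axioms for $V = \mathbb R$, with $\circ$ taken to be ordinary multiplication, on a handful of sample values.

```python
# Numerical spot-check of the vector space axioms for V = R with
# ordinary addition and multiplication. This proves nothing (the
# axioms quantify over ALL elements), but it makes each axiom concrete.

samples = [0.0, 1.0, -2.5, 3.75]   # exactly representable floats

for a in samples:
    assert 0.0 + a == a                             # neutral element
    assert a + (-a) == 0.0                          # opposite element
    assert 1.0 * a == a                             # 1a = a
    for b in samples:
        assert a + b == b + a                       # commutativity
        for c in samples:
            assert (a + b) + c == a + (b + c)       # associativity

for lam in samples:
    for mu in samples:
        for a in samples:
            assert (lam + mu) * a == lam * a + mu * a   # (lam+mu)a = lam a + mu a
            assert lam * (mu * a) == (lam * mu) * a     # (lam mu)a = lam(mu a)
            for b in samples:
                assert lam * (a + b) == lam * a + lam * b  # lam(a+b) = lam a + lam b

print("All sampled axiom instances hold.")
```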
Now if $V$ and $W$ are two $\mathbb R$-vector spaces, we can define their "direct sum" $V \oplus W$ as the abelian group of all pairs $(v, w)$ (with $v \in V$, $w \in W$) and addition defined by the formula $(a, b) + (c, d) = (a + c, b + d)$. We can also define $\circ$ for $V \oplus W$: $\lambda(v, w) = (\lambda v, \lambda w)$.
An exercise: you can check that these operations give $V \oplus W$ the structure of an $\mathbb R$-vector space.
Since, as mentioned above, $\mathbb R$ is an $\mathbb R$-vector space, we can define $\mathbb R^n$ as the vector space $\mathbb R \oplus \mathbb R \oplus \ldots \oplus \mathbb R$ ($n$ times).
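As a sketch of how directly this construction translates into code (the class name `DirectSum` is mine, purely illustrative): elements of $V \oplus W$ are pairs, both operations act componentwise, and nesting the construction yields $\mathbb R^n$ as $n$-tuples of reals.

```python
# A minimal sketch of the direct sum V (+) W: elements are pairs,
# addition and scalar multiplication act componentwise. Iterating
# the construction gives R^n.

class DirectSum:
    """An element (v, w) of V (+) W, where v and w already support
    + and scalar * (e.g. floats, or other DirectSum instances)."""

    def __init__(self, v, w):
        self.v, self.w = v, w

    def __add__(self, other):            # (a, b) + (c, d) = (a + c, b + d)
        return DirectSum(self.v + other.v, self.w + other.w)

    def __rmul__(self, lam):             # lam(v, w) = (lam v, lam w)
        return DirectSum(lam * self.v, lam * self.w)

    def __repr__(self):
        return f"({self.v}, {self.w})"

# R^3 built as R (+) (R (+) R):
x = DirectSum(1.0, DirectSum(2.0, 3.0))
y = DirectSum(0.5, DirectSum(-1.0, 4.0))
print(x + y)        # (1.5, (1.0, 7.0))
print(2.0 * x)      # (2.0, (4.0, 6.0))
```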
Using this coordinate-free language, we can now define a "coordinate system" in $\mathbb R^n$ as a choice of basis: a system of vectors in $\mathbb R^n$ such that any vector $v \in \mathbb R^n$ can be written uniquely as their linear combination; the coefficients of this combination are precisely the "coordinates of $v$". Since the coordinates of $v$ in a given basis are unique, we can prove various statements by first choosing a basis and then comparing the coordinates of the vectors involved. If the coordinates agree, the vectors are also the same as elements of $\mathbb R^n$ (since they are linear combinations of the basis vectors with the same coefficients). And since the definition of $\mathbb R^n$ doesn't depend on the choice of basis, in any other basis the coordinates of these vectors will also agree (since the vectors themselves are the same!).
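Here is a small numpy illustration of that last point (the particular bases below are arbitrary choices of mine): the same vector has different coordinates in different bases, but if two vectors have equal coordinates in one basis they are the same vector, and therefore have equal coordinates in every other basis as well.

```python
import numpy as np

# Two different bases of R^3, given as the columns of invertible matrices.
B1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 1.0]])          # the standard basis
B2 = np.array([[1.0, 1.0, 0.0],
               [0.0, 1.0, 1.0],
               [1.0, 0.0, 1.0]])          # some other basis (det = 2, invertible)

def coords(basis, v):
    """Coordinates of v in the given basis: solve basis @ c = v.
    They are unique because the basis matrix is invertible."""
    return np.linalg.solve(basis, v)

v = np.array([2.0, -1.0, 3.0])
w = np.array([2.0, -1.0, 3.0])            # the same vector, written twice

assert np.allclose(coords(B1, v), coords(B1, w))  # equal coordinates in B1...
assert np.allclose(coords(B2, v), coords(B2, w))  # ...so equal in B2 too, since
                                                  # the coordinates in either basis
                                                  # determine the vector uniquely.

print("B1 coordinates:", coords(B1, v))
print("B2 coordinates:", coords(B2, v))   # different numbers, same vector
```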
I don't know how clear this explanation was, but in any case you can find this material in any modern linear algebra or abstract algebra textbook for undergraduates...