Mathematically rigorous way to describe two vectors that have the same components but are not equal.


In linear algebra, we're taught that vectors can be represented as the addition of a scalar multiplication of the basis vectors. For instance, if $\hat \imath, \hat \jmath$ are the unit vectors, $$\vec v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = v_1\hat \imath + v_2\hat \jmath$$ Is it possible for some vector $\vec v_*$ to also equal $\begin{bmatrix} v_1 \\ v_2 \end{bmatrix}$, yet for their expressions in terms of their bases to make $\vec v \neq \vec v_*$, for example if the latter has a different basis such that $\vec v_* = v_1\hat r + v_2\hat p$? What exactly is the proper description of this situation? I don't think it is enough to say that they have different bases, since perhaps we could build one basis from the other. Are they failing to share the same vector space, or something of that sort? Do we have to declare some property of $\vec v_*$ from the beginning?

I'm thinking this situation can arise when the vectors are expressed in different coordinate systems, but how can I describe this precisely?


BEST ANSWER

vectors can be represented as the addition of a scalar multiplication of the basis vectors.

I think what you mean is that a vector can be expressed as a linear combination of basis vectors (if a basis for our vector space has been given).

Let $\beta_1 = (\hat i, \hat j)$ be an ordered basis of $\mathbb R^2$, and let $\beta_2 = (\hat r, \hat p)$ be some other ordered basis of $\mathbb R^2$. If a vector $v$ can be expressed as $$ v = v_1 \hat i + v_2 \hat j $$ then the coordinate vector of $v$ with respect to the basis $\beta_1$ is $\begin{bmatrix} v_1 \\ v_2 \end{bmatrix}$. Another way to write this is: $$ [ v ]_{\beta_1} = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix}. $$ And if $v^* = v_1 \hat r + v_2 \hat p$, then the coordinate vector of $v^*$ with respect to $\beta_2$ is $\begin{bmatrix} v_1 \\ v_2 \end{bmatrix}$.

Of course, this does not mean that $v = v^*$. All it means is that the coordinate vector of $v$ with respect to the basis $\beta_1$ happens to be the same as the coordinate vector of $v^*$ with respect to $\beta_2$.
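A small numeric sketch may make this concrete. Assume, purely for illustration, a standard basis $\beta_1$ and an arbitrarily chosen second basis $\beta_2$ of $\mathbb R^2$; the same coordinate vector then names two different vectors:

```python
import numpy as np

# Hypothetical example: the same coordinate vector [2, 3] read
# against two different bases of R^2. Each basis is stored as a
# matrix whose columns are the basis vectors in standard coordinates.

# beta1 = (i-hat, j-hat), the standard basis.
B1 = np.array([[1.0, 0.0],
               [0.0, 1.0]])

# beta2 = (r-hat, p-hat), some other basis, chosen arbitrarily here.
B2 = np.array([[1.0, 1.0],
               [0.0, 1.0]])

coords = np.array([2.0, 3.0])   # [v]_beta1 and [v*]_beta2 coincide

v      = B1 @ coords            # v  = 2*i-hat + 3*j-hat
v_star = B2 @ coords            # v* = 2*r-hat + 3*p-hat

print(v)        # [2. 3.]
print(v_star)   # [5. 3.]  -- same coordinates, different vector
```

Multiplying the basis matrix by the coordinate vector is exactly "take the linear combination of the basis vectors with these coefficients," so the two printouts differing shows $v \neq v^*$ even though $[v]_{\beta_1} = [v^*]_{\beta_2}$.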

ANSWER

Let $V$ be a vector space and let $v \in V$.

Suppose there are two ordered bases $U=(u_1,u_2,\dots,u_n)$ and $W=(w_1,w_2,\dots,w_n)$, and let the coordinate representations of $v$ with respect to $U$ and $W$ be $$[v]_U=\begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}$$ and $$[v]_W=\begin{bmatrix} c'_1 \\ c'_2 \\ \vdots \\ c'_n \end{bmatrix}.$$

This means that $v=c_1u_1+c_2u_2+\dots+c_nu_n$ and $v=c'_1w_1+c'_2w_2+\dots+c'_nw_n$.

You are asking about the relation between $[v]_U$ and $[v]_W$.

The relation is $$[v]_W=A_{U\rightarrow W}[v]_U$$ where $A_{U\rightarrow W}$ is called the transition (change-of-basis) matrix, and $(A_{U\rightarrow W})^{-1}=A_{W\rightarrow U}$.
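As a sketch of how this works numerically: if both bases are written as matrices $U$ and $W$ whose columns are the basis vectors in standard coordinates (the particular matrices below are made up for illustration), then $v = U[v]_U = W[v]_W$ gives $[v]_W = W^{-1}U\,[v]_U$, i.e. $A_{U\rightarrow W} = W^{-1}U$:

```python
import numpy as np

# Hypothetical bases, stored as matrices whose columns are the basis
# vectors expressed in standard coordinates.
U = np.array([[1.0, 1.0],
              [0.0, 1.0]])
W = np.array([[2.0, 0.0],
              [0.0, 1.0]])

# v = U [v]_U = W [v]_W  =>  [v]_W = W^{-1} U [v]_U,
# so the transition matrix is A_{U->W} = W^{-1} U.
A_U_to_W = np.linalg.solve(W, U)

v_U = np.array([3.0, 4.0])    # [v]_U, chosen arbitrarily
v_W = A_U_to_W @ v_U          # [v]_W

# Both coordinate vectors describe the same underlying vector:
print(U @ v_U)   # v in standard coordinates
print(W @ v_W)   # the same vector

# The inverse transition matrix maps W-coordinates back to U-coordinates:
assert np.allclose(np.linalg.inv(A_U_to_W) @ v_W, v_U)
```

Using `np.linalg.solve(W, U)` computes $W^{-1}U$ without forming the inverse explicitly, which is the standard numerically stable choice.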

PS: I can discuss the method of finding the transition matrix if you want it.