Linear algebra identity evaluation


I couldn't find anything related to this simple identity I came up with, so here it is:

$$\vec{r}=(r_x,r_y)=(r_x, \angle0)+(r_y,\angle\frac{\pi}{2})$$

My thinking was that $r_y$ is essentially the modulus of a vector along the Y axis (i.e. with $\theta=90°=\frac{\pi}{2}$), and $r_x$ is the modulus of a vector along the X axis (i.e. with $\theta=0°=0$).
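For example, with $\vec{r}=(3,4)$ (numbers I made up just as a sanity check), the identity would read $$(3,4)=(3,\angle 0)+(4,\angle\tfrac{\pi}{2}),$$ which seems to check out: $(3,\angle 0)$ is the length-$3$ vector along X, i.e. $(3,0)$, and $(4,\angle\frac{\pi}{2})$ is the length-$4$ vector along Y, i.e. $(0,4)$, and indeed $(3,0)+(0,4)=(3,4)$.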

Is this true? I'm sorry if this is obvious, a duplicate, or easy to look up; I'm new to linear algebra and don't know what to expect or how to search for this.


There is 1 best solution below.

BEST ANSWER

Yes. Note first that $(m,\angle\theta)$ denotes the vector of modulus $m$ at angle $\theta$, i.e. $(m\cos\theta,\, m\sin\theta)$, so $(r_x,\angle 0)=(r_x,0)$ and $(r_y,\angle\frac{\pi}{2})=(0,r_y)$. Your statement $$\vec{r}=(r_x,r_y)=(r_x, \angle0)+(r_y,\angle\frac{\pi}{2})$$ is therefore equivalent to the statement that $$ \mathbf{r}=\begin{bmatrix}r_x \\ r_y \end{bmatrix}= r_x\begin{bmatrix}1 \\ 0 \end{bmatrix} + r_y\begin{bmatrix}0 \\ 1 \end{bmatrix} = r_x \mathbf{\hat i} + r_y\mathbf{\hat j}, $$ where $\mathbf{\hat i}$ and $\mathbf{\hat j}$ are the standard basis of $\mathbb R^2$. You will note that any $n$-dimensional vector can be expressed as a linear combination of $n$ independent basis vectors. In fact, each component $v_i$ of a vector can be conceptualized as the factor by which the $i$th basis vector of a vector space must be scaled when producing the vector.
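To sanity-check this equivalence numerically, here is a minimal sketch in Python with numpy (the `polar` helper and the sample values are my own, purely for illustration):

```python
import numpy as np

# A sample vector r = (r_x, r_y); the values are arbitrary.
r = np.array([3.0, 4.0])

# Standard basis of R^2.
i_hat = np.array([1.0, 0.0])
j_hat = np.array([0.0, 1.0])

# "Modulus m at angle theta" as a Cartesian vector: (m*cos(theta), m*sin(theta)).
def polar(m, theta):
    return m * np.array([np.cos(theta), np.sin(theta)])

# (r_x, angle 0) + (r_y, angle pi/2) should reproduce r ...
lhs = polar(r[0], 0.0) + polar(r[1], np.pi / 2)

# ... and so should the basis decomposition r_x*i_hat + r_y*j_hat.
rhs = r[0] * i_hat + r[1] * j_hat

print(np.allclose(lhs, r), np.allclose(rhs, r))  # True True
```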

This property holds in dimensions greater than $2$, and extends to vector spaces besides $\mathbb R^n$; the concept is fundamental to understanding linear algebra. Matrices, for instance, can be understood as linear transformations of space, where the $i$th column of a matrix describes where the $i$th basis vector "lands" after moving through the transformation.
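As a small illustration of that last point (again in numpy, with a matrix I made up): applying a matrix to the $i$th standard basis vector picks out its $i$th column.

```python
import numpy as np

# An arbitrary 2x2 matrix, viewed as a linear transformation of the plane.
A = np.array([[2.0, -1.0],
              [1.0,  3.0]])

e1 = np.array([1.0, 0.0])  # first standard basis vector
e2 = np.array([0.0, 1.0])  # second standard basis vector

# Applying A to a basis vector picks out the corresponding column:
print(A @ e1)            # [2. 1.]  -- the first column of A
print(A @ e2)            # [-1. 3.] -- the second column of A
print(A[:, 0], A[:, 1])  # the same columns, read off directly
```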