A seemingly counterintuitive result on active and passive transformations of vectors


Let $\mathbf{v}$ be an element of a vector space whose underlying set is Euclidean $\mathbb{R}^3$, equipped with the standard Cartesian basis $\{\mathbf{e}^{(1)}, \mathbf{e}^{(2)}, \mathbf{e}^{(3)}\}$.

Let $\mathbf{v^* = R v}$ be the vector obtained by an (active) transformation of $\mathbf{v}$ by the linear map $\mathbf{R}$. Assume that $\mathbf{R}$ is invertible (though this is not necessary).

In the standard basis, with summation over repeated indices implied (there is no need to distinguish covariant and contravariant components in a Cartesian basis), $$ v^*_i = R_{ij}v_j $$

Now consider the action of another linear map $\mathbf{S}$ on the vectors of the standard basis, i.e., let $$ \mathbf{e'^{(i)} = S e^{(i)}} $$ This (passive) transformation yields a basis in which the components of $\mathbf{v}$ are given by $$ v'_i = \beta_{ij} v_j $$ where $$ \beta_{ij} = \mathbf{e'^{(i)} \cdot e^{(j)}} $$
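To make the passive construction concrete, here is a minimal sketch in plain Python (the map $S$, the vector $v$, and the helper names are arbitrary illustrative choices, not anything from the question) that builds $\beta_{ij} = \mathbf{e'^{(i)} \cdot e^{(j)}}$ and the components $v'_i$:

```python
# Passive transformation sketch: transform the basis by a sample linear
# map S, form beta_ij = e'_i . e_j, then compute v'_i = beta_ij v_j.

def matvec(M, x):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # standard basis
S = [[2.0, 1.0, 0.0], [0.0, 3.0, 0.0], [0.0, 0.0, 1.0]]  # sample (non-orthogonal) map
v = [1.0, 2.0, 3.0]                                      # sample vector

# Transformed basis vectors e'_i = S e_i
e_prime = [matvec(S, e[i]) for i in range(3)]

# beta_ij = e'_i . e_j ; since e_j is the standard basis, this is (e'_i)_j
beta = [[dot(e_prime[i], e[j]) for j in range(3)] for i in range(3)]

v_prime = matvec(beta, v)  # components of v after the passive transformation
```

Note that since $\mathbf{e}^{(j)}$ is the standard basis, $\beta_{ij}$ is just the $j$-th component of $\mathbf{e'}^{(i)}$, i.e. $\beta_{ij} = S_{ji}$: the transpose of $S$ already appears at this step.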

Now, when I tried to find the conditions on the linear maps $\mathbf{R}$ and $\mathbf{S}$ under which the components $v^*_i$ and $v'_i$ are numerically the same, I arrived at the following counterintuitive sufficient condition:

$$ \mathbf{S = R^T} $$

This is really perplexing, since I expected $\mathbf{S = R^{-1}}$. For example, for simple rotations, taking the inverse makes perfect sense - a rotation of the vector and rotation of the basis in the opposite direction give $v^*_i = v'_i$. Of course, in this case $R^{-1} = R^{T}$. But thinking of scaling and other transformations, it is hard to imagine why one would use $R^T$ on the basis to "undo" the effects of $R$ on the vector.

Why is this so?

Thank you.

(I am reasonably sure I didn't commit a blunder in arriving at $\mathbf{S = R^T}$ and also checked a few cases numerically. My apologies if this is trivial.)
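For what it's worth, the condition can be checked numerically along these lines (a plain-Python sketch; the particular $R$ and $v$ below are arbitrary sample values):

```python
# Check the sufficient condition S = R^T: actively transform v by R,
# passively transform the basis by S = R^T, and compare components.

def matvec(M, x):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

def transpose(M):
    return [[M[j][i] for j in range(3)] for i in range(3)]

R = [[1.0, 2.0, 0.0], [0.0, 1.0, 4.0], [5.0, 0.0, 1.0]]  # sample non-orthogonal map
v = [1.0, -2.0, 3.0]

v_star = matvec(R, v)                        # active: v*_i = R_ij v_j

S = transpose(R)                             # candidate passive map S = R^T
e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
e_prime = [matvec(S, ei) for ei in e]        # e'_i = S e_i
beta = [[e_prime[i][j] for j in range(3)] for i in range(3)]  # beta_ij = e'_i . e_j
v_prime = matvec(beta, v)                    # passive: v'_i = beta_ij v_j

print(v_star == v_prime)  # True
```

Since $\beta_{ij} = S_{ji}$, choosing $S = R^T$ makes $\beta = R$, so the two sets of components coincide for every $v$.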

BEST ANSWER

We have (where $(\cdot, \cdot)$ denotes the inner product and $e_i$ the standard basis), using $Sv = S\sum_j v_j e_j = \sum_j v_j e'_j$, \begin{align*} Sv &= \sum_i (Sv, e_i)e_i\\ &= \sum_i \left(\sum_j v_je'_j, e_i\right)e_i\\ &= \sum_i \left(\sum_j v_j(e'_j, e_i)\right)e_i\\ &= \sum_i \left(\sum_j v_j\beta_{ji}\right)e_i \end{align*} So $(Sv)_i = \beta_{ji}v_j$: the matrix of $S$ in the standard basis is $\beta^T$, i.e. $\beta = S^T$. Comparing the passive components $v'_i = \beta_{ij}v_j$ with the active components $v^*_i = R_{ij}v_j$, they agree for every $v$ exactly when $R = \beta = S^T$, that is $S = R^T$. The transpose, not the inverse, appears because the passive matrix $\beta$ is built from inner products $(e'_i, e_j)$, which transpose the map applied to the basis.
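The identity $(Sv)_i = \beta_{ji}v_j$ (equivalently $\beta = S^T$) derived above can be sanity-checked numerically; the $S$ and $v$ below are arbitrary sample values:

```python
# Verify (S v)_i = beta_ji v_j, i.e. beta = S^T, for a sample S and v.

def matvec(M, x):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

S = [[1.0, 4.0, 0.0], [2.0, 1.0, 0.0], [0.0, 5.0, 1.0]]  # arbitrary map
v = [1.0, 2.0, 3.0]

# e'_j = S e_j is the j-th column of S, so beta_ji = (e'_j, e_i) = S_ij.
beta = [[S[j][i] for j in range(3)] for i in range(3)]    # beta = S^T

lhs = matvec(S, v)                                        # (S v)_i
rhs = [sum(beta[j][i] * v[j] for j in range(3)) for i in range(3)]  # beta_ji v_j
print(lhs == rhs)  # True
```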