Why does tensor rotation require multiplying by the rotation matrix twice, once from the left and once from the right by its inverse?
If $T$ is the tensor I wish to rotate and $R$ is the rotation matrix, why isn't $T'=RT$, but instead $T'=RTR^{-1}$?
I have seen and understood the construction of the tensor transformation law, but I am still intuitively uncomfortable with it.
Thank you.
Consider the following example.
Say we have a vector $v$ that we would like to first transform in some way and then rotate. (Let's say the transformation we want to apply to $v$ is a simple scaling, which is easiest to describe with respect to the unrotated $v$, i.e. before we rotate it.) This gives us two options for the order of operations:

1. Transform first, then rotate: $RTv$
2. Rotate first, then transform: $TRv$
In general these two orderings yield different results, since $R$ and $T$ need not commute. To avoid this, if we go down the second pathway we must adjust our transformation $T$ by "rotating" it: to recover the result of the first pathway, $RTv = b$, we need an adjusted transformation $T'$ such that $T'Rv = b$. Since this must hold for every $v$, we get the simple relation
$$RT = T'R.$$
Multiplying on the right by $R^{-1}$ yields the relationship between the transformed tensor $T'$ and the original $T$:
$$T' = RTR^{-1}.$$
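As a quick sanity check, here is a minimal numerical sketch of the argument above (the specific choices of a 90° rotation and a scaling along $x$ are my own illustrative assumptions, not from the question): applying $T$ then $R$ to $v$ gives the same result as applying the rotated tensor $T' = RTR^{-1}$ to the already-rotated vector $Rv$.

```python
import numpy as np

# R: rotation by 90 degrees in the plane (illustrative choice)
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# T: a simple scaling by 2 along the x-axis (illustrative choice)
T = np.array([[2.0, 0.0],
              [0.0, 1.0]])

v = np.array([1.0, 1.0])

# Pathway 1: transform first, then rotate
b = R @ T @ v

# The two raw orderings disagree, since R and T do not commute here
assert not np.allclose(R @ T @ v, T @ R @ v)

# Rotated tensor T' = R T R^{-1}
T_prime = R @ T @ np.linalg.inv(R)

# Pathway 2 with the adjusted tensor: T' R v reproduces b
assert np.allclose(T_prime @ R @ v, b)
```

Note that $T'$ comes out as a scaling by 2 along the $y$-axis, which is exactly what you would expect: the scaling axis has been rotated along with everything else.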