The solution of a system of equations involving a contravariant tensor is a covariant tensor


I have an exercise in my linear algebra book as follows. Einstein notation is implied.

A set of quantities $S_{ij}$ is defined in every coordinate system as the solution of the system of equations $T^{ik}S_{ij}=\delta^k_j,$ where $T^{ik}$ is a contravariant tensor of order two and $\det \|T^{ik}\| \neq 0.$ Show that $S_{ij}$ is a covariant tensor of order two.

Notation:$P$ is the change of basis matrix and $Q$ is the inverse matrix. The element $p^i_j$ is the element appearing in row $i$ and column $j$ of $P.$ The term $T^{i'k'}$ is the term $T^{ik}$ in the new coordinates.

I believe it suffices to show that $T^{i'k'}p^i_{i'}p^j_{j'}S_{ij}=\delta_{j'}^{k'}$ (the repeated index $i'$ is also summed). We know that $T^{i'k'}=q_i^{i'} q_k^{k'} T^{ik}.$ Substituting, and renaming the dummy indices in the second factor to avoid a clash with those in the first, gives $q_i^{i'} q_k^{k'} T^{ik}\,p^m_{i'}p^l_{j'}S_{ml},$ which in explicit summation notation is $$\Bigl(\sum_{i=1}^n \sum_{k=1}^n q_i^{i'} q_k^{k'} T^{ik}\Bigr)\Bigl(\sum_{m=1}^n \sum_{l=1}^n p^m_{i'}p^l_{j'}S_{ml}\Bigr)=\sum_{i=1}^n \sum_{k=1}^n \sum_{m=1}^n \sum_{l=1}^n q_i^{i'}p^m_{i'}\, q_k^{k'}p^l_{j'}\, T^{ik}S_{ml}.$$ I don't know where to go from here. One approach I found uses Cramer's rule: letting $T$ denote the determinant of the transpose of the matrix $\|T^{ik}\|$, and $T_i(e_j)$ the determinant of the matrix obtained from that transpose by replacing the $i$th column with $e_j$, we get $S_{ij}=T_i(e_j)/T.$ I don't see how to use this here, though.
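For reference, the transformation laws above can be restated in matrix form (using the stated conventions, with $Q=P^{-1}$); the second identity is the claim to be proved:

```latex
\underbrace{T' = Q\,T\,Q^\top}_{\text{given: } T^{i'k'}=q_i^{i'}q_k^{k'}T^{ik}}
\qquad\qquad
\underbrace{S' = P^\top S\,P}_{\text{to show: } S_{i'j'}=p^i_{i'}p^j_{j'}S_{ij}}
```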


There is 1 answer below.


Too long for a comment.

If $T$ and $S$ are inverse to each other, and the change of basis matrices $P$ and $Q$ are as well, then it is trivial that $$ I=TS=P^\top T\,P\,Q\,S\,Q^\top\,. $$ Now write this in index notation (here the contravariant $T$ transforms with $P$ and the covariant $S$ with $Q$): $$ T^{k'v'}S_{v'j'}={P_t}^{k'}T^{tu}\,\underbrace{{P_u}^{v'}\,{Q_{v'}}^w}_{\textstyle{{\delta_u}^w}}\,S_{wx}\,{Q_{j'}}^x={P_t}^{k'}\,\underbrace{T^{tw}S_{wx}}_{\textstyle{{\delta^t}_x}}\,{Q_{j'}}^x={P_t}^{k'}\,{Q_{j'}}^t={\delta^{k'}}_{j'}\,, $$ which is exactly the primed version of $T^{ki}S_{ij}={\delta^k}_j\,.$ It is a good idea to learn index notation, whose simple rule is:

  • sum over indices that appear twice, and ditch the ugly $\displaystyle\Sigma$ with all its limits (since we always sum over $i,j,k,\dots=1,\dots,n$).

That said, I do not think it is a good idea to think about tensors in terms of matrices and linear algebra: we learn nothing from calling $T$ contravariant and $S$ covariant in this setting. These terms become important only when we look at tensor fields and how their components transform under arbitrary coordinate changes.
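As a concrete sanity check (not part of the original exercise), the claim can be verified numerically with NumPy, using the question's conventions ($T'=Q\,T\,Q^\top$, $S'=P^\top S\,P$, $Q=P^{-1}$); the matrices `T` and `P` below are arbitrary invertible examples:

```python
import numpy as np

# Arbitrary invertible matrices standing in for ||T^{ik}|| and the
# change-of-basis matrix P (both chosen with nonzero determinant).
T = np.array([[2., 1., 0.],
              [0., 3., 1.],
              [1., 0., 2.]])
P = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
Q = np.linalg.inv(P)          # Q = P^{-1}

# T^{ik} S_{ij} = delta^k_j sums over the FIRST index of T,
# so in matrix form the defining relation reads T^T S = I.
S = np.linalg.inv(T.T)

T_new = Q @ T @ Q.T           # contravariant law:  T' = Q T Q^T
S_new = P.T @ S @ P           # covariant law:      S' = P^T S P

# If S transforms covariantly, S' again solves the defining relation
# in the new coordinates: T'^T S' = I.
print(np.allclose(T_new.T @ S_new, np.eye(3)))    # True
```

The cancellation mirrors the answer's computation: $T'^\top S' = Q\,T^\top Q^\top P^\top S\,P = Q\,T^\top S\,P = Q\,P = I$, since $Q^\top P^\top=(PQ)^\top=I$ and $T^\top S=I$.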