Basic question (clarification) on tensor notation of a matrix: Up-Down or Down-Down?


Easy to muddle through while reading along, but a recurrent source of frustration for me (Wikipedia reference):

$A_{ij}$ would seem to suggest that we are writing $A_{ij}\;e^i\otimes e^j$. But this makes no sense: such an object would "eat" two vectors, whereas a matrix should "eat" one vector and one covector.

On the other hand $A^{i}{}_{j}$ would make more sense interpreted as an object $A^{i}{}_{j} \;e_i \otimes e^j$ that would take in one vector and one covector. In the case of a vector in Euclidean coordinates:

$$\begin{align} A^{i}{}_{j}v^j &= u^i\\[2ex] \text{or}\\[2ex] \left( A^{i}{}_{j} \;e_i \otimes e^j \right)\left( v^k e_k \right) &= u^i e_i \end{align}$$
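In coordinates this is just the ordinary matrix–vector product; a quick NumPy check (array names are mine) confirms that contracting the lower (column) index of $A$ with $v$ reproduces $Av$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # components A^i_j: row i, column j
v = rng.standard_normal(3)       # components v^j

# u^i = A^i_j v^j : sum over the lower (column) index of A
u = np.einsum('ij,j->i', A, v)

assert np.allclose(u, A @ v)
```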

Likewise, the matrix product can be expressed as

$$\begin{align} A^{i}{}_{j} B^{j}{}_k &= C^{i}{}_k\\[2ex] \text{or}\\[2ex] \left( A^{i}{}_{j} \;e_i \otimes e^j \right)\left( B^{l}{}_k \;e_l \otimes e^k \right) &= C^{i}{}_k \;e_i \otimes e^k, \end{align}$$

using a fresh dummy index $l$ for $B$, with the contraction supplied by $e^j(e_l)=\delta^j_l$.
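The same bookkeeping for the matrix product: contracting the column index of $A$ with the row index of $B$ is exactly `A @ B` (a small NumPy sketch, names mine):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# C^i_k = A^i_j B^j_k : sum over the shared index j
C = np.einsum('ij,jk->ik', A, B)

assert np.allclose(C, A @ B)
```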

So are $A_{ij}$ and $A^{i}{}_{j}$ at all compatible and equivalent, or is $A_{ij}$ misleading when one considers the basis vectors?


Context:

I am confused by this passage on page 9 of Introduction to Tensor Analysis by H.D. Block:


If

$$ {\bf{\bar e_j}} = a^i{}_j {\bf{ e_i}}$$

gives the $j$-th basis vector of the new basis, and the transformation matrix is

$$A=\begin{bmatrix} a^1{}_1 & a^1{}_2 & \cdots & a^1{}_n\\ a^2{}_1 & a^2{}_2 & \cdots & a^2{}_n\\ \vdots & \vdots & \ddots & \vdots\\ a^n{}_1 & a^n{}_2 & \cdots & a^n{}_n\\ \end{bmatrix}$$

then I can't see where the transpose comes from in

$$ {\bf{\bar e_*}} = A^\top {\bf{ e_*}}$$
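For reference, the book's formula can be checked numerically. Since $\bar e_j = a^i{}_j e_i$ sums over the row index of $A$, stacking the old basis vectors as the rows of a matrix $E$ makes the new basis vectors the rows of $A^\top E$ (a sketch; the array names are my own):

```python
import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal((3, 3))   # a[i, j] = a^i_j  (row i, column j)
E = rng.standard_normal((3, 3))   # row i of E is the old basis vector e_i

# ebar_j = a^i_j e_i : sum over the ROW index i of a
E_bar = np.einsum('ij,ik->jk', a, E)

# ...which is exactly what A^T does when the basis vectors are rows
assert np.allclose(E_bar, a.T @ E)
```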


There are 3 answers below.


${A^i}_j$ is what you must write if you are using Einstein notation. $A_{ij}$ is only what normal mortals use.


You have four maps, all equivalent under index-raising/lowering:

  • $A \doteq A^{\rm operator}: V \to V$. This is described by $Ae_j = A^i_je_i$
  • $A^{(1,1)}\colon V^* \times V \to \Bbb R$, given by $A^{(1,1)}(f,v) = f(Av)$. This is described by $A^{(1,1)} = A^i_j e_i \otimes e^j$.
  • $A^{(0,2)}\colon V \times V \to \Bbb R$, given by $A^{(0,2)}(v,w) = \langle Av,w\rangle$. This is $A^{(0,2)} = A_{ij}e^i\otimes e^j$.
  • $A^{(2,0)} \colon V^*\times V^* \to \Bbb R$, given by $A^{(2,0)}(f,g) = A^{(0,2)}(f^\sharp, g^\sharp)$, where $\sharp\colon V^* \to V$ is induced by the metric. This is $A^{(2,0)} = A^{ij} e_i\otimes e_j$.
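These four versions can be checked numerically with a concrete symmetric positive-definite metric. In the sketch below (the array names and the choice of metric are mine), the lowering follows the answer's definition $A^{(0,2)}(v,w)=\langle Av,w\rangle$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n))      # A^i_j : the operator
M = rng.standard_normal((n, n))
g = M @ M.T + n * np.eye(n)          # a symmetric positive-definite metric g_ij
g_inv = np.linalg.inv(g)             # g^ij

# A^{(0,2)}_{ij} v^i w^j = <Av, w> = g_{mj} A^m_i v^i w^j
A02 = np.einsum('mi,mj->ij', A, g)   # lower with the metric

v, w = rng.standard_normal(n), rng.standard_normal(n)
assert np.isclose(v @ A02 @ w, (A @ v) @ g @ w)

# A^{(2,0)} raises both slots; sharp acts as g^{-1} on components
A20 = g_inv @ A02 @ g_inv
f, h = rng.standard_normal(n), rng.standard_normal(n)
assert np.isclose(f @ A20 @ h, (g_inv @ f) @ A02 @ (g_inv @ h))
```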

Acknowledging Misha Lavrov's comment-style answer, I would be remiss not to share the take on this in Introduction to Tensor Analysis and the Calculus of Moving Surfaces by Pavel Grinfeld:

A matrix $\bf A$ could certainly be written as $A^{i}{}_{j}\;e_i\otimes e^j$, with components $A^{i}{}_{j}$.

In this case a matrix transformation of a vector $\bf u,$ as in $\bf v= Au,$ could be (pedantically) written as

$$v^i = A^{i}{}_{j} u^j$$

where the offset between the upstairs index $i$ and the downstairs index $j$ in $A^{i}{}_{j}$ indicates that the contraction with the components $u^j$ of $\bf u$ is along the second index of $\bf A,$ i.e. the columns, as expected when multiplying a matrix by a vector. This convention for the order of the indices is explained there as follows:

The values of a variant in a given coordinate system form an indexed set of numbers. How one chooses to organize these numbers into a table is a matter of convention. For first- and second-order systems, we adopt a convention that is consistent with that of linear algebra. A variant of order one, $T^i$ or $T_i$, is represented by a column. The flavor of the index (covariant or contravariant) has no bearing on the representation. A variant of order two, $T_{ij},$ $T_j{}^{i}$ , or $T ^{ij}$, is represented by a matrix in which the first index indicates the row and the second index indicates the column. Thus, it is important to clearly identify the order of the indices in a variant. For a variant of order higher than two, there is no single commonly accepted convention. For variants of order three, one may imagine a three-dimensional cube of numbers where the first index indicates the slice and the other two indices indicate the row and column within the slice.

Similarly,

$$\begin{align} {\bf{AB}}&={\bf{C}}\\[2ex] \sum_k A_{ik}B_{kj}&=C_{ij}\\[2ex] A^i{}_kB^k{}_j&=C^i{}_{j} &&\text{(tensor notation)}\\[2ex] B^k{}_jA^i{}_k&=C^i{}_{j} &&\text{(scalar components commute)} \end{align}$$

indicating the contraction of the second index of $\bf A$ (along its columns) with the first index of $\bf B$ (along its rows). The order of the factors does not change the result, since we are referring to scalar entries, not the matrices themselves.
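Because the factors in an index expression are scalars, only the index pattern matters, not the written order. A one-line check (array names mine):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# A^i_k B^k_j and B^k_j A^i_k are the same sums of scalar products
C1 = np.einsum('ik,kj->ij', A, B)
C2 = np.einsum('kj,ik->ij', B, A)

assert np.allclose(C1, C2)
assert np.allclose(C1, A @ B)
```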

On the other hand,

$$A_k{}^i B^k{}_j=C_j{}^i$$

would be the equivalent of $\bf A^\top B=C^\top$.
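Reading the indices the same way (first index = row, second index = column), this last identity can also be checked numerically (a sketch, names mine):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((3, 3))   # A_k^i : entry A[k, i] (row k, column i)
B = rng.standard_normal((3, 3))   # B^k_j : entry B[k, j]

# C_j^i = A_k^i B^k_j : contract the row index of A with the row index of B
C = np.einsum('ki,kj->ji', A, B)

# equivalently, A^T B = C^T
assert np.allclose(A.T @ B, C.T)
```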