Can you please help on the mixed tensor definition in my textbook?


In my textbook it says:

A mixed tensor $r$ times covariant and $s$ times contravariant is a multilinear functional $W$

$$W(\alpha_1 , \dots , \alpha_r , \vec v_1 , \dots , \vec v_s)=(\alpha_1)_{i_1} \cdots (\alpha_r)_{i_r}\,{W_{j_1 \dots j_s}^{i_1 \dots i_r}}\, \vec v_1^{\,j_1}\cdots \vec v_s^{\,j_s}$$

I was hoping someone could make this equation clear to me. To start with, I'm not even sure why the inputs are what they are; I thought it would have been

$W(\vec v_1,...,\vec v_r, \alpha_1,...,\alpha_s)$

And I have no clue what the right-hand side means. I have tried digesting it, but I just can't make sense of it.

Would appreciate the help, thanks.


Accepted answer:

According to your textbook, a mixed tensor of type $(r,s)$ takes $r$ one-forms and $s$ vectors and gives you a scalar, i.e. a real number.

The right-hand side is the expression of $W$ acting on the $\alpha$'s and $\vec{v}$'s in coordinates (several sums are omitted). Each $(\alpha_a)_i$ is the $i$-th component of the one-form $\alpha_a$. Similarly, $\vec{v}_b^j$ is the $j$-th component of the vector $\vec{v}_b$. The $W_{j_1\dots j_s}^{i_1\dots i_r}$ are the components of $W$.
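As a numerical sketch of the "omitted sums" (the array names and the dimension $n=3$ are made-up assumptions for illustration), `numpy.einsum` computes exactly this coordinate expression, here for a tensor with two one-form slots and one vector slot:

```python
import numpy as np

# Illustrative sketch (names and the dimension n = 3 are assumptions):
# a mixed tensor with two one-form slots and one vector slot.
n = 3
rng = np.random.default_rng(0)
W = rng.standard_normal((n, n, n))   # components W^{i1 i2}_{j1}
alpha1 = rng.standard_normal(n)      # components (alpha_1)_{i1}
alpha2 = rng.standard_normal(n)      # components (alpha_2)_{i2}
v1 = rng.standard_normal(n)          # components (v_1)^{j1}

# The omitted sums: contract each index of W with the matching argument.
scalar = np.einsum('abj,a,b,j->', W, alpha1, alpha2, v1)

# Multilinearity: scaling one argument scales the result.
assert np.isclose(np.einsum('abj,a,b,j->', W, 2 * alpha1, alpha2, v1),
                  2 * scalar)
```

The output is a single number, as the definition requires: all the indices are summed away.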

You can think of $W$ as a generalization of a matrix. A matrix $A$ takes a 1-form $\alpha = (\alpha_1\ldots \alpha_n)$ and a vector $\vec{v}=(v^1\dots v^n)^T$ and gives you a scalar

$$A(\alpha,v) = \alpha A \vec{v} = \sum_{i,j} A_i^j \alpha_j v^i. $$

Or using Einstein's convention

$$A(\alpha,v) = A_i^j \alpha_j v^i $$

(repeated indices, one up and one down, are summed over). In this case $A_i^j$ are the components of the matrix

$$A=\begin{pmatrix} a_1^1 & \dots & a_1^n \\ \vdots & \ddots & \vdots \\ a_n^1 & \dots & a_n^n \end{pmatrix} $$

and $A(\alpha,v)$ is just the matrix multiplication

$$ \begin{pmatrix} \alpha_1 & \cdots &\alpha_n \end{pmatrix} \begin{pmatrix} a_1^1 & \dots & a_1^n \\ \vdots & \ddots & \vdots \\ a_n^1 & \dots & a_n^n \end{pmatrix} \begin{pmatrix} v^1 \\ \vdots \\ v^n \end{pmatrix} . $$

For higher ranks it is the same idea, but you can't write $W$ as a matrix, because its components form an array with $r+s$ indices rather than two.
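The matrix case above can be checked numerically. A minimal sketch (array names and $n=3$ are assumptions) showing that the Einstein-convention sum $A_i^j \alpha_j v^i$ agrees with the matrix product $\alpha A \vec{v}$:

```python
import numpy as np

# Minimal sketch of the matrix case (names and n = 3 are assumptions):
# an ordinary matrix acting on one one-form and one vector.
n = 3
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))  # components A_i^j, stored as A[j, i]
alpha = rng.standard_normal(n)   # row one-form (alpha_1 ... alpha_n)
v = rng.standard_normal(n)       # column vector (v^1 ... v^n)^T

by_sum = np.einsum('j,ji,i->', alpha, A, v)  # sum of A_i^j alpha_j v^i
by_matmul = alpha @ A @ v                    # the matrix product alpha A v
assert np.isclose(by_sum, by_matmul)
```

So a matrix is just the special case with one upper and one lower index; for more indices the same contraction works, but no single two-dimensional layout can hold the components.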