What are the tensor rules for canceling indices within the same tensor variable?
let's say I have a tensor:
$\Huge V^{rs}_{trs}$
textbook says this tensor is equal to:
$\Huge V^{rs}_{trs} = V_{t}$
Which I could believe, because it looks like the same letters cancel each other out, but I'm not really sure what's happening here. What's the general rule for canceling out indices on the same variable? Can indices with the same letter be in any position among the upper and lower indices, so that any matching upper/lower pair cancels, or do they need to be in specific slots for the cancellation to work?
I'm just going to hit this question with a brute force hammer:
true or false? $\huge V^{ij}_{i} = V^j$
true or false? $\huge V^{ji}_{i} = V^j$
true or false? $\huge V_{ji}^{i} = V_j$
true or false? $\huge V_{ij}^{i} = V_j$
true or false? $\huge V_{i}^{i} = V$
true or false? $\huge V_{ii}^j = V^j$
true or false? $\huge V^{ii}_j = V_j$
All you need to do is copy and paste this section and answer true or false for each line. Then I will understand the rule. Thanks.
The answer is true on all counts. But there's a caveat... I'll illustrate one example with no steps deleted to show what's happening:
Let's assume this tensor is over $\mathbb{R}^3$:
$\Huge V^{ii}_j = V^{11}_j + V^{22}_j + V^{33}_j = \tilde{V}_j$
Here I'm using the tilde to indicate that $\huge \tilde{V}$ is a different tensor: the sum of three terms. From this point on, though, we drop the tilde and instead implicitly understand that a tensor whose upper and lower ranks differ from the original's is a different variable, even though the base letter is the same. Thus:
$\Huge V^{ii}_j = V_j$
It would be just as valid to write the contraction this way:
$\Huge V^{ii}_j = W_j$
Both are equivalent, but reusing the same letter saves letters and keeps track of where a tensor variable came from; it's a type of "overloading" of variable names.
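As a sanity check, here's a small Python sketch of the $V^{ii}_j$ contraction over $\mathbb{R}^3$ (the component values are arbitrary sample data, not anything from the text):

```python
# Contracting the two upper indices of a (2,1) tensor over R^3.
# V[i][k][j] holds the component V^{ik}_j; the values are arbitrary sample data.
n = 3
V = [[[10 * i + 5 * k + j for j in range(n)] for k in range(n)] for i in range(n)]

# V^{ii}_j = sum over i of V[i][i][j] -- the result has a single lower index.
V_contracted = [sum(V[i][i][j] for i in range(n)) for j in range(n)]

# Check against writing the three terms out explicitly, as in the text:
for j in range(n):
    assert V_contracted[j] == V[0][0][j] + V[1][1][j] + V[2][2][j]
```

The contracted object is just a list indexed by $j$, i.e. a tensor of one lower index, matching $V^{ii}_j = V_j$.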
Here are the rules for contraction.
Contraction of (2,2) tensor
Suppose $\huge A^{i_1~ i_2}_{j_1~ j_2}$ is a tensor.
Assuming $\mathbb{R}^3$, the contraction of $A$ with respect to $i_2$ and $j_1$ is given by setting $i_2 = j_1 = u$ and evaluating:
$\huge A^{i_1~ u}_{u~ j_2} = A^{i_1~ 1}_{1~ j_2} + A^{i_1~ 2}_{2~ j_2} + A^{i_1~ 3}_{3~ j_2} =\tilde{A}^{i_1}_{j_2}$
In going from $A$ to $\tilde{A}$, the contravariant rank is reduced by 1 and the covariant rank is reduced by 1. Thus:
contracting a (2,2) tensor results in a (1,1) tensor.
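The (2,2) case above can be sketched numerically as well (again with arbitrary sample components over $\mathbb{R}^3$):

```python
# Contracting a (2,2) tensor A over R^3 with respect to i2 and j1.
# A[i1][i2][j1][j2] holds the component A^{i1 i2}_{j1 j2}; sample values are arbitrary.
n = 3
A = [[[[i1 + 2 * i2 + 3 * j1 + 4 * j2 for j2 in range(n)]
       for j1 in range(n)] for i2 in range(n)] for i1 in range(n)]

# Set i2 = j1 = u and sum over u: the result is a (1,1) tensor in i1 and j2.
A_tilde = [[sum(A[i1][u][u][j2] for u in range(n)) for j2 in range(n)]
           for i1 in range(n)]

# A_tilde[i1][j2] is exactly A[i1][0][0][j2] + A[i1][1][1][j2] + A[i1][2][2][j2].
```

Each entry of `A_tilde` is the three-term sum from the equation above, so the (2,2) tensor has indeed become a (1,1) tensor.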
Contraction of (p, q) tensor
Suppose S is a (p, q) tensor such that $\huge S \equiv \bigg(S^{~i_1~ i_2~ \dots ~i_{\Large p}}_{~j_1~ j_2~ \dots~ j_{\Large q}}\bigg)$. The contraction of S with respect to a contravariant index $i_f$ and a covariant index $j_g$ is given by setting $\huge i_f = j_g = u$ and evaluating:
$\huge S^{~i_1~\dots~u~\dots~i_{\Large p}}_{~j_1~\dots~u~\dots~j_{\Large q}} = \tilde{S}^{~i_1~\dots~\widehat{i_f}~\dots~i_{\Large p}}_{~j_1~\dots~\widehat{j_g}~\dots~j_{\Large q}}$
where $u$ occupies the $i_f$ and $j_g$ slots on the left, and the hats on the right indicate that those two indices are omitted from $\tilde{S}$.
$\tilde{S}$ is now a (p-1, q-1) tensor.
Contracting a (p, q) tensor results in a (p-1, q-1) tensor.