If $A$ is a matrix and $B$ is a tensor (for example, $3\times 3$ / rank 2, with the same components) and $v$ is a $3\times 1$ vector,
- Is there any difference between $A.v$ and $B.v$ (in terms of the formula used to compute them), where $.$ is the dot product? Are they both an ordinary "matrix multiplication" resulting in a $3\times 1$ vector?
Thank you for your help.
It depends on how the indices are used.
If $A$ is used to represent a linear transformation, then one could write $$w^k=A_s{}^kv^s,$$ to obtain $n$ quantities (since $1\le k\le n$), the components of $w$, from those of $v$.
Or $B$ may be a bilinear map, in which two vectors $u,v$ are paired via $$B_{st}u^sv^t$$ to produce a number.
The repetition of an index, once up and once down, indicates summation over it (the Einstein summation convention).
However, other index conventions could be employed.
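As a numerical sketch of the two contractions above (using NumPy's `einsum`, which implements the summation convention explicitly; the particular arrays are arbitrary illustrative values):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
v = np.array([1.0, 0.0, -1.0])
u = np.array([2.0, 1.0, 0.0])

# Linear map: w^k = A^k_s v^s. Summing over the repeated
# index s is exactly an ordinary matrix-vector product.
w = np.einsum('ks,s->k', A, v)
assert np.allclose(w, A @ v)

# Bilinear map: B_{st} u^s v^t. Contracting both indices
# against two vectors yields a single number.
B = A  # the same components, now viewed as a bilinear form
scalar = np.einsum('st,s,t->', B, u, v)
assert np.isclose(scalar, u @ B @ v)
```

So with the same $3\times 3$ array of components, the "matrix acting on a vector" and the "tensor contracted with a vector" computations coincide; the distinction lies in which index slots are contracted and what the object is taken to represent.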