Dot product in tensor algebra


I am new to tensor algebra, and to get a grip on how to solve practical tensor problems I am going through this book: https://www.amazon.it/Solutions-Exercises-Principles-Tensor-Calculus/dp/1728857260. In exercise 62 of chapter 3, the author defines the tensors $\mathbf{A}=A^i\mathbf{E}_i$, $\mathbf{B}=B_i\mathbf{E}^i$ and $\mathbf{C}={C^m}_n\mathbf{E}_m \mathbf{E}^n$, and asks the reader to calculate the dot products $\mathbf{B}\cdot\mathbf{B}$, $\mathbf{C}\cdot\mathbf{A}$ and $\mathbf{C}\cdot\mathbf{B}$. He then solves the first case in the following way

$\mathbf{B}\cdot\mathbf{B} = (B_j\mathbf{E}^j)\cdot(B^k\mathbf{E}_k) = B_jB^k(\mathbf{E}^j\cdot\mathbf{E}_k) = B_jB^k{\delta^j}_k = B_jB^j$

and the second one like this

$\mathbf{C}\cdot\mathbf{A} = ({C^m}_n\mathbf{E}_m \mathbf{E}^n)\cdot(A^i\mathbf{E}_i) = {C^m}_nA^i\mathbf{E}_m(\mathbf{E}^n\cdot\mathbf{E}_i) = {C^m}_nA^i\mathbf{E}_m{\delta^n}_i = {C^m}_iA^i\mathbf{E}_m$

What puzzles me here is that in the first case the author raised the index of the second factor (changing its type from covariant to contravariant), while he does not do that in the second case. From what I understood of tensor inner products, the $\mathbf{C}\cdot\mathbf{A}$ solution should be correct, while I would have solved $\mathbf{B}\cdot\mathbf{B}$ in the following way

$\mathbf{B}\cdot\mathbf{B} = (B_j\mathbf{E}^j)\cdot(B_k\mathbf{E}^k) = B_jB_k(\mathbf{E}^j\cdot\mathbf{E}^k) = B_jB_kg^{jk}$
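Since $B^j = g^{jk}B_k$ by definition of index raising, the two expressions should be numerically identical. A quick check in NumPy (with an arbitrary non-orthonormal basis of my own choosing) seems to confirm this:

```python
import numpy as np

# Arbitrary non-orthonormal covariant basis vectors E_1, E_2, E_3 as rows.
E = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.3],
              [0.2, 0.0, 1.0]])

g = E @ E.T                # covariant metric g_jk = E_j . E_k
g_inv = np.linalg.inv(g)   # contravariant metric g^jk = E^j . E^k

B_cov = np.array([2.0, -1.0, 0.5])   # covariant components B_j
B_con = g_inv @ B_cov                # raised components B^j = g^jk B_k

lhs = B_cov @ g_inv @ B_cov   # B_j B_k g^{jk}  (my expression)
rhs = B_cov @ B_con           # B_j B^j         (the book's expression)
print(np.isclose(lhs, rhs))   # True
```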

Am I wrong? If yes, what am I missing?


I answer this question in two ways: first for the case where the components of column vectors $\mathbf{x}$ and $\mathbf{y}$ transform with a contravariant transformation law, and second for the case where the components of row vectors $\mathbf{x}$ and $\mathbf{y}$ transform with a covariant transformation law.

Components of column vectors $\mathbf{x}$ and $\mathbf{y}$ transform with a contravariant transformation law

I choose the vector space $\mathbb{R}^n$ over the field $\mathbb{R}$ together with an inner product. I choose a basis $\mathbf{f} =\left(\mathbf{e}_1,\cdots,\mathbf{e}_n\right)$ that spans $\mathbb{R}^n$. I write two vectors $\mathbf{x},\mathbf{y}\in \mathbb{R}^n$ uniquely as $\mathbf{x}=x^{i_1}\,\mathbf{e}_{i_1}$ and $\mathbf{y}=y^{i_2}\,\mathbf{e}_{i_2}$, respectively. Note that $x^i$ and $y^i$ are the components with respect to the chosen basis $\mathbf{f}$, not the standard coordinates of $\mathbb{R}^n$; they are basis dependent. In this vector space there exists a symmetric positive-definite matrix $\mathbf{G}$, with entries $g_{i_1,i_2}$, such that the inner product $\left<\cdot,\cdot \right> : \mathbb{R}^n\times\mathbb{R}^n\rightarrow \mathbb{R} $ is given by the rule $$\left< \begin{bmatrix} x^1 \\ \vdots \\ x^n \end{bmatrix}, \begin{bmatrix} y^1 \\ \vdots \\ y^n \end{bmatrix}\right> = g_{i_1,i_2}\, x^{i_1 }\,y^{i_2 } ,\quad \text{ where}\quad g_{i_1,i_2} = \mathbf{e}_{i_1 }\cdot \mathbf{e}_{i_2} \,. $$
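As an illustration (with a basis and components of my own choosing), the rule $\left<\mathbf{x},\mathbf{y}\right> = g_{i_1,i_2}\,x^{i_1}y^{i_2}$ can be checked numerically against the ordinary dot product of the assembled vectors:

```python
import numpy as np

# Non-orthonormal basis e_1, e_2 of R^2, stored as rows.
e = np.array([[1.0, 1.0],
              [0.0, 2.0]])

g = e @ e.T                      # Gram matrix g_{i1,i2} = e_{i1} . e_{i2}

x_comp = np.array([3.0, -1.0])   # contravariant components x^i
y_comp = np.array([0.5, 2.0])    # contravariant components y^i

x = x_comp @ e                   # assemble x = x^i e_i in standard coordinates
y = y_comp @ e

# g_{i1,i2} x^{i1} y^{i2} agrees with the standard dot product x . y
print(np.isclose(x @ y, x_comp @ g @ y_comp))  # True
```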

Components of row vectors $\mathbf{x}$ and $\mathbf{y}$ transform with a covariant transformation law

I choose the vector space $\mathbb{R}^n$ over the field $\mathbb{R}$ together with an inner product. I choose a basis $\mathbf{f} =\left(\mathbf{e}^1,\cdots,\mathbf{e}^n\right)$ that spans $\mathbb{R}^n$. I write two vectors $\mathbf{x},\mathbf{y}\in \mathbb{R}^n$ uniquely as $\mathbf{x}=x_{i_1}\,\mathbf{e}^{i_1}$ and $\mathbf{y}=y_{i_2}\,\mathbf{e}^{i_2}$, respectively. In this vector space there exists a symmetric positive-definite matrix $\mathbf{G}$, with entries $g^{i_1,i_2}$, such that the inner product $\left<\cdot,\cdot \right> : \mathbb{R}^n\times\mathbb{R}^n\rightarrow \mathbb{R} $ is given by the rule $$\left< \begin{bmatrix} x_1 & \cdots & x_n \end{bmatrix}, \begin{bmatrix} y_1 & \cdots & y_n \end{bmatrix}\right> = g^{i_1,i_2}\, x_{i_1 }\,y_{i_2 } ,\quad \text{ where}\quad g^{i_1,i_2} = \mathbf{e}^{i_1 }\cdot \mathbf{e}^{i_2} \,. $$
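Analogously (again with an arbitrary basis of my own choosing), one can build the reciprocal basis $\mathbf{e}^i$ from the original one, check the duality relation $\mathbf{e}^i\cdot\mathbf{e}_j={\delta^i}_j$, and verify that $g^{i_1,i_2}\,x_{i_1}y_{i_2}$ reproduces the same inner product:

```python
import numpy as np

# Original (covariant) basis e_1, e_2 as rows, and its Gram matrix.
e_cov = np.array([[1.0, 1.0],
                  [0.0, 2.0]])
g_cov = e_cov @ e_cov.T

# Reciprocal basis: e^i = g^{ij} e_j, so that e^i . e_j = delta^i_j.
g_con = np.linalg.inv(g_cov)       # contravariant metric g^{i1,i2}
e_con = g_con @ e_cov

print(np.allclose(e_con @ e_cov.T, np.eye(2)))  # True: e^i . e_j = delta^i_j

x_comp = np.array([3.0, -1.0])     # covariant components x_i
y_comp = np.array([0.5, 2.0])      # covariant components y_i

x = x_comp @ e_con                 # assemble x = x_i e^i
y = y_comp @ e_con

# g^{i1,i2} x_{i1} y_{i2} agrees with the standard dot product x . y
print(np.isclose(x @ y, x_comp @ g_con @ y_comp))  # True
```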