Expressing conversion of a $(1,1)$ tensor to a $(2,0)$ tensor in terms of matrices


Here's a bit of background: A non-degenerate Hermitian form $(.|.)$ on a vector space $V$ can be identified with a map $L:V \to V^*$ such that $L(v)=\tilde{v}$ and $\tilde{v}(w) \equiv (v~|~w)$.

Now let's say we have a vector, or a $(1,0)$ tensor, $v$, and we wish to convert it to a dual vector, or a $(0,1)$ tensor, $\tilde{v}$. To clarify the notation, the vector can be written in terms of its components $v^i$, and the dual vector similarly as $v_i$ (we omit the tilde here because the lower index already signals covariant components, and hence components of a dual vector). So this whole "conversion" operation is effectively a lowering of the index (from a $(1,0)$ to a $(0,1)$ tensor).

In terms of matrices, we can just construct the matrix $M$ corresponding to the Hermitian form, and hence the map $L$, by letting $M_{ij} = (e_i~|~e_j)$. So

$$v_j = \tilde{v}(e_j) = (v~|~e_j) = \sum_iv^i(e_i~|~e_j) = \sum_i M_{ij}v^i$$

If $[v]$ and $[\tilde{v}]$ are column vectors containing the components of $v$ and $\tilde{v}$, then $[\tilde{v}] = M^T[v]$ (which reduces to $M[v]$ when the form is symmetric, since then $M^T = M$).
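
The index lowering above can be checked numerically. Here is a minimal sketch in Python with NumPy (not from the original post); the sample Gram matrix `M` and vector `v` are made up, and a real symmetric form is assumed so complex conjugation can be ignored:

```python
import numpy as np

# Gram matrix M_ij = (e_i | e_j) of a real symmetric non-degenerate form
# (sample values, chosen to be invertible).
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

v = np.array([1.0, 4.0])            # components v^i of a vector

# v_j = sum_i M_ij v^i  -- lowering the index.
v_lower = M.T @ v                   # M.T == M here, since the form is symmetric

# Raising the index with the inverse Gram matrix recovers v.
v_back = np.linalg.inv(M).T @ v_lower

print(v_lower)                      # [ 6. 13.]
print(np.allclose(v_back, v))       # True
```

The round trip through $M$ and $M^{-1}$ recovering $v$ is exactly the statement that lowering and raising the index are inverse operations.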

Now I'm trying to apply this whole treatment to the conversion of a $(1,1)$ tensor to a $(2,0)$ tensor. In component representation, the former can be written as $T_i^{~~j}$ and the latter as $T_{ij}$, so effectively we're trying to lower the upper index. From the book I'm reading:

If we have a non-degenerate bilinear form on $V$, then we may change the type of $T$ by precomposing with the map $L$ or $L^{-1}$. If $T$ is of type $(1,1)$ with components $T_i^{~~j}$, for instance, then we may turn it into a tensor $\tilde{T}$ of type $(2,0)$ by defining $\tilde{T}(v,w) = T(v,L(w))$.

  1. Isn't the operation actually supposed to be $T(v,f) \to T(v, L^{-1}(f))$ for $(1,1)$ to $(2,0)$, just as it was $v \to L(v)$ for $(1,0)$ to $(0,1)$?
  2. How do we construct the matrix (say $H$) corresponding to the $(1,1)$ tensor? I'm torn between $H_{ij} = T(e_i, e^j)$, where $e^j(e_i) = \delta_i^j$, and $H_{ij}=T(e_i, L(e_j))$, because the dual bases $\{e^i\}$ and $\{L(e_i)\}$ are not the same unless $\{e_i\}$ is orthonormal.
  3. How do I get a matrix representation in this case as I did for the above example? Let $H$ be the matrix corresponding to $T$ (not sure how we constructed it). Then

$$T_{ij} = \tilde{T}(e_i, e_j) = \ldots~??$$

How to proceed?


On BEST ANSWER
  1. If $T$ is of type $(1,1)$, then $T\colon V\times V^* \to \Bbb C$ eats one vector and one covector. If we want to define $\widetilde{T}$ of type $(2,0)$, that is, $\widetilde{T}\colon V\times V \to \Bbb C$, we must somehow change the domain of $T$. This is done and undone via $$V\times V^* \xrightarrow{\hspace{.4cm}{\rm Id}_V\times L^{-1} \hspace{.4cm}} V\times V\quad \mbox{and}\quad V\times V \xrightarrow{\hspace{.4cm}{\rm Id}_V\times L \hspace{.4cm}} V\times V^*.$$So $\widetilde{T}(v,w) \doteq T(v, L(w))$ is the correct definition, since you need the second entry of $T$ to be a covector. Conversely, if you begin with $\widetilde{T}$ only, $T$ is recovered by $T(v,f) = \widetilde{T}(v, L^{-1}(f))$, since you need the second entry of $\widetilde{T}$ to be a vector.

  2. Here, you need to note that the matrix representing the components of the $(1,1)$ tensor must have one upper index and one lower index. With your notation $M_{ij} = (e_i \mid e_j)$, we want to find a relation between $$H_i^{\;j} = T(e_i,e^j) \quad \mbox{and}\quad S_{ij} = \widetilde{T}(e_i,e_j).$$It is common to write $T_i^{\;j}$ and $T_{ij}$ directly instead of using new kernel letters, but I'll stick with this notation, so it becomes easier to understand. Also, I'll denote by $(M^{ij})$ the inverse matrix of $(M_{ij})$ (here, it's also more common to write $g$ instead of $M$). We initially have $$H_i^{\;j} = T(e_i,e^j) = \widetilde{T}(e_i, L^{-1}(e^j)),$$so the first order of business is writing $L^{-1}(e^j)$ in terms of the given basis of $V$. One does this as follows: first check that $\widetilde{e_j} = L(e_j) = \sum_i M_{ji}e^i$, by applying both sides to an arbitrary vector $v = \sum_i v^ie_i$. Apply $L^{-1}$ to get $e_j = \sum_i M_{ji} L^{-1}(e^i)$. Hit both sides with $M^{kj}$ and sum over $j$ to get $L^{-1}(e^k) = \sum_{j}M^{kj}e_j$. Then we can proceed: $$H_i^{\;j} = \widetilde{T}\left(e_i, \sum_k M^{jk}e_k\right) = \sum_{k}M^{jk} \widetilde{T}(e_i,e_k) = \sum_{k} M^{jk} S_{ik}.$$If you want to express $S$ in terms of $H$, multiply everything by $M_{\ell j}$ and sum over $j$ to get $$S_{i\ell} = \sum_{j} M_{\ell j} H_i^{\;j}.$$In the more standard notation, with Einstein's convention and renaming the dummy index $\ell$, you have $$\fbox{$T_i^{\;j} = g^{jk}T_{ik}$} \quad\mbox{and}\quad \fbox{$T_{ij} =g_{jk} T_i^{\;k}$}$$

  3. I think I sort of answered that in 2. above.
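
The relations in item 2 can also be verified numerically. The sketch below (Python with NumPy, not part of the original answer) assumes a real symmetric form, uses made-up sample values for the Gram matrix `g` and the component matrix `H`, builds $\widetilde{T}$ by precomposing with $L$, and confirms that its matrix of components is $S_{ij} = \sum_k g_{jk} H_i^{\;k}$ and that raising the index again recovers $H$:

```python
import numpy as np

# Gram matrix g_ij of a symmetric non-degenerate bilinear form (sample values).
g = np.array([[2.0, 1.0],
              [1.0, 3.0]])
g_inv = np.linalg.inv(g)                  # g^{ij}

# H_i^j: components of a (1,1) tensor T (sample values).
H = np.array([[1.0, 2.0],
              [0.0, 1.0]])

def T(v, f):
    # T of type (1,1): one vector v^i, one covector f_j.
    return v @ H @ f                      # sum_{i,j} v^i H_i^j f_j

def L(w):
    # Index lowering: L(w)_j = sum_i g_ij w^i (= g @ w, since g is symmetric).
    return g.T @ w

def T_tilde(v, w):
    # Type (2,0): precompose the covector slot with L.
    return T(v, L(w))

# Components S_ij = T_tilde(e_i, e_j), computed on the standard basis ...
n = 2
S = np.array([[T_tilde(np.eye(n)[i], np.eye(n)[j]) for j in range(n)]
              for i in range(n)])

# ... agree with S_ij = sum_k g_jk H_i^k, i.e. S = H @ g.T:
print(np.allclose(S, H @ g.T))            # True

# Raising the index recovers H: H_i^j = sum_k g^{jk} S_ik.
print(np.allclose(S @ g_inv.T, H))        # True
```

In matrix form (for a symmetric $g$) the two boxed formulas are just $S = Hg$ and $H = Sg^{-1}$, so the round trip is the identity.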