Question about the proof of a theorem in Differential geometry


I'm currently reading Michael Spivak's "Calculus on Manifolds", since I need to learn some basics for an Electrodynamics class. My question is about Theorem 4-6, or more precisely about its proof.

Theorem

Let $\{v_i\}_{i=1}^n$ be a basis for $V$, and let $\omega \in \Lambda^n(V)$. If $w_i=\sum_{j=1}^n a_{ij}v_j$ are $n$ vectors in $V$, then $$\omega(w_1,\dots, w_n)=\operatorname{det}(a_{ij})\cdot \omega(v_1,\dots,v_n).$$

Proof

Define $\eta\in\mathfrak{I}^n(\mathbb{R}^n)$ by $$\eta((a_{11},\dots,a_{1n}),\dots,(a_{n1},\dots,a_{nn}))=\omega\left(\sum a_{1j}v_j,\dots,\sum a_{nj}v_j\right).$$ $\eta\in\Lambda^n(\mathbb{R}^n)$ so $\eta =\lambda \cdot \operatorname{det}$ for some $\lambda\in\mathbb{R}$ and $\lambda=\eta(e_1,\dots,e_n)=\omega(v_1,\dots,v_n)$.

$\hspace{16cm}\Box$

I really don't understand any part of this proof, so it would be great if someone could explain this in a little bit more detail. Please keep in mind that I'm still not very familiar with the introduced concepts.

Edit: $\mathfrak{I}^n(V)$ denotes the set of all $n$-tensors on $V$.
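For anyone who wants to see the statement in action, here is a minimal numerical sanity check of the theorem. It assumes the concrete case $V = \mathbb{R}^3$ with $\omega = \det$ (which is an element of $\Lambda^3(\mathbb{R}^3)$); the basis $\{v_i\}$ and the coefficients $a_{ij}$ are chosen at random:

```python
import numpy as np

# Concrete instance of Theorem 4-6: V = R^3, omega = det in Lambda^3(R^3).
rng = np.random.default_rng(0)
n = 3

def omega(*vectors):
    # omega(u_1, ..., u_n) = det of the matrix with columns u_1, ..., u_n
    return np.linalg.det(np.column_stack(vectors))

basis = rng.normal(size=(n, n))  # columns are v_1, ..., v_n (random, hence a basis a.s.)
a = rng.normal(size=(n, n))      # coefficients a_ij

# w_i = sum_j a_ij v_j
w = [basis @ a[i] for i in range(n)]

lhs = omega(*w)                               # omega(w_1, ..., w_n)
rhs = np.linalg.det(a) * omega(*basis.T)      # det(a_ij) * omega(v_1, ..., v_n)
assert np.isclose(lhs, rhs)
```

This is only a spot check for one random choice of data, not a proof, but it makes the claimed identity tangible.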

Best Answer

I assume $V$ is a vector space over $\mathbb R$. I wonder if the notation would be more clear if we introduce the linear transformation $T:\mathbb R^n \to V$ defined by $$ T\left(\begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix} \right) = c_1 v_1 + \cdots + c_n v_n. $$ If $a_1,\ldots,a_n \in \mathbb R^n$, then $$ \eta(a_1,\ldots,a_n) = \omega(T(a_1),\ldots,T(a_n)). $$ The proof asserts that $\eta$ is multilinear, but let's check this more carefully: \begin{align} \eta(a_1 + a_1',a_2,\ldots,a_n) &= \omega(T(a_1 + a_1'),T(a_2),\ldots,T(a_n)) \\ &= \omega(T(a_1) + T(a_1'),T(a_2),\ldots,T(a_n)) \\ &= \omega(T(a_1),T(a_2),\ldots,T(a_n)) + \omega(T(a_1'),T(a_2),\ldots,T(a_n)) \\ &= \eta(a_1,a_2,\ldots,a_n) + \eta(a_1',a_2,\ldots,a_n). \end{align} (In the second-to-last step we used the fact that $\omega$ is multilinear.) Also, \begin{align} \eta(\alpha a_1,a_2,\ldots,a_n) &= \omega(T(\alpha a_1),T(a_2),\ldots,T(a_n)) \\ &= \omega(\alpha T(a_1),T(a_2),\ldots,T(a_n)) \\ &= \alpha \omega(T(a_1),T(a_2),\ldots,T(a_n)) \\ &= \alpha \eta(a_1,a_2,\ldots,a_n). \end{align} This shows that $\eta$ is linear in its first argument. Similarly, we can show that $\eta$ is linear in each of its arguments.
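The multilinearity computation above can also be cross-checked numerically. This sketch again assumes $V = \mathbb{R}^3$ with $\omega = \det$, so that $T$ is just the matrix whose columns are $v_1,\ldots,v_n$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
T = rng.normal(size=(n, n))  # columns are v_1, ..., v_n, so T(e_j) = v_j

def eta(*args):
    # eta(a_1, ..., a_n) = omega(T a_1, ..., T a_n), with omega = det
    return np.linalg.det(np.column_stack([T @ a for a in args]))

a1, a1p, a2, a3 = (rng.normal(size=n) for _ in range(4))
alpha = 2.5

# additivity in the first slot: eta(a1 + a1', a2, a3) = eta(a1, a2, a3) + eta(a1', a2, a3)
assert np.isclose(eta(a1 + a1p, a2, a3), eta(a1, a2, a3) + eta(a1p, a2, a3))
# homogeneity in the first slot: eta(alpha a1, a2, a3) = alpha eta(a1, a2, a3)
assert np.isclose(eta(alpha * a1, a2, a3), alpha * eta(a1, a2, a3))
```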

The proof also asserts that $\eta$ is alternating, so let's check that too. \begin{align} \eta(a_2,a_1,a_3,\ldots,a_n) &= \omega(T(a_2),T(a_1),T(a_3),\ldots,T(a_n)) \\ &=-\omega(T(a_1),T(a_2),\ldots,T(a_n)) \\ &= -\eta(a_1,a_2,\ldots,a_n). \end{align} This shows that interchanging the first two inputs to $\eta$ flips the sign of the output. Similarly, we can show that interchanging the $i$th and $j$th inputs to $\eta$ flips the sign of the output.
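The sign flip can be verified the same way (same assumed setup: $V = \mathbb{R}^3$, $\omega = \det$, $T$ a random matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
T = rng.normal(size=(n, n))  # columns play the role of v_1, ..., v_n

def eta(*args):
    # eta(a_1, ..., a_n) = det(T a_1, ..., T a_n)
    return np.linalg.det(np.column_stack([T @ a for a in args]))

a1, a2, a3 = (rng.normal(size=n) for _ in range(3))

# interchanging the first two inputs flips the sign of the output
assert np.isclose(eta(a2, a1, a3), -eta(a1, a2, a3))
```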

So, we have shown in more detail that $\eta \in \Lambda^n(\mathbb R^n)$.

But, the book has shown previously that $\Lambda^n(\mathbb R^n)$ is one-dimensional. The most famous element of $\Lambda^n(\mathbb R^n)$, of course, is the determinant function $\det$. It follows that $$ \tag{$\spadesuit$} \eta = \lambda \det $$ for some scalar $\lambda$.

And what is $\lambda$? Let $e_1,\ldots,e_n$ be the standard basis vectors for $\mathbb R^n$. We can find the value of $\lambda$ by plugging the inputs $e_1,\ldots,e_n$ into both sides of equation ($\spadesuit$): \begin{align} \eta(e_1,\ldots,e_n) &= \lambda \det(e_1,\ldots, e_n) \\ &= \lambda. \end{align} But notice also that \begin{align} \eta(e_1,\ldots,e_n) &= \omega(T(e_1),\ldots, T(e_n) ) \\ &= \omega(v_1,\ldots,v_n). \end{align} This shows that $\lambda = \omega(v_1,\ldots,v_n)$. Thus, from equation ($\spadesuit$), we have $$ \eta(a_1,\ldots,a_n) = \omega(v_1,\ldots,v_n) \det(a_1,\ldots,a_n). $$ In other words, $$ \omega(T(a_1),\ldots,T(a_n)) = \omega(v_1,\ldots,v_n) \det(a_1,\ldots,a_n). $$ This is what we wanted to show.
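Finally, the evaluation of $\lambda$ and the resulting identity $\eta = \lambda \det$ can both be checked numerically under the same assumptions ($V = \mathbb{R}^3$, $\omega = \det$):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
T = rng.normal(size=(n, n))  # T(e_j) = v_j, i.e. the columns of T are v_1, ..., v_n

def omega(*vectors):
    return np.linalg.det(np.column_stack(vectors))

def eta(*args):
    return omega(*(T @ a for a in args))

# lambda = eta(e_1, ..., e_n), and this equals omega(v_1, ..., v_n)
lam = eta(*np.eye(n))
assert np.isclose(lam, omega(*T.T))

# eta(a_1, ..., a_n) = omega(v_1, ..., v_n) * det(a_1, ..., a_n) for arbitrary inputs
a = [rng.normal(size=n) for _ in range(n)]
assert np.isclose(eta(*a), lam * np.linalg.det(np.column_stack(a)))
```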