A determinant invariant with respect to a change of metric (covariant determinant?)


I use a multivector of $G(2,\mathbb R)$:

$$ \mathbf{u} = a+x\sigma_x+y \sigma_y + b \sigma_x\sigma_y $$

its matrix representation is

$$ \mathbf{u} \cong \pmatrix{a +x & y-b \\y+b & a-x} $$

and the determinant is $\det (\mathbf{u})=a^2-x^2-y^2+b^2$
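As a quick numeric check (an illustrative sketch of my own, not part of the original question), the matrix representation and the quadratic form can be compared directly:

```python
import numpy as np

def matrix_rep(a, x, y, b):
    # 2x2 representation of u = a + x*sigma_x + y*sigma_y + b*sigma_x*sigma_y
    return np.array([[a + x, y - b],
                     [y + b, a - x]], dtype=float)

a, x, y, b = 1.0, 2.0, 3.0, 4.0
det = np.linalg.det(matrix_rep(a, x, y, b))
print(det, a**2 - x**2 - y**2 + b**2)   # both 4.0
```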

I am now trying to create a "covariant" determinant.

I wish to transform the orthonormal basis to general curvilinear coordinates:

$$ \sigma_x \to e_0\\ \sigma_y \to e_1 $$

such that $e_\mu \cdot e_\nu = g_{\mu\nu}$.

If I do, however, the determinant is no longer given by $a^2-x^2-y^2+b^2$, because the multivector is now $\mathbf{u} = a+xe_0+y e_1 + b e_0e_1$, with coefficients taken with respect to a basis that need not be orthonormal.
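To see the failure concretely, here is a numeric sketch (my own illustration; the basis matrix `A` is a hypothetical choice, and the coefficients transform by the standard rules: vector coordinates through the inverse transpose, the bivector coordinate by $1/\det A$):

```python
import numpy as np

a, x, y, b = 1.0, 2.0, 3.0, 4.0

# hypothetical change of basis: e_mu = A[mu, i] * sigma_i (any invertible A)
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
v_new = np.linalg.solve(A.T, np.array([x, y]))  # vector coordinates in the new basis
b_new = b / np.linalg.det(A)                    # e_0 ^ e_1 = det(A) * sigma_x ^ sigma_y

naive_old = a**2 - x**2 - y**2 + b**2
naive_new = a**2 - v_new[0]**2 - v_new[1]**2 + b_new**2
print(naive_old, naive_new)   # 4.0 vs 0.0: the naive formula is not invariant
```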

Is there a covariant definition of the determinant?

There are two answers below.

Answer 1

$$\mathbf u=\langle\mathbf u\rangle_0+\langle\mathbf u\rangle_1+\langle\mathbf u\rangle_2=a+xe_0+ye_1+b\,e_0\wedge e_1$$ $$\overline{\mathbf u}=\langle\mathbf u\rangle_0-\langle\mathbf u\rangle_1-\langle\mathbf u\rangle_2=a-xe_0-ye_1-b\,e_0\wedge e_1$$ $$\det\mathbf u=\langle\overline{\mathbf u}\mathbf u\rangle_0=\langle\mathbf u\rangle_0\!^2-\langle\mathbf u\rangle_1\!^2-\langle\mathbf u\rangle_2\!^2=a^2-(xe_0+ye_1)^2-(b\,e_0\wedge e_1)^2$$ $$=a^2-x^2e_0\!^2-xy(e_0e_1+e_1e_0)-y^2e_1\!^2-b^2((e_0e_1-e_1e_0)/2)^2$$ $$=a^2-x^2g_{00}-xy(2g_{01})-y^2g_{11}-b^2(g_{01}\!^2-g_{00}g_{11})$$
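If this formula is correct, it must return the same number in every basis. A sketch verifying that numerically (the basis matrix `A` and the coefficient transformation rules are my own illustrative assumptions, not from the answer):

```python
import numpy as np

a, x, y, b = 1.0, 2.0, 3.0, 4.0

def det_u(a, v, b, g):
    # a^2 - v^mu v^nu g_{mu nu} - b^2 (g01^2 - g00 g11), the formula above
    return a**2 - v @ g @ v - b**2 * (g[0, 1]**2 - g[0, 0] * g[1, 1])

# orthonormal basis: g is the identity
val_euclid = det_u(a, np.array([x, y]), b, np.eye(2))

# hypothetical curvilinear basis e_mu = A[mu, i] * sigma_i
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
g = A @ A.T                                     # g_{mu nu} = e_mu . e_nu
v_new = np.linalg.solve(A.T, np.array([x, y]))  # contravariant vector coordinates
b_new = b / np.linalg.det(A)                    # bivector coordinate rescales by 1/det(A)
val_curvi = det_u(a, v_new, b_new, g)

print(val_euclid, val_curvi)   # both 4.0 = a^2 - x^2 - y^2 + b^2
```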

Answer 2

This is impossible. Consider the $G(1,\mathbb R)$ case for simplicity. What you want is $a^2 + b^2 = c^2 + d^2$ where $a + be_0 = c + de_0'$ for any two bases $e_0$ and $e_0'$. But take $a = b = 1$ and $e_0' = e_0/2$. Then $c = 1$ and $d = 2$, and then you're saying you want $$ a^2 + b^2 = c^2 + d^2 $$$$ 1^2 + 1^2 = 1^2 + 2^2 $$$$ 2 = 5. $$

In a covariant expression in Einstein notation like $a_ia^i$, we aren't just using the coordinates of a vector $a$, but also its reciprocal coordinates. This is necessary for forming covariant expressions. Following the same idea, perhaps this is what you are looking for:

Let $\{e_i\}$ be an arbitrary curvilinear basis with $\{e^i\}$ its reciprocal basis, i.e. $e_i\cdot e^j = \delta_i^j$. Of course, $e^i = e_jg^{ji}$. Then in $G(2,\mathbb R)$ $$ \mathbf u = u + u^ie_i + \frac12u^{ij}e_i\wedge e_j = u + u_ie^i + \frac12u_{ij}e^i\wedge e^j $$ where $u_{ij}$ and $u^{ij}$ are antisymmetric. Then $$ \det\mathbf u = \langle\bar{\mathbf u}\mathbf u\rangle = \Bigl\langle(u - u^ie_i - \frac12u^{ij}e_i\wedge e_j)(u + u_ke^k + \frac12u_{kl}e^k\wedge e^l)\Bigr\rangle $$$$ = uu - u^iu_ke_i\cdot e^k - \frac14u^{ij}u_{kl}(e_i\wedge e_j)\cdot(e^k\wedge e^l) $$$$ = uu - u^iu_i - \frac14u^{ij}u_{kl}(\delta_j^k\delta_i^l - \delta_j^l\delta_i^k) $$$$ = uu - u^iu_i - \frac14(u^{ij}u_{ji} - u^{ij}u_{ij}) $$$$ = uu - u^iu_i + \frac12u^{ij}u_{ij}. $$ In the last line we've used $u_{ji} = -u_{ij}$. Note that this final expression also follows simply from writing $\langle\bar{\mathbf u}\mathbf u\rangle$ as squares of grade components and expressing those in terms of reciprocal coordinates.
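A numeric sketch of the mixed-coordinate formula $\det\mathbf u = uu - u^iu_i + \frac12u^{ij}u_{ij}$ (the basis matrix `A` and the coefficient transformation rules are my own illustrative assumptions):

```python
import numpy as np

a, x, y, b = 1.0, 2.0, 3.0, 4.0

# hypothetical basis e_mu = A[mu, i] * sigma_i, with metric g_{mu nu} = (A A^T)_{mu nu}
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
g = A @ A.T

u0 = a                                           # scalar part u
u_up = np.linalg.solve(A.T, np.array([x, y]))    # contravariant u^i
beta = b / np.linalg.det(A)                      # u^{01}
U_up = np.array([[0.0, beta], [-beta, 0.0]])     # antisymmetric u^{ij}

u_dn = g @ u_up          # u_i = g_{ij} u^j
U_dn = g @ U_up @ g      # u_{ij} = g_{ik} g_{jl} u^{kl}

det_u = u0**2 - u_up @ u_dn + 0.5 * np.sum(U_up * U_dn)
print(det_u)   # 4.0 = a^2 - x^2 - y^2 + b^2, independent of the basis
```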

All in all, I would prefer $\langle\bar{\mathbf u}\mathbf u\rangle$ as the "covariant" expression, since you can evaluate it in any combination of bases you please, or in a coordinate-free manner.