Derivative of a Weyl tensor expression with orthonormal, smooth components


Let $(M, g)$ be a Riemannian manifold of dimension $n \geq 4$, and let $W$ be the $(0, 4)$ Weyl tensor on $M$. Let $\{e_i : 1 \leq i \leq n\}$ be an orthonormal frame on $M$, that is, a family of smooth vector fields $e_i: U \to T M$, with $U \subseteq M$ being open and nonempty, that form an orthonormal basis of the tangent space at each point in $U$. Furthermore, let $A(t)$ be an arbitrary family of orthogonal transformations on the tangent space $T M$ smoothly dependent on $t \in \mathbb R$ with the additional property that for $t = 0$, they become the identity function on $T M$.

Let's assume we have

$$W(A(t) e_1, A(t) e_2, A(t) e_3, A(t) e_4) = 0 \tag{1}$$

for all $t \in \mathbb R$. I now want to calculate the derivative of this equation with respect to $t$ at $t = 0$. More precisely, I want to derive the equation

$$W(B e_1, e_2, e_3, e_4) + W(e_1, B e_2, e_3, e_4) + W(e_1, e_2, B e_3, e_4) + W(e_1, e_2, e_3, B e_4) = 0 \tag{2}$$

where $B$ is a skew-symmetric matrix. Furthermore, I want to show that for every skew-symmetric matrix $B$, there exists a family of orthogonal transformations $A(t)$ with $A(0) = I$ and $A'(0) = B$.

I'm honestly a bit lost on how to approach this. I'm guessing that at some point the chain rule for differentiation comes into play, and arriving at a skew-symmetric matrix in the tensor components also seems plausible: the derivative of the transformation $A(t)$ (represented as a matrix) is always $A'(t) = B(t) A(t)$ for some skew-symmetric matrix function $B(t)$, and since we're differentiating at $t = 0$, where $A(t)$ is the identity, we have $A'(0) = B(0) = B$ for some skew-symmetric matrix $B$.

(Although I'm not really sure how to show that every such skew-symmetric matrix $B$ actually arises as the derivative of an orthogonal family $A(t)$ with $A(0) = I$?)
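As a quick numerical sanity check of the skew-symmetry claim, one can differentiate a concrete orthogonal family at $t = 0$ and verify that the result is skew-symmetric. This is just a sketch in Python with NumPy; the rotation family used here is an illustrative choice, not part of the question:

```python
import numpy as np

# For any smooth orthogonal family A(t) with A(0) = I, differentiating
# A(t)^T A(t) = I at t = 0 gives B^T + B = 0, so B = A'(0) is skew-symmetric.
# Illustrative example: rotation by angle t about the z-axis in R^3.

def A(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

h = 1e-6
B = (A(h) - A(-h)) / (2 * h)   # central-difference approximation of A'(0)

assert np.allclose(A(0.0), np.eye(3))           # A(0) = I
assert np.allclose(A(h).T @ A(h), np.eye(3))    # A(t) is orthogonal
assert np.allclose(B + B.T, 0.0, atol=1e-6)     # A'(0) is skew-symmetric
```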

But my main problem is that I don't know what I can do with the Weyl tensor, and how exactly I can differentiate it here. Searching the web for tensor derivatives, the closest I could find was the Wikipedia article on covariant derivatives of tensor fields, where the formula remotely resembles the sum I'm supposed to get. My difficulty is that I'm supposed to differentiate with respect to a real variable that sits inside the tensor expression, so I have no idea whether it's the covariant derivative or something else that comes into play here, and if it is the covariant derivative, how to get there from differentiating a function with a real-valued parameter $t$ as input.

On BEST ANSWER

Since you're differentiating only with respect to time, we can forget about the spatial dependence and just compute pointwise. That is, fix a point $p \in M$ and evaluate everything at this point, so that $e_i$ becomes an orthonormal basis for the single vector space $T_p M$, $A(t)$ is a family of orthogonal transformations of $T_pM$ and $W : (T_p M)^4 \to \mathbb R$ is a $4$-tensor on the vector space $T_p M$. Thus all the derivatives are simply of functions valued in fixed vector spaces, so we don't need to get covariant derivatives involved.

The key property allowing this product-rule-like expansion is the multilinearity of $W$. The easy way of doing this calculation is using coordinates/components so that we can directly apply the elementary product rule: using the summation convention we can write $$W(Ae_1, Ae_2, Ae_3, Ae_4) = W_{ijkl} (Ae_1)^i (Ae_2)^j (Ae_3)^k (Ae_4)^l.$$

Since the RHS here is literally a sum of products of real-valued functions of $t$, we can apply the product rule to differentiate it. Since $W, e_i$ do not depend on $t$, this yields $$\begin{multline}\frac d{dt}\Big|_{t=0} W(A e_1, \ldots, A e_4)=W_{ijkl}(A' e_1)^i(Ae_2)^j(Ae_3)^k(Ae_4)^l + \cdots \\+ W_{ijkl}(A e_1)^i(Ae_2)^j(Ae_3)^k(A'e_4)^l\end{multline}$$ where $'$ denotes $d/dt$. (Hiding in here we used the fact that e.g. $(Ae_1)' = A'e_1$ for a fixed vector $e_1$: once again, this becomes clear if you write it in components as $(Ae_1)^i = A^i_a e_1^a$.)

Since $A(0)$ is the identity, letting $B = A'(0)$ we can write this as $$W_{ijkl}(Be_1)^ie_2^je_3^ke_4^l + \ldots + W_{ijkl}e_1^i e_2^je_3^k(Be_4)^l,$$ which we recognize as summation notation for $W(Be_1,e_2,e_3,e_4) + \ldots + W(e_1,e_2,e_3,Be_4)$, as desired.
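The component calculation can also be checked numerically. The sketch below (assuming NumPy and SciPy; the random 4-tensor stands in for $W$, since only multilinearity is used, none of the Weyl symmetries) compares a finite-difference derivative of $t \mapsto W(A(t)e_1, \ldots, A(t)e_4)$ at $t = 0$ with the four-term sum:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n, n, n))   # any fixed 4-tensor: only multilinearity is used
e = [np.eye(n)[i] for i in range(n)]    # standard orthonormal basis of R^n

B = rng.standard_normal((n, n))
B = B - B.T                             # skew-symmetric generator
A = lambda t: expm(t * B)               # orthogonal family with A(0) = I, A'(0) = B

def F(t):
    # F(t) = W_{ijkl} (A e_1)^i (A e_2)^j (A e_3)^k (A e_4)^l
    return np.einsum('ijkl,i,j,k,l->', W, *[A(t) @ v for v in e])

h = 1e-6
lhs = (F(h) - F(-h)) / (2 * h)          # numerical F'(0)

# four-term sum: replace one argument at a time by B e_m
rhs = sum(np.einsum('ijkl,i,j,k,l->', W,
                    *[(B @ e[m]) if m == idx else e[m] for m in range(4)])
          for idx in range(4))

assert np.isclose(lhs, rhs, atol=1e-5)
```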

You could do this calculation without using components by using the limit definition of the derivative and following the usual proof of the product rule: the multilinearity of $W$ will allow you to emulate the "splitting" trick $$(fg)(t) - (fg)(0) = f(t)(g(t) - g(0)) + g(0)(f(t) - f(0)).$$

As to the correspondence between orthogonal families $A$ and skew-symmetric generators $B$, the key is the matrix exponential: given a skew-symmetric matrix $B$, you should be able to prove that the family $A(t) = e^{tB}$ is orthogonal and satisfies the desired properties at $t=0$. This is an example of the Lie group-Lie algebra correspondence, here between the group of rotations $SO(n)$ and the algebra of infinitesimal rotations $\mathfrak{so}(n)$.