Intuition for Euclidean Norm of Vector Field in Riemannian Space


Suppose we are on a Riemannian manifold $(\mathbb{R}^n,g)$, where we have a vector field $v$ (i.e. $v^i(x)\partial_i$). We can assume we are using standard Cartesian coordinates $x^i$, so $(\hat{i},\hat{j},\ldots)$ form a basis of the tangent space everywhere, but $g$ is a complicated function of space.

I am interested in the conditions under which $v=0$ everywhere. Naturally, a way to express this in a single equation is: $$ v(x) = \vec{0}\;\;\;\iff\;\;\; \sum_i (v^i(x))^2 = 0 \;\;\;\iff\;\;\; ||v(x)||^2_E = 0 $$ $\forall x$, where $E$ is the Euclidean 2-norm.

However, this is not the actual norm of $v$, which would be: $$ ||v(x)||^2 = g_{ij}(x) v^i(x) v^j(x) $$ which accounts for the warping of space induced by $g$.
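As a quick numerical illustration of how the two norms can differ (using an arbitrary example metric at a single point, not any particular $g$ from the question):

```python
import numpy as np

# A hypothetical diagonal metric at some point x, purely for illustration
g = np.array([[4.0, 0.0],
              [0.0, 1.0]])
v = np.array([1.0, 2.0])

norm_E_sq = np.sum(v**2)   # Euclidean: 1 + 4 = 5
norm_g_sq = v @ g @ v      # metric:    4*1 + 1*4 = 8

print(norm_E_sq, norm_g_sq)  # 5.0 8.0
```

The two quantities disagree whenever $g$ differs from the identity, but both vanish exactly when $v = 0$.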

Question: Does it make sense to use $||\cdot||_E$ here?

I think the Euclidean norm is not a geometric invariant, but does it have any meaning for this specific case?

There are 2 answers below.

Accepted answer

Following on from my comment, here's how to compare the norms:

Since $g(x)$ is a symmetric matrix, it has an orthonormal (with respect to the Euclidean inner product) basis of eigenvectors. Switching to this basis, $g(x)$ becomes a diagonal matrix whose entries are its eigenvalues $\lambda_i(x)$, so we have $$\| v(x) \|_g^2=g_{ij}(x) v^i(x)v^j(x)=\sum_i \lambda_i(x)(v^i(x))^2.$$

As the basis is orthonormal, the Euclidean norm still has the expression $\| v(x) \|_E^2 = \sum_i (v^i(x))^2.$ Thus if we let $\lambda(x),\Lambda(x)$ be the minimum and maximum eigenvalues, applying the inequality $\lambda(x) \le \lambda_i(x) \le \Lambda(x)$ to each term in the sum gives the pointwise comparability $$\lambda(x) \|v(x)\|^2_E \le \|v(x)\|_g^2\le\Lambda(x)\|v(x)\|_E^2.$$

Thus on any domain where you have uniform control from above and below of the eigenvalues of $g_{ij},$ you get uniform comparability of the norms.
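The pointwise comparability above is easy to verify numerically. A minimal sketch, using a randomly generated symmetric positive-definite matrix as a stand-in for the metric at one point:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric positive-definite "metric" g at a point (illustrative only)
A = rng.standard_normal((3, 3))
g = A @ A.T + 3.0 * np.eye(3)   # SPD by construction

v = rng.standard_normal(3)

norm_E_sq = v @ v        # Euclidean norm squared
norm_g_sq = v @ g @ v    # metric norm squared

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order
eigvals = np.linalg.eigvalsh(g)
lam, Lam = eigvals[0], eigvals[-1]

# The inequality from the answer: lambda ||v||_E^2 <= ||v||_g^2 <= Lambda ||v||_E^2
assert lam * norm_E_sq <= norm_g_sq <= Lam * norm_E_sq
```

Running this with any positive-definite $g$ and any $v$ will satisfy the sandwich inequality, since it is exactly the term-by-term eigenvalue bound derived above.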

Answer

It does make sense to use $\|\cdot\|_E$ here.

The Euclidean norm is the norm on $\mathbb{R}^n$ induced by the standard inner product on $\mathbb{R}^n$, and the latter is itself a Riemannian metric on $\mathbb{R}^n$.

So you are really just talking about two Riemannian metrics on $\mathbb{R}^n$.

Moreover, if the norm of a vector is zero in some norm, then it is zero in any norm. (Because $\|v\|_1=0 \Leftrightarrow v=0 \Leftrightarrow \|v\|_2=0$.)
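This point can be checked concretely: for any positive-definite metric, the metric norm of a vector vanishes if and only if the vector itself is zero, so the Euclidean norm detects exactly the same zero set. A small sketch with an arbitrary example metric:

```python
import numpy as np

# An illustrative positive-definite metric (not from the question)
g = np.array([[2.0, 1.0],
              [1.0, 3.0]])

zero = np.zeros(2)
w = np.array([1.0, -1.0])

# The zero vector has zero norm in both norms...
assert zero @ g @ zero == 0.0
assert zero @ zero == 0.0

# ...while any nonzero vector has strictly positive norm in both
assert w @ g @ w > 0.0
assert w @ w > 0.0
```

So for the specific purpose of testing whether $v(x) = 0$ everywhere, the Euclidean norm is a perfectly valid choice, even though it is not a geometric invariant of $(\mathbb{R}^n, g)$.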