Jacobi fields are linearly independent if and only if...


If the dimension of the Riemannian manifold $M$ is $n$, there exist exactly $n$ linearly independent Jacobi fields along the geodesic $\gamma : [0,a] \to M$, which are zero at $\gamma(0)$. This follows from the fact, easily checked, that the Jacobi fields $J_1,\ldots,J_k$ with $J_i(0)=0$ are linearly independent if and only if $J_1'(0),\ldots,J_k'(0)$ are linearly independent.

(Riemannian geometry, Manfredo do Carmo, page 116)

I tried to "prove" first the reverse direction: if $J_1'(0),\ldots,J_k'(0)$ are linearly independent, then $\sum_i c_i J_i'(0)=0 \iff c_i=0$. Applying the limit definition of the derivative and $J_i(0)=0$, we have that $\lim_{t \to 0^+} \frac{\sum_i c_iJ_i(t)}{t}=0$. Then I used the epsilon-delta definition of the limit: $\forall \epsilon > 0, \exists \delta > 0$ such that $\left|\frac{\sum_i c_i J_i(t)}{t}-0\right|<\epsilon$ whenever $0 < t < \delta$. Now $t$ ranges over $(0,a]$, so $\frac 1t \ge \frac 1a$, hence $\frac{|\sum_i c_i J_i(t)|}{a}\le\left|\frac{\sum_i c_i J_i(t)}{t}-0\right|<\epsilon$, and so $|\sum_i c_i J_i(t)| < \epsilon a$. Since the inequality we just deduced holds for all $\epsilon > 0$, we must have that $\sum_i c_i J_i(t) = 0$ if $0 < t < \delta$. Since $c_i=0$, it follows that $\sum_i c_i J_i(t) = 0$ for all $0 \le t \le a$.


I don't feel good about this alleged proof; I know there are some statements that don't follow (although they seemed like they would to me). Furthermore, how would I also go about proving the other direction? Please do let me know if there is a better proof that I should follow.


Two solutions are given below.


Here is an alternative proof. We know the explicit form of a Jacobi field $J$ along $\gamma$ with $J(0)=0$: by Corollary 2.5 of Chapter 5 of do Carmo, $J(t) = (d\exp_p)_{t\gamma'(0)}(t\,J'(0))$, where $p=\gamma(0)$. Now combine this formula with the fact that the differential $(d\exp_p)_{t\gamma'(0)}$ is a linear map: the $J_i'(0)$ are linearly independent by hypothesis, and for small $t\neq 0$ the formula transfers that independence to the fields $J_i$.
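Spelling this sketch out (using the closed form from do Carmo, Chapter 5, Corollary 2.5, with $p = \gamma(0)$):

```latex
% do Carmo, Ch. 5, Cor. 2.5: if J(0) = 0 and J'(0) = w, then
%   J(t) = (d\exp_p)_{t\gamma'(0)}(t\,w),  where  p = \gamma(0).
% Linearity of the differential gives
\[
  \sum_i c_i J_i(t)
  = \sum_i c_i\,(d\exp_p)_{t\gamma'(0)}\bigl(t\,J_i'(0)\bigr)
  = (d\exp_p)_{t\gamma'(0)}\Bigl(t\sum_i c_i J_i'(0)\Bigr).
\]
% If \sum_i c_i J_i'(0) = 0, the right-hand side vanishes for every t.
% Conversely, if \sum_i c_i J_i \equiv 0, pick a small t > 0 at which
% (d\exp_p)_{t\gamma'(0)} is an isomorphism; then t\sum_i c_i J_i'(0) = 0,
% so \sum_i c_i J_i'(0) = 0.
```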


Remember that a Jacobi field satisfies the second-order linear differential equation $$J'' + R(\gamma', J)\gamma' = 0.$$ By uniqueness of solutions of such an equation with prescribed initial data, if $J(0) = J'(0) = 0$, then $J\equiv 0$.
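This uniqueness step can be checked numerically. The sketch below is illustrative, not part of the original argument: it assumes constant sectional curvature $K$, so a normal component of $J$ reduces to the scalar ODE $j'' + Kj = 0$, and the helper `integrate_jacobi` is a hypothetical name.

```python
# Sketch: integrate j'' + K*j = 0 (scalar Jacobi equation in constant
# curvature K) and observe that the solution is pinned down by (j(0), j'(0)).
# In particular j(0) = j'(0) = 0 forces j == 0 identically.
import math

def integrate_jacobi(j0, dj0, K=1.0, t_end=3.0, n=3000):
    """RK4 integration of j'' + K*j = 0 with j(0) = j0, j'(0) = dj0.

    Returns a list of (t, j(t)) samples.
    """
    h = t_end / n
    j, v = j0, dj0          # v tracks j'
    out = [(0.0, j)]
    def f(state):
        j, v = state
        return (v, -K * j)  # (j', j'') from the ODE
    for i in range(n):
        k1 = f((j, v))
        k2 = f((j + h/2*k1[0], v + h/2*k1[1]))
        k3 = f((j + h/2*k2[0], v + h/2*k2[1]))
        k4 = f((j + h*k3[0], v + h*k3[1]))
        j += h/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
        v += h/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
        out.append(((i + 1) * h, j))
    return out

# zero initial data -> identically zero solution
zero_sol = integrate_jacobi(0.0, 0.0)
assert max(abs(j) for _, j in zero_sol) == 0.0

# j(0) = 0, j'(0) = 1, K = 1 -> j(t) = sin(t)
sine_sol = integrate_jacobi(0.0, 1.0)
assert max(abs(j - math.sin(t)) for t, j in sine_sol) < 1e-6
```

The same dichotomy drives the whole answer: a Jacobi field along $\gamma$ is determined by the pair $(J(0), J'(0))$.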

Assume $J_1,\ldots,J_n$ satisfy $J_i(0) = 0$ and that $\{J_1,\ldots,J_n\}$ is linearly independent. Suppose there are constants $c_1,\ldots,c_n\in\mathbb{R}$ such that $$c_1J_1'(0) + c_2J'_2(0) + \cdots + c_n J_n'(0) = 0.$$ Then the Jacobi field $$J := c_1 J_1 + c_2J_2 +\cdots + c_n J_n$$ satisfies $J(0)=0$ and $J'(0) = 0$, so $J\equiv 0$. Since we assumed $\{J_1,\ldots,J_n\}$ is linearly independent, this forces $c_1 = c_2=\cdots = c_n = 0$, proving that if $\{J_1,\ldots,J_n\}$ is linearly independent then so is $\{J_1'(0),\ldots,J_n'(0)\}$.

Conversely, suppose that $\{J_1'(0),\ldots,J_n'(0)\}$ is linearly independent and that there are constants $c_1,\ldots,c_n\in\mathbb{R}$ such that $$ c_1 J_1 + c_2J_2 +\cdots + c_n J_n\equiv 0.$$ Taking covariant derivatives of this equation gives $$ c_1 J'_1 + c_2J'_2 +\cdots + c_n J'_n\equiv 0.$$ Evaluating this at $t=0$ gives $$ c_1 J'_1(0) + c_2J'_2(0) +\cdots + c_n J'_n(0)= 0, $$ and by our assumption this forces $c_1 = c_2=\cdots = c_n = 0$, proving that if $\{J_1'(0),\ldots,J_n'(0)\}$ is linearly independent then so is $\{J_1,\ldots,J_n\}$.
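The equivalence can also be seen concretely in a toy model. The sketch below is a hypothetical sanity check, not part of the answer: it assumes constant curvature $K = 1$, where a Jacobi field with $J(0)=0$ has the closed form $J(t) = \sin(t)\,w$ with $w = J'(0)$, so that $\sum_i c_i J_i(t) = \sin(t)\sum_i c_i w_i$ and independence of the fields mirrors independence of the vectors $w_i$.

```python
# Toy model in constant curvature K = 1: J_i(t) = sin(t) * w_i, w_i = J_i'(0).
# We compare independence of the three fields (via a determinant of their
# values at one parameter t with sin(t) != 0) with independence of the w_i.
import math

def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

def field_values(w, t):
    """Values J_i(t) = sin(t) * w_i of the three Jacobi fields at time t."""
    return [[math.sin(t) * c for c in row] for row in w]

# an independent triple of initial derivatives w_i = J_i'(0) ...
w_indep = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# ... and a dependent triple: w[2] = w[0] + w[1]
w_dep   = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]]

t = 1.0   # any t with sin(t) != 0 works
assert abs(det3(field_values(w_indep, t))) > 1e-12   # fields independent
assert abs(det3(field_values(w_dep, t)))  < 1e-12    # fields dependent
assert abs(det3(w_dep)) < 1e-12                      # matching the w_i
```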