Prove that the Tangent Plane of a Surface is a Vector Space from Definition

According to my lecture notes, the definition of tangent plane of a surface is given as follows:

Definition: Let $S$ be a regular surface and $p \in S$. Then the tangent plane of $S$ at $p$ is defined as $T_pS=\{ v \in \mathbb{R}^3: \exists \text{ a smooth curve } \alpha:(-\epsilon,\epsilon) \rightarrow S \text{ such that } \alpha(0)=p,\ \alpha'(0)=v \}$.

I can prove that the tangent plane is closed under scalar multiplication. However, I can't prove that it is closed under vector addition.

My basic difficulty is that, at first glance, if $\alpha,\beta:(-\epsilon,\epsilon) \rightarrow S$ are two curves on the surface with $\alpha(0)=\beta(0)=p$, then the curve $\frac{\alpha(2t)+\beta(2t)}{2}$ would seem to give the tangent vector $\alpha'(0)+\beta'(0)$. However, the surface $S$ is in general not a vector space, so $\frac{\alpha(2t)+\beta(2t)}{2}$, while a perfectly good curve in $\mathbb{R}^3$, need not lie on $S$. I then have no idea how to make the construction work.
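To see this failure concretely, here is a quick numerical illustration of my own on the unit sphere (the point and curves below are arbitrary choices): the averaged curve $\frac{\alpha(2t)+\beta(2t)}{2}$ does pass through $p$ at $t=0$, and its derivative there is $\alpha'(0)+\beta'(0)$, but it immediately leaves the sphere.

```python
import numpy as np

# Unit sphere S^2, point p = (0, 0, 1).
p = np.array([0.0, 0.0, 1.0])

# Two great circles through p:
#   alpha(t) = (sin t, 0, cos t),  alpha'(0) = (1, 0, 0)
#   beta(t)  = (0, sin t, cos t),  beta'(0)  = (0, 1, 0)
alpha = lambda t: np.array([np.sin(t), 0.0, np.cos(t)])
beta  = lambda t: np.array([0.0, np.sin(t), np.cos(t)])

# The naive "average" curve from the question.
gamma = lambda t: (alpha(2 * t) + beta(2 * t)) / 2

# gamma(0) = p, but for t != 0 the point is not on the sphere:
print(np.linalg.norm(gamma(0.3)))  # norm is noticeably less than 1
```

At $t = 0.3$ the point has norm about $0.92$, so it is not on $S$, even though the derivative at $0$ is exactly $\alpha'(0)+\beta'(0) = (1,1,0)$.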

P.S.: As the title indicates, using the fact that $T_pS$ is the span of $\{\frac{\partial F}{\partial x}, \frac{\partial F}{\partial y}\}$, where $F$ is a parametrization of $S$ around $p$, is not allowed. (In fact, I think the proof of that fact already requires knowing that the tangent plane is a vector space, so using it here would be a circular argument.)

Edit: The definition of regular surface is given as follows:

Definition: Let $S\subset \mathbb{R}^3$. Then $S$ is called a regular surface if for any $p \in S$, there exists a neighborhood $U$ in $S$ of $p$ and a smooth mapping $F:U' \rightarrow U$ from an open set $U'\subset\mathbb{R}^2$ (called a parametrization) such that

  1. $F$ is a homeomorphism.

  2. $\{\frac{\partial F}{\partial x}, \frac{\partial F}{\partial y}\}$ is linearly independent.

Best answer:

Here's a slightly more pedestrian solution to your particular problem. You've started with curves $\alpha$ and $\beta$ defined on a small interval, which I'll call $I$ to save me from writing Greek letters.

The image $\alpha(I)$ (if $I$ is small enough) is contained in an open set $U$ of the kind that appears in the definition of "regular surface", and we can assume that associated to $U$ is a parametrization $F$. For each $t$, define $$ a(t) = F^{-1} (\alpha(t)). $$

Then $a$ is a curve in the plane, right? And $$ \alpha = F \circ a. $$ Similarly, you can define $b$, with $$ \beta = F \circ b. $$ Note that $a(0) = b(0) = F^{-1}(p)$; call this common point $q$. Now it's easy to define $\gamma$ ... you add $a$ and $b$ instead of trying to add $\alpha$ and $\beta$, subtracting $q$ once so that the new curve still starts at $q$: $$ \gamma(t) = F(a(t) + b(t) - q) $$ and if $I$ is small enough, then $a(t) + b(t) - q$ will always be in the domain of $F$, so this makes sense. (Without the $-q$, we'd have $\gamma(0) = F(2q)$, which need not equal $p$.)

Now what's $\gamma'(0)$? Well, if there's any justice in the world, it'll be $\alpha'(0) + \beta'(0)$. And indeed it is. The proof is to throw in one more function $$ P : I \to \Bbb R^2 : t \mapsto a(t) + b(t) - q $$ so that $$ \gamma = F \circ P, $$ and now you can apply the chain rule: since $P(0) = q$ and $P'(0) = a'(0) + b'(0)$, $$ \gamma'(0) = D_qF(P'(0)) = D_qF(a'(0)) + D_qF(b'(0)) = \alpha'(0) + \beta'(0). $$
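This construction can be sanity-checked numerically on a concrete surface. A minimal sketch, with the paraboloid parametrization $F(u,v)=(u,v,u^2+v^2)$ and curves of my own choosing; note the $-q$ shift, with $q = F^{-1}(p)$, which keeps the constructed curve passing through $p$ at $t=0$:

```python
import numpy as np

# A concrete regular surface: the graph z = x^2 + y^2,
# parametrized by F(u, v) = (u, v, u^2 + v^2).
F = lambda w: np.array([w[0], w[1], w[0]**2 + w[1]**2])

q = np.array([0.2, 0.1])  # q = F^{-1}(p)
p = F(q)

# Two plane curves through q (arbitrary choices for illustration).
a = lambda t: q + np.array([t, t**2])
b = lambda t: q + np.array([-2 * t, t])

# Surface curves alpha = F . a, beta = F . b, and the construction
# gamma(t) = F(a(t) + b(t) - q), which satisfies gamma(0) = F(q) = p.
alpha = lambda t: F(a(t))
beta  = lambda t: F(b(t))
gamma = lambda t: F(a(t) + b(t) - q)

# Central-difference derivative at t = 0.
d = lambda c, h=1e-6: (c(h) - c(-h)) / (2 * h)

print(d(gamma))  # matches d(alpha) + d(beta), as the chain rule predicts
```

With step $h=10^{-6}$ the central differences agree with $\alpha'(0)+\beta'(0)$ to far better than the displayed precision, matching the chain-rule computation above.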

This is really just @LeeMosher's answer with a lot more detail, but I hope it's of some help to you.

Second answer:

Here's a hint.

The key is to use the total derivative of $F$ at $q = F^{-1}(p)$, i.e. the linear mapping $D_qF : \mathbb{R}^2 \to \mathbb{R}^3$ whose matrix is the Jacobian matrix of $F$ at $q$. Then prove that the tangent plane of $S$ at $p$ is exactly the image of $D_qF$. Since the image of a linear transformation is a vector space, you'll be done.
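To connect this hint with the first answer, here is a small numerical sketch (again on the paraboloid $F(u,v)=(u,v,u^2+v^2)$, my own choice of example): the velocity $\alpha'(0)$ of any surface curve $\alpha = F \circ a$ is $D_qF$ applied to $a'(0)$, i.e. it lies in the image of the Jacobian.

```python
import numpy as np

# Graph surface F(u, v) = (u, v, u^2 + v^2).
F = lambda w: np.array([w[0], w[1], w[0]**2 + w[1]**2])

q = np.array([0.2, 0.1])

# Jacobian of F at q: columns are dF/du and dF/dv evaluated at q.
J = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2 * q[0], 2 * q[1]]])

# A plane curve through q with a'(0) = (3, -1), and alpha = F . a.
a = lambda t: q + np.array([3 * t, -t])
alpha = lambda t: F(a(t))

# Chain rule: alpha'(0) = D_q F (a'(0)) = J @ (3, -1).
h = 1e-6
v = (alpha(h) - alpha(-h)) / (2 * h)
print(v, J @ np.array([3.0, -1.0]))  # the two vectors coincide
```

Since every tangent vector arises this way, $T_pS$ is the image of the linear map $D_qF$, hence a vector space.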