Why $\Sigma$ is minimal, if $\frac{d}{dt} |_{t=0} \mathrm{Area}(\Sigma_t)=0$?


In this paper, http://arxiv.org/pdf/1204.2883v1.pdf, Martin Li claims that $\Sigma\subset M$ is minimal and $\Sigma$ meets $\partial M$ orthogonally along $\partial \Sigma$ if and only if, for every admissible variation,

$$0 = \frac{d}{dt} |_{t=0} \mathrm{Area}(\Sigma_t) = - \int_{\Sigma} \langle H, X\rangle da + \int_{\partial \Sigma} \langle X, \nu\rangle ds,$$

where $\nu$ is the outward conormal vector of $\partial \Sigma$ in $\Sigma$ (i.e. the outward unit normal of $\partial \Sigma$ tangent to $\Sigma$), $H$ is the mean curvature vector of $\Sigma$ in $M$, and $X$ is the variational vector field associated with the smooth family $\{\Sigma_t\}$.

I don't see why this is clear. Why does each term on the right-hand side vanish?

Can someone help me? Thank you!


On BEST ANSWER

I haven't read the article, but I believe the reasoning is:

1. $H$ is zero since $\Sigma$ is minimal, so the first term vanishes.
2. An admissible variation keeps $\partial\Sigma$ inside $\partial M$, so $X$ is tangent to $\partial M$ along $\partial\Sigma$; since $\Sigma$ meets $\partial M$ orthogonally, the conormal $\nu$ is normal to $\partial M$ there. Hence $\langle X, \nu\rangle = 0$ and the second term vanishes as well.
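For the converse direction of the "if and only if" (not spelled out above), here is a standard sketch, assuming the first variation vanishes for every admissible $X$:

```latex
% Converse: suppose the first variation vanishes for all admissible X.
% Step 1 (interior): take X compactly supported in the interior of \Sigma,
% so the boundary integral drops out and we are left with
\[
  \int_{\Sigma} \langle H, X\rangle \, da = 0
  \quad \text{for all such } X
  \;\Longrightarrow\; H \equiv 0,
\]
% i.e. \Sigma is minimal, by the fundamental lemma of the calculus of
% variations.
% Step 2 (boundary): with H = 0 the identity reduces to
\[
  \int_{\partial\Sigma} \langle X, \nu\rangle \, ds = 0
  \quad \text{for all } X \text{ tangent to } \partial M
  \text{ along } \partial\Sigma.
\]
% Taking X along \partial\Sigma to be the tangential projection of \nu
% onto T(\partial M) forces that projection to vanish, hence
% \nu \perp T(\partial M): \Sigma meets \partial M orthogonally.
```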