I'm reading through a paper in which a variational framework for some computer vision feature detection techniques is presented. I'm familiar with variational techniques, though I'm not an expert. The paper contains the following integral (formula (9) in the paper):
$$ E(\vec{n}) = \oint_{0}^{L} \operatorname{sign}(\langle v, \nabla I \rangle) \, \langle \nabla I, \vec{n} \rangle \, ds $$
It is claimed that the corresponding Euler-Lagrange equation is
$$ \operatorname{sign}(\langle v, \nabla I \rangle) \, \Delta I = 0 $$
where $\Delta$ is the Laplacian operator. In the integral above, $I$ is a 2D image, $\vec{n}$ is the normal to a curve $\gamma$ that we want to find, and $v$ is a vector orthogonal to the level set.
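My only guess so far (and I may well be wrong) is that, since the integrand looks like the flux of the vector field $\operatorname{sign}(\langle v, \nabla I \rangle) \, \nabla I$ through the curve, the divergence theorem might be involved. If the sign factor can be treated as locally constant (it only changes where $\langle v, \nabla I \rangle = 0$), then denoting by $\Omega$ the region enclosed by $\gamma$,

$$ \oint_{\gamma} \operatorname{sign}(\langle v, \nabla I \rangle) \, \langle \nabla I, \vec{n} \rangle \, ds = \iint_{\Omega} \nabla \cdot \big( \operatorname{sign}(\langle v, \nabla I \rangle) \, \nabla I \big) \, dA = \iint_{\Omega} \operatorname{sign}(\langle v, \nabla I \rangle) \, \Delta I \, dA, $$

and perhaps varying the region $\Omega$ would leave the integrand evaluated on $\gamma$ as the first variation, which would match the stated equation. But I don't know whether this is the intended argument, or how to make it rigorous.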
I'm quite confused about how the Euler-Lagrange equation might be derived here, and I don't know where to start. Either an explanation or just a clue would be helpful.
Thank you.