Suppose we have a function $f(x,y)$ that is differentiable as many times as we like on $\mathbb{R}^2$. Its gradient is given by
$$ \nabla f (x,y) = \left(f_x,f_y \right)^T $$
The cosine and sine of the angle $\alpha$ that this vector forms with the $x$-axis are given by
$$ \left\{ \begin{array}{l} \cos \alpha = \dfrac{f_x}{\lVert \nabla f \rVert} \\[4pt] \sin \alpha = \dfrac{f_y}{\lVert \nabla f \rVert} \end{array} \right. $$
I also define the unit vector $u_\alpha = (\cos \alpha, \sin \alpha)^T$.
I want to compute the directional derivative $ \nabla_{\alpha} \left( \lVert \nabla f \rVert \right) $, which should be given by
$$ \begin{aligned} \nabla_{\alpha} \left( \lVert \nabla f \rVert \right) &= \langle \nabla \left( \lVert \nabla f \rVert \right) , u_{\alpha} \rangle = \frac{f_{xx} f_x}{\lVert \nabla f \rVert} \cdot \frac{f_x}{\lVert \nabla f \rVert} + \frac{f_{yy} f_y}{\lVert \nabla f \rVert} \cdot \frac{f_y}{\lVert \nabla f \rVert} \\ &= \left(f_{xx} + f_{yy} \right) \cdot \left( \frac{f_x^2}{\lVert \nabla f \rVert^2} + \frac{f_y^2}{\lVert \nabla f \rVert^2} \right) = f_{xx} + f_{yy} = \nabla^2 f \end{aligned} $$
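One way to probe a derivation like this is to evaluate both sides symbolically on a concrete test function. The sketch below does that with SymPy (my own helper name and test function, not part of the question): it builds $\lVert \nabla f \rVert$, takes its gradient, and dots it with the unit vector $\nabla f / \lVert \nabla f \rVert$, so the two expressions can be compared directly.

```python
# Symbolic sanity check: compute <grad(|grad f|), u_alpha> for a
# concrete f and compare it with the Laplacian f_xx + f_yy.
import sympy as sp

x, y = sp.symbols('x y', real=True)

def directional_derivative_of_magnitude(f):
    """Return <grad(|grad f|), u_alpha>, where u_alpha = grad f / |grad f|."""
    fx, fy = sp.diff(f, x), sp.diff(f, y)
    mag = sp.sqrt(fx**2 + fy**2)                 # |grad f|
    mx, my = sp.diff(mag, x), sp.diff(mag, y)    # grad of |grad f|
    return sp.simplify((mx * fx + my * fy) / mag)

f = x**2 + y**2                                   # hypothetical test function
lhs = directional_derivative_of_magnitude(f)
laplacian = sp.simplify(sp.diff(f, x, 2) + sp.diff(f, y, 2))
print(lhs, laplacian)                             # compare the two expressions
```

Running this on a few test functions of your choice shows quickly whether the claimed chain of equalities can hold in general.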
The question is: is this derivation of the Laplacian operator rigorous?
The reason for my question is the following quote, taken from a computer vision book (the topic is edge detection):
For many applications, however, we wish to thin such a continuous gradient image to only return isolated edges, i.e., as single pixels at discrete locations along the edge contours. This can be achieved by looking for maxima in the edge strength (gradient magnitude) in a direction perpendicular to the edge orientation, i.e., along the gradient direction. Finding this maximum corresponds to taking a directional derivative of the strength field in the direction of the gradient and then looking for zero crossings. The desired directional derivative is equivalent to the dot product between a second gradient operator and the result of the first... The gradient dot product with the gradient is called the Laplacian.
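The discrete procedure the quote describes can be sketched with NumPy finite differences (the function name and the synthetic test image below are my own illustration, not taken from the book): compute the gradient, form the strength field $\lVert \nabla f \rVert$, take its gradient, and dot with the unit gradient direction; zero crossings of the result mark thinned edge locations.

```python
# Discrete sketch of the edge-thinning step from the quote:
# directional derivative of the gradient magnitude along the gradient.
import numpy as np

def strength_directional_derivative(img):
    """Directional derivative of the gradient magnitude along the
    gradient direction; its zero crossings localize edges to pixels."""
    gy, gx = np.gradient(img.astype(float))   # first gradient operator
    mag = np.hypot(gx, gy)                    # edge strength field
    my, mx = np.gradient(mag)                 # second gradient operator
    eps = 1e-12                               # guard against division by zero
    return (mx * gx + my * gy) / (mag + eps)  # dot with unit gradient

# Synthetic test image: a smooth step edge (tanh ramp) along the x-axis
xx = np.linspace(-3, 3, 64)
img = np.tile(np.tanh(xx), (64, 1))
d = strength_directional_derivative(img)
# `d` changes sign across the middle columns, where the edge sits
```

Along each row, `d` is positive before the edge and negative after it, so the zero crossing pinpoints the edge to a single pixel column, exactly the thinning behavior the quote describes.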
Thank you.