Can the directional derivative be considered a total derivative?


I guess I am having a little problem relating several concepts in one package.

The partial derivative vs. the total derivative: the total derivative appears to be a placeholder, whereas the partial derivative assumes the other variable is fixed while one variable varies, or vice versa.

Now, the directional derivative is not a vector at all, but a unit vector is used in its calculation. The fact that the unit vector has length one gets you off the hook, but the result is NOT a vector; it's something that looks like a total differential.

Lastly, and not strictly part of the question, although it begs not to be ignored: if you take the dot product of the gradient, which is a vector expressed in partial derivatives, with the unit vector, you get the directional derivative.

And here is the fun part: the book seems to pat itself on the back that this will be the direction of maximum ascent or descent. And there lies my other confusion: the gradient has no degrees of freedom at a given point $(x, y)$, so it has no choice; any other point isn't even on the graph. So why do people chase their tail with the directional derivative? Obviously I am missing the point of the directional derivative, so my question is a simple one, just to get started: is the directional derivative a total derivative?

Allow me to start with that first.


There are 2 answers below.

BEST ANSWER

Directional derivatives are a generalization of partial derivatives to directions that need not coincide with the coordinate axes.

For example, for a function of $2$ variables, for any $\vec v=(a,b)$ we define the directional derivative as:

$$\frac{\partial f}{\partial \vec v}=\lim_{h\to 0}\frac{f(x_0+ah,y_0+bh)-f(x_0,y_0)}{h}$$
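To make the definition concrete, here is a minimal numerical sketch; the function $f(x,y)=x^2+3y$, the point $(1,2)$, and the direction $(0.6, 0.8)$ are illustrative choices of mine, not part of the answer. The difference quotient above is evaluated at a small $h$ and compared with the exact value.

```python
# Approximate the directional derivative of f(x, y) = x^2 + 3y
# at (x0, y0) = (1, 2) along the unit vector v = (a, b) = (0.6, 0.8).
def f(x, y):
    return x**2 + 3*y

x0, y0 = 1.0, 2.0
a, b = 0.6, 0.8          # a^2 + b^2 = 1, so v is a unit vector
h = 1e-6

# Difference quotient from the definition above.
quotient = (f(x0 + a*h, y0 + b*h) - f(x0, y0)) / h

# Exact value: grad f = (2x, 3), so D_v f(1, 2) = 2*0.6 + 3*0.8 = 3.6.
print(quotient)  # close to 3.6
```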

and often the directional derivative is defined assuming $\vec v$ is a unit vector; indeed, it can be shown that:

$$\frac{\partial f}{\partial \lambda \vec v}=\lim_{h\to 0}\lambda\frac{f(x_0+\lambda ah,y_0+\lambda bh)-f(x_0,y_0)}{\lambda h}=\lambda \frac{\partial f}{\partial \vec v}$$
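A quick numerical check of this scaling property, using the same kind of difference quotient (the function, point, and $\lambda$ below are illustrative assumptions):

```python
# Check that the derivative along lam*v is lam times the derivative along v,
# for f(x, y) = x^2 + 3y at (1, 2), v = (0.6, 0.8), lam = 2.5.
def f(x, y):
    return x**2 + 3*y

x0, y0 = 1.0, 2.0
a, b = 0.6, 0.8
lam = 2.5
h = 1e-6

d_v = (f(x0 + a*h, y0 + b*h) - f(x0, y0)) / h              # approx 3.6
d_lam_v = (f(x0 + lam*a*h, y0 + lam*b*h) - f(x0, y0)) / h  # approx 2.5 * 3.6 = 9.0

print(d_v, d_lam_v)
```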

Note that the partial derivatives themselves are exactly the directional derivatives corresponding to the unit vectors $(1,0)$ and $(0,1)$.

Finally, when the total derivative exists, that is, when $f$ is differentiable, the following holds:

$$\frac{\partial f}{\partial \vec v}=\nabla f\cdot\vec v=v_1\frac{\partial f}{\partial x}+v_2\frac{\partial f}{\partial y}$$
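This identity can also be checked numerically: approximate the two partial derivatives by central differences, dot the resulting gradient with $\vec v$, and compare with the direct difference quotient along $\vec v$. The test function $f(x,y)=x^2 y$ and the numbers are illustrative assumptions.

```python
def f(x, y):
    return x**2 * y

x0, y0 = 1.0, 2.0
v1, v2 = 0.6, 0.8        # unit vector
h = 1e-6

# Central-difference approximations of the partial derivatives.
fx = (f(x0 + h, y0) - f(x0 - h, y0)) / (2*h)   # exact: 2*x0*y0 = 4
fy = (f(x0, y0 + h) - f(x0, y0 - h)) / (2*h)   # exact: x0**2 = 1

grad_dot_v = v1*fx + v2*fy                     # approx 0.6*4 + 0.8*1 = 3.2

# Direct difference quotient along v.
direct = (f(x0 + v1*h, y0 + v2*h) - f(x0, y0)) / h

print(grad_dot_v, direct)   # both close to 3.2
```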


Regarding the direction of maximum ascent, fix $\vec{x} = (x_1, x_2)$ such that $\nabla f(\vec{x}) \ne \vec{0}$, and define $$ \vec{u} := \frac{\nabla f(\vec{x})}{\lVert \nabla f(\vec{x}) \rVert} $$ ($\vec{u}$ is the "positive" direction of the gradient, sort of). We now want to compare the ascent rate along various unit vectors $\vec{v}$. Any such $\vec{v}$ can be written as ${\alpha}\vec{u} + {\beta} \vec{w}$, where $\alpha^2 + \beta^2 = 1$ and $\vec{w} \cdot \vec{u} = 0$ (in other words, $\vec{w}$ is orthogonal to the gradient). We consider $$ \tag{$*$} \lim\limits_{h \to 0^+} \frac{f(\vec{x} + h \vec{v}) - f(\vec{x})}{h}. $$ By differentiability, one can write $$ f(\vec{x} + h \vec{v}) = f(\vec{x} + h(\alpha \vec{u} + \beta \vec{w})) = f(\vec{x}) + h(\alpha \vec{u} + \beta \vec{w}) \cdot \nabla f(\vec{x}) + o(h), $$ which, since $\vec{w} \cdot \nabla f(\vec{x}) = 0$ and $\vec{u} \cdot \nabla f(\vec{x}) = \lVert \nabla f(\vec{x}) \rVert$, is equal to $$ f(\vec{x}) + h \alpha \lVert \nabla f(\vec{x}) \rVert + o(h), $$ where $o(h)/h \to 0$ as $h \to 0^+$. Plugging the above into ($*$), we obtain that the ascent rate along $\vec{v}$ is $\alpha \lVert \nabla f(\vec{x}) \rVert$; hence, among unit vectors, the maximum ascent rate is along the positive direction of the gradient ($\alpha = 1$, $\beta = 0$), and the minimum ascent rate is along the negative direction ($\alpha = -1$, $\beta = 0$).
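The conclusion can be illustrated numerically: sweep unit vectors $(\cos\theta, \sin\theta)$, estimate the ascent rate along each via the difference quotient, and check that the maximum occurs in the direction of the gradient with rate $\lVert\nabla f(\vec{x})\rVert$. The function $f(x,y)=x^2 y$ and the point are, again, illustrative assumptions.

```python
import math

def f(x, y):
    return x**2 * y

x0, y0 = 1.0, 2.0        # grad f(1, 2) = (2*x0*y0, x0**2) = (4, 1)
h = 1e-6

# Sweep directions (cos t, sin t) and keep the one with the largest ascent rate.
best_rate, best_theta = -float("inf"), None
for k in range(3600):
    theta = 2 * math.pi * k / 3600
    v1, v2 = math.cos(theta), math.sin(theta)
    rate = (f(x0 + v1*h, y0 + v2*h) - f(x0, y0)) / h
    if rate > best_rate:
        best_rate, best_theta = rate, theta

grad_norm = math.hypot(4.0, 1.0)     # ||grad f|| = sqrt(17)
grad_theta = math.atan2(1.0, 4.0)    # angle of the gradient direction

print(best_rate, grad_norm)    # maximum rate is approx ||grad f||
print(best_theta, grad_theta)  # maximizing direction is approx the gradient direction
```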

Another important property of the gradient is that it is everywhere orthogonal to the level sets of the function. The proof requires an application of the Implicit Function Theorem.

Finally, I tacitly assumed that $f$ is of class $C^1$ (that is, its partial derivatives exist and are continuous), which guarantees differentiability.