Shortest distance between two points via calculus of variations


This problem might sound trivial, but solving it with the calculus of variations turns out not to be so straightforward.

Suppose we have fixed boundary conditions $f(a) = f(b) = 0$ and we want to find the shortest path between two points, so we choose $f$ to minimize the functional $$ I(f) = \int_a^b \sqrt{1+f'(x)^2}\, dx. $$ We solved this in class and got $f'' = 0$, which gives a straight line. However, could we have just solved the problem $$ \min I^*(f) = \int_a^b f'(x)^2\, dx, $$ since the new integrand is a monotonic transformation of the original one? I know this works in ordinary calculus, but I'm not sure it does when dealing with functionals. It gives me the same answer ($f'' = 0$), but is that just coincidental? The second problem involves significantly less work. Thanks!
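As a quick numerical sanity check of the question's observation (not a proof, and the helper names below are made up for illustration): discretizing both functionals with forward differences, the straight line through the boundary data beats a randomly perturbed curve under both the arclength functional and the squared-derivative one.

```python
import math, random

def arclength(ys, h):
    # discretized I(f) = sum_i sqrt(1 + slope_i^2) * h, forward differences
    return sum(math.sqrt(1 + ((ys[i+1] - ys[i]) / h) ** 2) * h
               for i in range(len(ys) - 1))

def dirichlet(ys, h):
    # discretized I*(f) = sum_i slope_i^2 * h
    return sum(((ys[i+1] - ys[i]) / h) ** 2 * h for i in range(len(ys) - 1))

a, b, n = 0.0, 1.0, 100
h = (b - a) / n
flat = [0.0] * (n + 1)   # the straight line f == 0, the claimed extremal
random.seed(0)
wiggly = [0.0] + [0.1 * random.uniform(-1, 1) for _ in range(n - 1)] + [0.0]

# the straight line beats the perturbed curve under BOTH functionals
assert arclength(flat, h) < arclength(wiggly, h)
assert dirichlet(flat, h) < dirichlet(wiggly, h)
```

Of course this only shows both functionals share the minimizer for this boundary data; the answers below address whether that is a coincidence.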


There are 4 answers below.

---

This is an extended comment, not an answer, just to share my thoughts.

You mention that monotone transformations of functions yield equivalent optimization problems. This is true because, taking $g$ to be monotone increasing so that $g'>0$ everywhere on its domain, $$\frac{d}{dx}g(f(x))=g'(f(x))f'(x)=0\iff f'(x)=0$$ since $g'$ is never zero.

But I simply cannot find a way to extend this argument to the case of variational problems. If we have a solution to the transformed problem with $g(f(x,y,y'))$ (in your case $g(t)=t^2-1$ and the domain is $t\ge 0$) then we know that $$\frac{\partial }{\partial y}g\big(f(x,y,y')\big)=\frac{d}{dx}\frac{\partial }{\partial y'}g\big(f(x,y,y')\big)$$ which expands to $$g'(f)\frac{\partial f}{\partial y}=g''(f)\frac{df}{dx}\frac{\partial f}{\partial y'}+g'(f)\frac{d}{dx}\frac{\partial f}{\partial y'}.$$ We want this to imply that $$\frac{\partial f}{\partial y}=\frac{d}{dx}\frac{\partial f}{\partial y'}$$ so that we can conclude (under suitable conditions) that we have optimized the original problem. But I can find no way to realize that implication, given only that $g'(t)>0$.

Clearly the proof is done if we have $g''(t)\equiv 0$, that is, if $g$ is a linear function. Unfortunately yours is quadratic.

---

I would say that the result is coincidental.

Consider minimization of the functional: $$ I(f)=\int_a^b\sqrt{{f(x)}^2+{f'(x)}^2}\;dx.\tag1 $$

If I correctly understand your point, the alternative way in this case would be the minimization of $$ I^*(f)=\int_a^b\left({f(x)}^2+{f'(x)}^2\right)\;dx.\tag2 $$ But these two functionals lead to completely different Euler-Lagrange equations: $$\begin{align} f(f^2+2f'^2-f''f)=0;\tag{1'}\\ f-f''=0,\tag{2'} \end{align}$$ which have entirely different solutions (except in the trivial case $f(a)=f(b)=0$, where both give $f\equiv 0$).
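As a concrete check of this point, using the standard fact that $f(x)=\cosh x$ solves $(2')$, one can verify numerically that it fails $(1')$, so the two Euler-Lagrange equations really do pick out different functions once the trivial solution is excluded:

```python
import math

# f(x) = cosh(x) solves (2'): f - f'' = 0, since cosh'' = cosh.
# Check that it does NOT satisfy (1'): f (f^2 + 2 f'^2 - f'' f) = 0.
f, fp, fpp = math.cosh, math.sinh, math.cosh
for x in (0.5, 1.0, 2.0):
    assert abs(f(x) - fpp(x)) < 1e-12          # satisfies (2')
    residual = f(x) * (f(x) ** 2 + 2 * fp(x) ** 2 - fpp(x) * f(x))
    assert residual > 0.1                      # violates (1') away from x = 0
```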

---

Consider the general functional

$$\int_a^b\Phi(f'(x))\,dx.$$

The Euler-Lagrange equations read

$$\frac{d}{dx}\frac\partial{\partial f'}\Phi(f')=0$$

which, written out via the chain rule, becomes

$$\frac{d}{dx}\Phi'(f')=f''\Phi''(f')=0.$$

Hence $f''(x)=0$ is always a solution; the alternative, a constant slope $f'(x)=c$ with $\Phi''(c)=0$, is also a straight line.
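This can be checked numerically: for a straight line, the Euler-Lagrange residual $\frac{d}{dx}\Phi'(f')$ vanishes for any smooth $\Phi$. The sketch below (function names are illustrative) estimates the residual with central differences for three integrands, including the arclength and squared-derivative ones from the question:

```python
import math

def el_residual(phi, f, x, h=1e-4):
    # estimate d/dx of Phi'(f'(x)); Phi' and f' via central differences
    def fprime(t):
        return (f(t + h) - f(t - h)) / (2 * h)
    def phiprime(u):
        return (phi(u + h) - phi(u - h)) / (2 * h)
    return (phiprime(fprime(x + h)) - phiprime(fprime(x - h))) / (2 * h)

line = lambda x: 3.0 * x - 2.0             # straight line, f'' = 0
for phi in (lambda u: math.sqrt(1 + u * u),  # arclength
            lambda u: u * u,                 # squared derivative
            lambda u: u - u ** 3):           # a cubic example
    for x in (0.0, 0.7, 1.5):
        assert abs(el_residual(phi, line, x)) < 1e-5
```

By contrast, on a curve with non-constant slope the residual is generally nonzero, e.g. `el_residual(lambda u: u*u, lambda x: x*x, 0.5)` is about $4$, since $\frac{d}{dx}2f' = 2f'' = 4$ there.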

---

Yes, it is expected, not coincidental, and the Euler-Lagrange equation shows why. The effect is the same in the two cases you mention, but the cause is not the monotonic transformation of the integrand. Rather, both integrands belong to a single class of functionals that the Euler-Lagrange procedure treats uniformly: those depending on $y'$ alone. Let us see how.

Applying the Euler-Lagrange equation to any of the integrands

$$ \sqrt{1+f'(x)^2},\qquad f'(x)^2,\qquad \frac{1-f'(x)^2}{\sin\big(1+f'(x)^7\big)},\;\dots$$

leads to the very same solution, viz. $y' =$ a constant: straight lines.

This happens because the integrand is a pure function of the single variable $y'$.

For example, take another hypothetical functional,

$$ \int y' (1- y'^2)\,dx; $$

after extremization the solution is the same again, as you can verify with the Euler-Lagrange equation.

More generally, any functional whose integrand depends purely on $y'$ (with no explicit $x$ or $y$) leads to the same straight-line solutions, viz. $y' = \text{const}$, $y'' = 0$.

Here is a proof that any functional depending purely on $y'$ leads to straight-line solutions.

$$ L= \int f(y')\, dx \tag1$$ Since the integrand involves no explicit $x$, the Beltrami identity gives:

$$ f(y') -y'\cdot \dfrac{df(y')}{dy'}=C_1 \tag2 $$

(The partial derivative is a full derivative here, since $f$ depends on the single variable $y'$.)

For brevity, let $u= y'$ serve as the independent variable.

$$ f(u)-u\cdot\dfrac{d f(u)}{du}=C_1\tag3$$

$$\dfrac{df(u)}{f(u)-C_1}=\dfrac{du}{u} \tag4$$

Integrating (assuming $f(u)\ne C_1$ and $u\ne 0$), $$ \log \big[f(u)-C_1\big]= \log u + \log C_2 \tag5 $$

$$ f(u)= C_2 \;u +C_1\tag6 $$

$$ f(y')= C_2\; y'+ C_1\tag7 $$

Equation (7) says the separation of variables in (4) is consistent with a varying $u$ only if $f$ is linear in $u$. But a linear integrand is degenerate:

$$ I=\int (C_2\, y' + C_1)\, dx = C_2\big(y(b)-y(a)\big)+C_1(b-a) \tag8 $$

depends only on the endpoint values, so every admissible curve is an extremal. For any non-linear $f$, the identity (3) can therefore hold only if $u=y'$ is constant along the extremal:

$$ y''=0 \rightarrow y= C_3 x + C_4 \tag9 $$

straight lines in the plane, for all such functionals as a class.
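A numerical illustration of the Beltrami identity (2) (variable names are illustrative): along a straight line $u=y'$ is constant, so $f(u)-u\,f'(u)$ is constant for any $f$; along a curve with varying slope it generally is not, which is why non-linear integrands force straight lines.

```python
import math

f = lambda u: math.sqrt(1 + u * u)       # non-linear integrand (arclength)
fp = lambda u: u / math.sqrt(1 + u * u)  # its derivative
beltrami = lambda u: f(u) - u * fp(u)    # closed form: 1 / sqrt(1 + u^2)

# straight line y = 3x - 2: u = 3 everywhere -> Beltrami value is constant
vals_line = [beltrami(3.0) for x in (0.0, 0.5, 1.0)]
assert max(vals_line) - min(vals_line) < 1e-12

# parabola y = x^2: u = 2x varies -> Beltrami value is NOT constant,
# so the parabola cannot be an extremal of this functional
vals_par = [beltrami(2 * x) for x in (0.0, 0.5, 1.0)]
assert max(vals_par) - min(vals_par) > 0.1
```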