In the Wikipedia article http://en.wikipedia.org/wiki/Calculus_of_variations#Example, the first example of the calculus of variations is minimizing the distance between 2 points. In my understanding, the value of a functional $J$ depends on a function $f$ and certain parameters, so we can find the $f$ for which $J$ is minimized. However, in the example, the 2 points are $(x_1,f(x_1))$ and $(x_2,f(x_2))$, which both depend on $f$. That means that for different $f$, these 2 points are different, and so is their minimum distance. So when we say the solved $f$ ($=\frac{y_2-y_1}{x_2-x_1}x+\frac{x_2y_1-x_1y_2}{x_2-x_1}$) gives the minimum distance, what other functions are we comparing it to? Why is this $f$ the desired function?
Doubt on calculus of variation
300 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail).
There are 2 answers below.
The logic goes like this. Suppose $f(x)$ is the function that minimizes the arc-length functional $A[f]$, subject to $f(x_1)= y_1$ and $f(x_2) = y_2$. Since the derivation takes a second derivative, we need $f \in C^2$, although this can probably be relaxed. The derivation then shows that such an $f(x)$ is a linear function.
Incidentally, we have shown that the minimizer stays linear no matter which endpoints $y_1, y_2$ we pick. Of course, the function itself will be different for other $y_1, y_2$, but it is still linear.
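As a quick sanity check (a sketch using sympy, not part of the original derivation), one can verify symbolically that a linear function satisfies the Euler-Lagrange equation for the arc-length integrand $\sqrt{1+(f')^2}$:

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

x, a, b = sp.symbols('x a b')
f = sp.Function('f')

# Arc-length integrand sqrt(1 + (df/dx)^2)
Lag = sp.sqrt(1 + f(x).diff(x)**2)

# Euler-Lagrange equation for this integrand
eq = euler_equations(Lag, f(x), x)[0]

# Substitute the linear candidate f(x) = a*x + b and evaluate the derivatives
residual = eq.lhs.subs(f(x), a*x + b).doit()
print(sp.simplify(residual))  # 0: the linear function satisfies the equation
```

The residual vanishes identically in $a$ and $b$, matching the observation that the minimizer is linear for every choice of endpoints.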
In that example you are only considering functions which have the endpoints $(x_1,y_1)$ and $(x_2,y_2)$. There are many curves connecting those points on a piece of paper, but only one yields the minimal distance.
In this example we know that the minimal curve is the straight line,
$$ L(x) = \frac{y_2-y_1}{x_2-x_1} (x-x_1) + y_1.$$
As an exercise I recommend you substitute the following function into the distance functional. This is different from the straight line but still passes through the two end points $(x_1,y_1)$ and $(x_2,y_2)$ and is therefore in the family of functions being considered,
$$f(x) = L(x) + \lambda \sin\left(\frac{x-x_1}{x_2-x_1} \pi\right)$$
What you should find is that the result depends on $\lambda$, and that as a function of $\lambda$ it attains its minimum at $\lambda = 0$, i.e. at the value obtained by substituting $L$ itself.
Just to be clear the functional I am referring to is,
$$ D\left[f(x)\right] = \int_{x_1}^{x_2} dx \sqrt{1+\left( df/dx \right)^2},$$
notice that $x_1$ and $x_2$ appear explicitly in the bounds of the integral.
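To make the exercise concrete, here is a small numerical sketch. The end points $(0,0)$ and $(1,2)$ are hypothetical, chosen only for illustration; the code evaluates $D[f]$ for a few values of $\lambda$ and shows the minimum at $\lambda = 0$, where $f$ reduces to the straight line:

```python
import numpy as np

# Hypothetical end points, chosen only for illustration
x1, y1, x2, y2 = 0.0, 0.0, 1.0, 2.0

def arc_length(lam, n=200001):
    """Trapezoidal approximation of D[f] for f(x) = L(x) + lam*sin(pi*(x - x1)/(x2 - x1))."""
    x = np.linspace(x1, x2, n)
    slope = (y2 - y1) / (x2 - x1)
    # df/dx = L'(x) + lam*(pi/(x2 - x1))*cos(pi*(x - x1)/(x2 - x1))
    dfdx = slope + lam * (np.pi / (x2 - x1)) * np.cos(np.pi * (x - x1) / (x2 - x1))
    integrand = np.sqrt(1.0 + dfdx**2)
    dx = x[1] - x[0]
    return dx * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

print(arc_length(0.0))           # length of the straight line, sqrt(5) = 2.2360679...
for lam in (0.1, 0.5, 1.0):
    print(lam, arc_length(lam))  # strictly larger than the straight-line length
```

Every $\lambda \neq 0$ produces a longer curve, which is exactly the comparison the question asks about: the competitors are all curves through the same two fixed points.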
Edit: An attempt to address the comment,
The solution cited uses the Euler-Lagrange equation to derive the differential equation that $L(x)$ satisfies. The derivation of this equation requires that the end points be fixed. By fixed I mean that all allowed functions pass through $(x_1,y_1)$ and $(x_2,y_2)$.
The way this is done is to suppose that the minimizing function exists and then to write a generic function in terms of the minimizer and an auxiliary function $\eta$:
$$ f(x) = L(x) + \eta(x) $$
where the function $\eta(x)$ must equal $0$ at $x_1$ and $x_2$ in order for $f(x)$ and $L(x)$ to have common end points.
We will derive the differential equation for $L$ in the case of the distance functional. For the purpose of this derivation we will suppose that the function $\eta(x)$ is a small deviation from $L$, so that we may neglect higher-order powers of $\eta$ (and of $d\eta/dx$).
$$ D\left[f\right] = \int_{x_1}^{x_2} dx \sqrt{1+(dL/dx+d\eta/dx)^2} $$
$$ = \int_{x_1}^{x_2} dx \sqrt{1+(dL/dx)^2+2(dL/dx)(d\eta/dx)} $$
$$ = \int_{x_1}^{x_2} dx \sqrt{1+(dL/dx)^2}\sqrt{1+2\frac{(dL/dx)(d\eta/dx)}{1+(dL/dx)^2}} $$
$$ = \int_{x_1}^{x_2} dx \sqrt{1+(dL/dx)^2}\left(1+\frac{(dL/dx)(d\eta/dx)}{1+(dL/dx)^2}\right) $$
$$ = \int_{x_1}^{x_2} dx \left[ \sqrt{1+(dL/dx)^2}+\frac{dL/dx}{\sqrt{1+(dL/dx)^2}} \frac{d\eta}{dx} \right] $$
Integrating the second term by parts we get,
$$ = \int_{x_1}^{x_2} dx \sqrt{1+(dL/dx)^2}-\int_{x_1}^{x_2} dx \frac{d}{dx}\left[\frac{dL/dx}{\sqrt{1+(dL/dx)^2}}\right]\eta + \left[ \frac{dL/dx}{\sqrt{1+(dL/dx)^2}} \eta(x) \right]_{x_1}^{x_2} $$
This is the crucial step: the fixed end points matter here because they tell us that $\eta(x_1)=\eta(x_2)=0$. Therefore the last term in the line above is equal to zero, leaving us with,
$$ = \int_{x_1}^{x_2} dx \sqrt{1+(dL/dx)^2}-\int_{x_1}^{x_2} dx \frac{d}{dx}\left[\frac{dL/dx}{\sqrt{1+(dL/dx)^2}}\right]\eta $$
$$ = D\left[ L(x) \right] -\int_{x_1}^{x_2} dx \frac{d}{dx}\left[\frac{dL/dx}{\sqrt{1+(dL/dx)^2}}\right]\eta $$
Now, by assumption $D[L(x)]$ is a minimum, so the first-order deviation from this minimum must vanish. Since $\eta$ is an arbitrary function, this can only happen if the factor multiplying $\eta$ in the integrand is identically zero:
$$ \frac{d}{dx}\left[\frac{dL/dx}{\sqrt{1+(dL/dx)^2}}\right] = 0 $$
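As a side remark (writing $L' = dL/dx$ for brevity), carrying out this derivative directly by the quotient rule gives the same conclusion without integrating:

$$ \frac{d}{dx}\left[\frac{L'}{\sqrt{1+L'^2}}\right] = \frac{L''\sqrt{1+L'^2} - L'\,\dfrac{L'L''}{\sqrt{1+L'^2}}}{1+L'^2} = \frac{L''}{\left(1+L'^2\right)^{3/2}} = 0, $$

which forces $L'' = 0$, i.e. $L$ is linear.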
Integrating we get,
$$ \frac{dL/dx}{\sqrt{1+(dL/dx)^2}} = C_1$$
$$\Rightarrow (1-C_1^2)\left(\frac{dL}{dx}\right)^2 = C_1^2$$
$$\Rightarrow \frac{dL}{dx} = \sqrt{\frac{C_1^2}{1-C_1^2}}$$
$$\Rightarrow L= x\sqrt{\frac{C_1^2}{1-C_1^2}}+C_2$$
Relabeling the constants,
$$ L = ax+b,$$
and we note that $L$ only passes through the correct end points if $a$ and $b$ are chosen as indicated in the first part of the answer.
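As a quick symbolic check (a sketch using sympy, not part of the original answer), solving the end-point conditions for $a$ and $b$ recovers exactly the slope and intercept quoted in the question:

```python
import sympy as sp

a, b, x1, x2, y1, y2 = sp.symbols('a b x_1 x_2 y_1 y_2')

# Impose L(x1) = y1 and L(x2) = y2 on L(x) = a*x + b
sol = sp.solve([a*x1 + b - y1, a*x2 + b - y2], [a, b], dict=True)[0]

slope = sp.simplify(sol[a])      # (y_2 - y_1)/(x_2 - x_1)
intercept = sp.simplify(sol[b])  # (x_2*y_1 - x_1*y_2)/(x_2 - x_1)
print(slope)
print(intercept)
```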