It is common in the calculus of variations that the integrand $f$ of a functional $F[y]=\int f(y,y',x)\,dx$ is written as a function of $y$, $y'$ and $x$. Why do we regard the derivative $y'$ as a variable independent of $y$? And why does $x$ not appear explicitly in $\displaystyle \delta F=\int\delta f~dx=\int\left(\frac{\partial f}{\partial y}\delta y+\frac{\partial f}{\partial y'}\delta y'\right)~dx$ ?
Independence of function and its derivative in calculus of variations
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 best solutions below.
The derivative of a function is only loosely related to the base function: having proved something for a function (or its derivative), very little follows for the other one.
However, I think what looks suspicious to you is the list of arguments in the integrand $f(x,y,y')$. In this regard: remind yourself that the argument list (mostly omitted anyway) is only a reminder to the reader and specifies which quantities a variable or function explicitly depends on. If your integrand did not depend explicitly on $x$, for example, the Euler–Lagrange equations would reveal a conserved quantity, $f-y'\,\partial f/\partial y'$ (in mechanics this is conservation of energy; independence of $y$ instead gives the generalized momentum conservation $\partial f/\partial y'=\mathrm{const}$). If the problem does not involve $y'$, the resulting Euler–Lagrange equations reduce to an algebraic condition and can be solved by especially simple means. As pointed out by the other answer, your functional can also involve higher derivatives of $y$ up to some order $n$; then you have to acknowledge this fact and use slightly different equations to solve the problem. What matters is that the argument list is there only to specify what you are dealing with.
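The conserved quantity for an $x$-independent integrand can be verified in a few lines. Assuming $f=f(y,y')$ has no explicit $x$-dependence and $y$ solves the Euler–Lagrange equation, the chain rule gives the Beltrami identity:

```latex
\frac{d}{dx}\left(f - y'\frac{\partial f}{\partial y'}\right)
  = \frac{\partial f}{\partial y}\,y' + \frac{\partial f}{\partial y'}\,y''
    - y''\frac{\partial f}{\partial y'} - y'\frac{d}{dx}\frac{\partial f}{\partial y'}
  = y'\left(\frac{\partial f}{\partial y} - \frac{d}{dx}\frac{\partial f}{\partial y'}\right)
  = 0 .
```

The last factor vanishes precisely because $y$ is an extremal, so $f - y'\,\partial f/\partial y'$ is constant along solutions.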
Thus I hope I could make clear that the list of arguments $f(x,y,y')$ says nothing about dependence between the arguments themselves: $y$ is by definition dependent on $x$, otherwise $y'$ could not be a nonvanishing quantity!
The variable $x$ does not explicitly appear in the variation because the variation is carried out with respect to the function $y$. If you consider a family of functions depending on a parameter $\epsilon$ and walk through the definition of the variation, you can see that the variation is essentially a partial derivative with respect to this $\epsilon$. Since a partial derivative treats every other quantity as fixed, you can, for illustration purposes, regard every variation as taken at fixed $x$!
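This "derivative with respect to $\epsilon$" view can be checked numerically. The following minimal Python sketch uses an illustrative integrand of my own choosing, $f=(y'^2-y^2)/2$ on $[0,\pi]$: for the extremal $y=\sin x$ (which satisfies $y''+y=0$) and a perturbation $\phi$ vanishing at the endpoints, the difference quotient $(F[y+\epsilon\phi]-F[y-\epsilon\phi])/(2\epsilon)$ comes out as zero.

```python
import math

def trapezoid(g, a, b, n=4000):
    """Composite trapezoid rule for ∫_a^b g(x) dx."""
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))
    return s * h

def F(y, yp):
    # F[y] = ∫_0^π (y'^2 - y^2)/2 dx, with y' supplied analytically
    return trapezoid(lambda x: 0.5 * (yp(x) ** 2 - y(x) ** 2), 0.0, math.pi)

y, yp = math.sin, math.cos               # extremal of F: y'' + y = 0
phi = lambda x: math.sin(2 * x)          # perturbation with phi(0) = phi(pi) = 0
phip = lambda x: 2 * math.cos(2 * x)

# Central difference in epsilon: the variation delta F at y in direction phi
eps = 1e-5
dF = (F(lambda x: y(x) + eps * phi(x), lambda x: yp(x) + eps * phip(x))
      - F(lambda x: y(x) - eps * phi(x), lambda x: yp(x) - eps * phip(x))) / (2 * eps)
print(dF)  # ≈ 0: sin is an extremal, so the first variation vanishes
```

Note that $x$ is only an integration variable here: the perturbation shifts the function values at each fixed $x$, which is exactly the sense in which the variation is "at fixed $x$".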
Hope this clarifies your 2 questions specifically. Otherwise please specify more clearly where the confusion lies.
A functional $F$ is a map that takes functions from an appropriate function space and returns numbers. If the functional $F$ is represented through an integral as in the OP, then the Lagrangian $f=f(x,y,y')$ is generally seen as a function of the variable $x$ and the functions $y$, $y'$. The Lagrangian can depend only on $x$ and $y$, i.e. $f=f(x,y)$, or on higher derivatives, as in the case $f=f(x,y,y',y'',\dots)$. The number and order of the derivatives considered in the Lagrangian depends on the application. The Euler–Lagrange equations are also dependent on the structure of the Lagrangian, of course.
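For completeness: when $f$ depends on derivatives up to order $n$, the Euler–Lagrange equation generalizes to the higher-order (Euler–Poisson) form

```latex
\frac{\partial f}{\partial y}
  - \frac{d}{dx}\frac{\partial f}{\partial y'}
  + \frac{d^2}{dx^2}\frac{\partial f}{\partial y''}
  - \cdots
  + (-1)^n \frac{d^n}{dx^n}\frac{\partial f}{\partial y^{(n)}} = 0 ,
```

which for $n=1$ reduces to the usual equation; each extra derivative in $f$ contributes one more integration by parts and hence one more sign-alternating term.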
On the Euler–Lagrange equations: the extremals of $F$, if they exist, are solutions of the equation $\delta F=0$, with $$\delta F=\frac{dF}{d\epsilon}\Big|_{\epsilon=0}:=\lim_{\epsilon\rightarrow 0}\frac{F[y+\epsilon\phi]-F[y]}{\epsilon},$$ for all admissible perturbations $\phi$ (called variations), such that the boundary values are unchanged: $y(a)=(y+\epsilon\phi)(a)$, $y(b)=(y+\epsilon\phi)(b)$, i.e. $\phi(a)=\phi(b)=0$. The resulting equations for $y$ (and not $\phi$!) are called the Euler–Lagrange equations. For more examples I refer to these lecture notes.
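Carrying out this derivative for $F[y]=\int_a^b f(x,y,y')\,dx$ and integrating by parts makes the connection to the formula in the question explicit:

```latex
\delta F
  = \int_a^b \left(\frac{\partial f}{\partial y}\,\phi
      + \frac{\partial f}{\partial y'}\,\phi'\right) dx
  = \left[\frac{\partial f}{\partial y'}\,\phi\right]_a^b
    + \int_a^b \left(\frac{\partial f}{\partial y}
      - \frac{d}{dx}\frac{\partial f}{\partial y'}\right)\phi \; dx .
```

The boundary term vanishes because $\phi(a)=\phi(b)=0$, and since $\phi$ is otherwise arbitrary, the fundamental lemma of the calculus of variations yields the Euler–Lagrange equation $\frac{\partial f}{\partial y}-\frac{d}{dx}\frac{\partial f}{\partial y'}=0$.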
In other words, one searches for those functions $y$ whose rate of change of $F$ is zero under every admissible perturbation $\phi$ "near" to zero. The mathematical meaning of "perturbing" and "admissible" is expressed by the formulation above for $\phi$.
You can check this answer for additional details on the Euler–Lagrange equations; in particular, you can find information on why one does not vary $dx$ in the variational problem.