I was trying to understand the derivation of Euler-Lagrange Equations, but there are some things that really don't make sense to me.
For example, in this video at minute 18:50, it is said that we do not want the variation $\eta$ to be zero: $$ \hat{y}(x) = y(x) + \varepsilon \eta(x)\\ \frac{\delta S}{\delta F} = \int_{x_0}^{x_1}{\eta(x) \left \{ \frac{\partial F}{\partial y} - \frac{d}{dx} \left(\frac{\partial F}{\partial y'} \right) \right \} \;dx} = 0\\ \eta \neq 0 \\ \frac{\partial F}{\partial y} - \frac{d}{dx} \left(\frac{\partial F}{\partial y'} \right) = 0 $$
But why? You are seeking THE function that minimizes something; if you allow it to have some variation $\eta(x)$, then it is no longer your function (because you only arrive at the function you are seeking when the variation is $0$), no? I would appreciate some help and an in-depth explanation!
It might be a miscommunication. What we want is for the expression \begin{align} \frac{\delta S}{\delta F}[y;\eta]&=\int_{x_0}^{x_1}\eta(x)\left[\frac{\partial F}{\partial y}\bigg|_{(x,y(x),y'(x))}-\frac{d}{dx}\left(\frac{\partial F}{\partial y'}\bigg|_{(x,y(x),y'(x))}\right)\right]\,dx \end{align} to be zero for all sufficiently smooth $\eta$ which vanish at the endpoints. This is just the condition that a function $y$ is a stationary point of the 'action' functional $S$. Now, if this integral is $0$ for all such $\eta$, then the fundamental lemma of calculus of variations tells us the expression in square brackets must be zero on the entire interval $[x_0,x_1]$. Thus, you obtain the Euler-Lagrange equations.
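As a concrete check (my own example, not from the video), take the arc-length functional, whose stationary curves should be straight lines:
\begin{align}
S[y] &= \int_{x_0}^{x_1}\sqrt{1+y'(x)^2}\,dx, \qquad F(x,y,y') = \sqrt{1+y'^2},\\
\frac{\partial F}{\partial y} &= 0, \qquad \frac{\partial F}{\partial y'} = \frac{y'}{\sqrt{1+y'^2}},\\
0 &= \frac{\partial F}{\partial y} - \frac{d}{dx}\left(\frac{\partial F}{\partial y'}\right) = -\frac{d}{dx}\left(\frac{y'}{\sqrt{1+y'^2}}\right).
\end{align}
So $y'/\sqrt{1+y'^2}$ is constant on $[x_0,x_1]$, hence $y'$ is constant and $y$ is a straight line through the two endpoints, as expected.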
Also, 'minimization' is just a colloquial term. This is really just a condition for stationarity.
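To address the "why allow variation at all" confusion head-on, here is the finite-dimensional analogy (a sketch in my own notation): for each fixed admissible $\eta$, define
\begin{align}
g_\eta(\varepsilon) := S[y+\varepsilon\eta], \qquad g_\eta'(0) = \frac{\delta S}{\delta F}[y;\eta].
\end{align}
Saying $y$ is stationary means $g_\eta'(0)=0$ for every admissible $\eta$, exactly as $x^*$ is a stationary point of $f:\mathbb{R}^n\to\mathbb{R}$ when the directional derivative $\nabla f(x^*)\cdot v$ vanishes for every direction $v$. The variation $\eta$ is never "kept": it is only a test direction, and requiring the derivative to vanish for all $\eta\neq 0$ is precisely what forces the bracketed expression to vanish pointwise.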