Necessary condition for a curve to provide a weak extremum.
Let $x(t)$ be the extremum curve.
Let $x = x(t,u) = x(t) + u\eta(t)$ be a curve obtained by varying $x(t)$ within the neighbourhood $(\varepsilon,\varepsilon')$.
Let $I(u) = \int^b_aL(t,x(t,u),\dot{x}(t,u))dt = \int^b_aL(t,x(t) + u\eta(t),\dot{x}(t) + u\dot{\eta}(t))dt$
Taylor’s theorem indicates that, for $u$ sufficiently small, $I(u)$ can be represented by $$I(u) = I(0) + u \left(\frac{\textrm{d}I}{\textrm{d}u}\right)_{u=0} + O(u^2)$$
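(For concreteness, and assuming $L$ is smooth enough to differentiate under the integral sign, the coefficient of $u$ here is the usual first variation:
$$\left(\frac{\textrm{d}I}{\textrm{d}u}\right)_{u=0} = \int^b_a\left[\frac{\partial L}{\partial x}\,\eta(t) + \frac{\partial L}{\partial \dot{x}}\,\dot{\eta}(t)\right]\textrm{d}t,$$
where the partial derivatives are evaluated along the extremum curve $x(t)$.)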
My question 1 is: why does $u$ need to be small for this Taylor expansion of $I(u)$ to hold? Doesn't the Taylor series work for $u$ of any size?
Suppose the curve $x(t,u) = x(t) + u\eta(t)$ yields a minimum value. Then, because the curve is in the neighbourhood of $(\varepsilon,\varepsilon')$, the condition $u \left(\frac{\textrm{d}I}{\textrm{d}u}\right)_{u=0} = 0$ must necessarily hold as a result of the Taylor expansion above.
My question 2 is: why must it hold as a result of the Taylor expansion? Shouldn't $\left(\frac{\textrm{d}I}{\textrm{d}u}\right)_{u=0} = 0$ hold in general, simply because $I(u)$ has an extremum at $u=0$?
Let me just work in the scalar case of some function $f \in C^\infty(\mathbb{R};\mathbb{R})$ which has a local extremum at $0$.
The Taylor expansion $f(u) = f(0) + u f'(0) + O(u^2)$ only holds for small enough $u$. First, it is only interesting for small $u$ (when $u$ is large, the $O(u^2)$ term is very large and so cannot be interpreted as an "error" term). Second, it is wrong as $u \to +\infty$. For example, take $f(u) := u^4$. Then you obviously don't have $f(u) = O(u^2)$ as $u \to +\infty$.
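You can check this numerically. The sketch below (illustrative only, using the same example $f(u) = u^4$) looks at the first-order Taylor remainder $R(u) = f(u) - f(0) - u f'(0)$, which here equals $u^4$ since $f(0) = f'(0) = 0$. The ratio $R(u)/u^2$ stays small near $0$ (consistent with $R(u) = O(u^2)$ as $u \to 0$) but blows up for large $u$, so the expansion fails globally:

```python
def f(u):
    return u ** 4

def taylor_remainder(u):
    # First-order Taylor remainder of f at 0: R(u) = f(u) - f(0) - u*f'(0).
    # For f(u) = u^4 we have f(0) = 0 and f'(0) = 0, so R(u) = u^4.
    return f(u) - (f(0) + u * 0.0)

# Near 0 the ratio R(u)/u^2 is small: the remainder really is O(u^2) there.
print(taylor_remainder(0.01) / 0.01 ** 2)    # about 1e-4

# For large u the ratio is huge: R(u) is NOT O(u^2) as u -> infinity.
print(taylor_remainder(100.0) / 100.0 ** 2)  # 10000.0
```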
How do you prove that $f'(0) = 0$ when $0$ is a local extremum? Most proofs I know indeed use the Taylor expansion to prove this fact. I believe this is why your text mentions it.
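Here is the standard argument, sketched for a local minimum at $0$. Suppose $f'(0) \neq 0$. By the expansion,
$$f(u) - f(0) = u f'(0) + O(u^2),$$
and for $u$ small enough the term $u f'(0)$ dominates the $O(u^2)$ remainder, so $f(u) - f(0)$ has the same sign as $u f'(0)$. Choosing small $u$ with $u f'(0) < 0$ gives $f(u) < f(0)$, contradicting the local minimum. Hence $f'(0) = 0$, and this is exactly where the smallness of $u$ is used.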