To show that $p$ is a stationary point of a function $f:\mathbb{R}^n\longrightarrow\mathbb{R}$, one needs to guarantee that the directional derivative of $f$ at $p$ is zero in every direction. For this it is sufficient to check that all $n$ partial derivatives vanish at $p$. That is, $f$ is stationary at $p$ if: \begin{equation} \frac{\partial f}{\partial x_i}\Bigg|_p=0,\;\;\mathrm{for}\;\;\;i=1,2,\cdots ,n \end{equation} This comes from the fact that $\mathrm{dom}(f)$ having an $n$-element basis means that the tangent space at $p$ does too.
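As a concrete illustration (a toy example of my own, checking the criterion with finite differences for $f(x_1,x_2)=x_1^2+x_2^2$ at $p=(0,0)$):

```python
import numpy as np

# Toy example: f(x1, x2) = x1^2 + x2^2 is stationary at p = (0, 0)
# because both partial derivatives vanish there.
def f(v):
    return v[0]**2 + v[1]**2

def partial_i(f, p, i, eps=1e-6):
    # one-sided finite-difference approximation of df/dx_i at p
    e = np.zeros_like(p)
    e[i] = eps
    return (f(p + e) - f(p)) / eps

p = np.zeros(2)
print([partial_i(f, p, i) for i in range(2)])  # both ~ 0 (of order eps)
```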
If we instead have a functional $F$ on some function space, say $L^2(X)$, we can still talk about a basis $\{\varphi_i\}$ for $\mathrm{dom}(F)$, e.g. a Fourier basis or some orthogonal polynomial basis. I would attempt to define partial derivatives with respect to this basis as follows: \begin{equation} \frac{\partial F}{\partial \varphi_i}\Bigg|_\rho = \lim_{\varepsilon\to 0} \frac{F(\rho + \varepsilon \varphi_i)-F(\rho)}{\varepsilon} \end{equation} where $\rho\in\mathrm{dom}(F)$.
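As a sanity check of this definition (my own toy example: $F(\rho)=\int_0^1\rho(x)^2\,dx$ with the sine basis $\varphi_i(x)=\sin(i\pi x)$), the limit can be approximated by a finite difference on a grid:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 2001)
dx = x[1] - x[0]

def F(rho):
    # F(rho) = integral of rho(x)^2 over [0, 1], trapezoid rule
    r2 = rho**2
    return float(np.sum((r2[:-1] + r2[1:]) / 2) * dx)

def partial(F, rho, phi, eps=1e-6):
    # finite-difference version of the limit defining dF/dphi at rho
    return (F(rho + eps * phi) - F(rho)) / eps

rho  = np.sin(np.pi * x)
phi1 = np.sin(np.pi * x)       # basis direction parallel to rho
phi2 = np.sin(2 * np.pi * x)   # basis direction orthogonal to rho

print(partial(F, rho, phi1))   # ~ 1.0
print(partial(F, rho, phi2))   # ~ 0.0
```

For this $F$ the limit is $2\int_0^1 \rho\,\varphi_i\,dx$, so the first value is $\approx 1$ (since $2\int_0^1\sin^2(\pi x)\,dx=1$) and the second is $\approx 0$ by orthogonality.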
My first question is: if the "partial derivative" defined above is zero for all of $\{\varphi_i\}$, is $\rho$ then a stationary point of $F$, say for the purposes of variational calculus? More abstractly, is there a sense in which a basis for $\mathrm{dom}(F)$ gives us a basis for its "tangent space"?
Secondly, I'm curious to know if this is a rigorous way to define functional derivatives, since physics textbooks that I've seen are a bit handwavy about the definition, while Wiki invokes something called a Radon-Nikodym derivative which I'm not familiar with. Any insight or references to further reading would be appreciated!
Too long for a comment, but perhaps worth looking at.
My first reaction is to consider the example of a functional of the form $$A=\int_IL(x,y,y')dx$$ for which the perturbation $$A_{\varepsilon}=\int_IL(x,y+\varepsilon h,y'+\varepsilon h')dx$$ yields $$\dfrac{A_{\varepsilon}-A}{\varepsilon}= \int_I\left(\dfrac{\partial L}{\partial y}-\frac{d}{dx} \dfrac{\partial L}{\partial y'}\right)h\ dx+O(\varepsilon),$$ obtained by expanding $A_{\varepsilon}$ to first order in $\varepsilon$ and integrating by parts (the boundary term vanishes under the condition on $h$ imposed below).
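For completeness, the integration by parts behind that expression is the standard one. Expanding to first order,
$$A_{\varepsilon}-A=\varepsilon\int_I\left(\frac{\partial L}{\partial y}h+\frac{\partial L}{\partial y'}h'\right)dx+O(\varepsilon^2),$$
and integrating the second term by parts,
$$\int_I\frac{\partial L}{\partial y'}h'\,dx=\left[\frac{\partial L}{\partial y'}h\right]_{a}^{b}-\int_I\frac{d}{dx}\frac{\partial L}{\partial y'}\,h\,dx,$$
where the bracketed boundary term vanishes whenever $h(a)=h(b)=0$.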
Now if $h=\sum_sh^s\varphi_s$ for some basis of the admissible family of functions, subject to the classical condition which says, for $I=[a,b]$, $$h(a)=h(b)=0,$$ then we are forced to demand $\varphi_s(a)=\varphi_s(b)=0$, for all $\varphi_s$, in order to get $$\lim_{\varepsilon\to0}\dfrac{A_{\varepsilon}-A}{\varepsilon} = \int_I\left(\dfrac{\partial L}{\partial y}-\frac{d}{dx} \dfrac{\partial L}{\partial y'}\right)\sum_sh^s\varphi_s\ dx $$ $$\qquad\quad\qquad= \sum_sh^s \int_I\left(\dfrac{\partial L}{\partial y}-\frac{d}{dx} \dfrac{\partial L}{\partial y'}\right)\varphi_s\ dx,$$ a desirable linearity property for derivatives.
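This stationarity test can be illustrated numerically (my own toy example, not part of the argument above): take $A(y)=\int_0^1 y'^2\,dx$ with $y(0)=0$, $y(1)=1$. Its Euler-Lagrange equation is $y''=0$, so $y(x)=x$ should be stationary, and the directional derivative along each admissible $\varphi_s(x)=\sin(s\pi x)$ should vanish:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 4001)
dx = x[1] - x[0]

def A(y):
    # A(y) = integral of y'(x)^2 over [0, 1], trapezoid rule
    yp = np.gradient(y, dx)
    v = yp**2
    return float(np.sum((v[:-1] + v[1:]) / 2) * dx)

def directional(A, y, h, eps=1e-5):
    # finite-difference directional derivative of A at y along h
    return (A(y + eps * h) - A(y)) / eps

y = x.copy()                        # candidate stationary point y(x) = x
for s in (1, 2, 3):
    h = np.sin(s * np.pi * x)       # admissible: h(0) = h(1) = 0
    print(s, directional(A, y, h))  # all ~ 0
```

The derivative in each basis direction is zero up to discretization error, consistent with $y(x)=x$ being stationary.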