I have a functional $F$ defined on, say, $C^{\infty}_0(\mathbb{R})$, of the following form:
$$ w(\cdot) \stackrel{F}{\mapsto} \int e^{ w(x) - \int w(x') g(x', w(\cdot))dx' } dx $$
where $g(x, w(\cdot))$ is itself a functional on $C^{\infty}_0(\mathbb{R})$ for each $x$.
Question: What is the Fréchet derivative of $F$?
Assume: $\mathrm{E}_1$ and $\mathrm{E}_2$ are Banach spaces, $\mathrm{O} \subseteq \mathrm{E}_1$ is open, and $f:[a, b] \times \mathrm{O} \to \mathrm{E}_2$ is continuous (resp. continuous together with its partial derivative $\mathbf{D}_2 f$ with respect to the second variable).
Then, the function $\displaystyle \varphi(x) = \int\limits_a^b dt\ f(t, x)$ defines a continuous (resp. continuously differentiable) function $\mathrm{O} \to \mathrm{E}_2$, and in the differentiable case its derivative is given by "Leibniz's rule" of "differentiation under the integral sign": $\displaystyle \mathbf{D} \varphi(x) = \int\limits_a^b dt\ \mathbf{D}_2 f(t, x)$.
Given $\varepsilon > 0$ and $x \in \mathrm{O},$ continuity of $f$ at $(t, x)$ gives a neighbourhood $\mathrm{N}_t \times \mathrm{V}_t = (t - \delta_t, t + \delta_t) \times \mathrm{B}(x; \eta_t)$ such that $\|f(s, y) - f(t, x)\| < \dfrac{\varepsilon}{b - a + 1}$ for every $(s, y)$ in it. By the theorem of Borel-Lebesgue (essentially, compactness of $[a, b]$) there are finitely many $t_i$ such that the $\mathrm{N}_{t_i}$ cover $[a, b]$; let $\delta$ be the minimum of the $\delta_{t_i}$ and $\eta_{t_i}$. Then $\delta > 0$, and it is easy to verify (using the mean value theorem for vector-valued functions) that $\|x - y\| < \delta$ implies $\|\varphi(x) - \varphi(y)\| < \varepsilon.$ Copying this argument verbatim handles the case of derivatives, bearing in mind that for integrals and continuous linear maps $\mathrm{L}$ one has $\displaystyle \mathrm{L} \int\limits_a^b dt\ u(t) = \int\limits_a^b dt\ \mathrm{L}u(t).$
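As a quick numerical sanity check of the lemma (the particular $f$ below is an illustrative assumption, not part of the argument), one can compare a finite-difference quotient of $\varphi$ against the integral of $\mathbf{D}_2 f$:

```python
import numpy as np

# Sketch: for a smooth f(t, x), the derivative of φ(x) = ∫_a^b f(t, x) dt
# should agree with ∫_a^b D₂f(t, x) dt ("differentiation under the integral sign").
a, b = 0.0, 1.0
t = np.linspace(a, b, 2001)
dt = t[1] - t[0]

def integral(v):                  # trapezoid rule on the fixed grid
    return np.sum(v[1:] + v[:-1]) * dt / 2

f   = lambda t, x: np.sin(t * x) + x**2 * t          # illustrative choice
d2f = lambda t, x: t * np.cos(t * x) + 2 * x * t     # its partial ∂f/∂x
phi = lambda x: integral(f(t, x))

x0, h = 0.7, 1e-6
finite_diff = (phi(x0 + h) - phi(x0 - h)) / (2 * h)  # Dφ(x0), numerically
leibniz     = integral(d2f(t, x0))                   # ∫ D₂f(t, x0) dt
assert abs(finite_diff - leibniz) < 1e-8
```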
To your exercise: I will assume that $\mathscr{C}^\infty_0(\Bbb R)$ is the closure (hence complete) in $\mathscr{B}(\Bbb R)$ (the space of bounded functions $\Bbb R \to \Bbb R$ with the supremum norm) of the space of infinitely differentiable functions with compact support.
Define $\varphi:\Bbb R \times \mathscr{C}_0^\infty(\Bbb R) \to \Bbb R$ by $(y, \omega) \mapsto \omega(y);$ for each fixed $y$ this is linear and continuous in $\omega$ (indeed $\|\varphi(y, \cdot)\| \leq 1$), hence $$\dfrac{\partial \varphi}{\partial \omega}\Bigg|_{(y, \omega)} \omega' = \omega'(y).$$ By hypothesis, $g:\Bbb R \times \mathscr{C}_0^\infty(\Bbb R) \to \Bbb R$ is continuous and has a continuous partial derivative with respect to the second factor given by $$\omega' \mapsto \dfrac{\partial g}{\partial \omega} \cdot \omega'.$$ Thus, the product $g \varphi$ has partial derivative given by $$\dfrac{\partial g \varphi}{\partial \omega}\Bigg|_{(y, \omega)} \omega' = \omega'(y) g(y, \omega) + \omega(y) \dfrac{\partial g}{\partial \omega} \cdot \omega'.$$
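To see the product rule in action, here is a finite-difference check with one concrete (assumed) choice $g(y, \omega) = \int \omega(u)^2\, du$, for which $\dfrac{\partial g}{\partial \omega} \cdot \omega' = 2 \int \omega\, \omega'\, du$:

```python
import numpy as np

# Sketch check of ∂(gφ)/∂ω ω' = ω'(y) g(y, ω) + ω(y) (∂g/∂ω)·ω'
# with the illustrative choice g(y, ω) = ∫ ω(u)² du (independent of y).
u = np.linspace(-5.0, 5.0, 4001)
du = u[1] - u[0]

def integral(v):                  # trapezoid rule on the fixed grid
    return np.sum(v[1:] + v[:-1]) * du / 2

w  = np.exp(-u**2)                # base point ω
wp = np.sin(u) * np.exp(-u**2)    # direction ω'
y_idx = 2300                      # grid point u ≈ 0.75 playing the role of y

def g_phi(w):                     # (gφ)(y, ω) = g(y, ω) · ω(y)
    return integral(w * w) * w[y_idx]

# the product-rule formula: ω'(y) g(y, ω) + ω(y) · 2∫ ω ω' du
formula = wp[y_idx] * integral(w * w) + w[y_idx] * 2 * integral(w * wp)

h = 1e-6
fd = (g_phi(w + h * wp) - g_phi(w - h * wp)) / (2 * h)
assert abs(fd - formula) < 1e-8
```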
It is readily seen then that $$\dfrac{\partial \mathrm{F}}{\partial \omega}\Bigg|_{\omega} \omega' = \int\limits_{\Bbb R} dx\ \exp \left( \omega(x) - \int\limits_{\Bbb R} dy\ \omega(y) g(y, \omega) \right) \times \left( \omega'(x) - \int\limits_{\Bbb R} dy\ \left[ \omega'(y) g(y, \omega) + \omega(y) \dfrac{\partial g}{\partial \omega}\Bigg|_{(y, \omega)} \cdot \omega' \right] \right).$$
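One can corroborate the final formula numerically. The choices below are assumptions made purely for illustration: the integrals over $\Bbb R$ are truncated to a finite interval $[-L, L]$ (so that everything is finite), and $g(x, \omega) = \omega(x)$, for which $\dfrac{\partial g}{\partial \omega} \cdot \omega' = \omega'(x)$:

```python
import numpy as np

# Finite-difference sanity check of the derivative formula (a sketch, not a proof).
# Assumptions: domain truncated to [-L, L]; g(x, ω) = ω(x), so (∂g/∂ω)·ω' = ω'(x).
L, n = 3.0, 3001
x = np.linspace(-L, L, n)
dx = x[1] - x[0]

def integral(v):                  # trapezoid rule on the fixed grid
    return np.sum(v[1:] + v[:-1]) * dx / 2

w  = np.exp(-x**2)                # base point ω
wp = np.cos(x) * np.exp(-x**2 / 2)  # direction ω'

def F(w):
    c = integral(w * w)           # ∫ ω(x') g(x', ω) dx' with g(x, ω) = ω(x)
    return integral(np.exp(w - c))

def DF(w, wp):                    # the formula, specialized to g(x, ω) = ω(x)
    c = integral(w * w)
    inner = integral(wp * w + w * wp)  # ∫ [ω'(y)g(y,ω) + ω(y)(∂g/∂ω)·ω'] dy = 2∫ωω'
    return integral(np.exp(w - c) * (wp - inner))

h = 1e-6
fd = (F(w + h * wp) - F(w - h * wp)) / (2 * h)
assert abs(fd - DF(w, wp)) < 1e-6 * max(1.0, abs(fd))
```

The agreement of the directional difference quotient with the formula, at one base point and one direction, is of course only a consistency check, not a substitute for the argument above.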