Is it possible to express the Fréchet derivative in terms of gradient vectors in normed vector spaces? The definition states the following:
Let $V$ and $W$ be normed vector spaces, and $U\subseteq V$ be an open subset of $V$. A function $f:U\to W$ is called Fréchet differentiable at $x\in U$ if there exists a bounded linear operator $A:V\to W$ such that $$\lim _{\|h\|\to 0}{\frac {\|f(x+h)-f(x)-Ah\|_{W}}{\|h\|_{V}}}=0$$
So, I was wondering how that definition can be expressed in terms of a gradient vector.
Gradient vectors are a secondary concept; you should think of Fréchet derivatives as the primary one. Anyway, if you really insist, here's what we can say.
Let $H$ be a real Hilbert space. Let $\flat:H\to H^*$ be the mapping $x\mapsto x^{\flat}=\langle x,\cdot\rangle$. This is a linear injection, and the Riesz representation theorem tells us it is in fact surjective, so this mapping is a linear isomorphism. We denote its inverse by $\sharp:H^*\to H$. If you consider a complex Hilbert space, where $\langle \cdot, \cdot\rangle$ is conjugate-linear in, say, the first slot and linear in the second, then the same definitions apply, except that $\flat$ and $\sharp$ are then conjugate-linear rather than linear.
Let $U\subseteq H$ be open and let $f:U\to \Bbb{R}$ be Fréchet differentiable at a point $a\in U$. We define the gradient vector of $f$ at $a$ to be \begin{align} (\text{grad } f)(a):= \left(Df_a\right)^{\sharp}. \end{align} In other words, the gradient is the unique vector in $H$ such that for all $\xi\in H$, we have \begin{align} \langle (\text{grad } f)(a), \xi\rangle&= Df_a(\xi). \end{align}
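To make the defining identity $\langle (\text{grad } f)(a), \xi\rangle = Df_a(\xi)$ concrete, here is a minimal numerical sketch in $H=\Bbb{R}^3$ with the standard dot product, using the example $f(x)=\langle x,x\rangle$ (for which $Df_a(\xi)=2\langle a,\xi\rangle$ and so $(\text{grad } f)(a)=2a$); the function names and the test point are my own choices for illustration:

```python
import numpy as np

# Example: f(x) = <x, x> = ||x||^2 on H = R^3 with the standard inner product.
# Its Frechet derivative at a is the linear map xi |-> 2<a, xi>, whose Riesz
# representative (the gradient) is the vector 2a.

def f(x):
    return np.dot(x, x)

def directional_derivative(f, a, xi, h=1e-6):
    # Central-difference approximation of Df_a(xi).
    return (f(a + h * xi) - f(a - h * xi)) / (2 * h)

a = np.array([1.0, -2.0, 0.5])
grad = 2 * a  # Riesz representative of Df_a

# Check <grad f(a), xi> = Df_a(xi) along each coordinate direction.
for xi in np.eye(3):
    approx = directional_derivative(f, a, xi)
    exact = np.dot(grad, xi)
    assert abs(approx - exact) < 1e-5
```

The pairing on the right-hand side of the assertion is exactly the inner product against the gradient, while the left-hand side approximates the Fréchet derivative directly from the limit definition.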
So, the gradient vector is only defined when the domain is a Hilbert space and the target space is the underlying field. Note also that the gradient depends on the choice of inner product on the space: if we modify the inner product, then by definition we will generically end up with a different vector. The Fréchet derivative, by contrast, is defined even in the general Banach space setting.
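The dependence on the inner product can be seen explicitly in $\Bbb{R}^2$: if we replace the dot product by a weighted inner product $\langle x,y\rangle_M = x^{\mathsf T}My$ for a symmetric positive-definite $M$ (a matrix I choose here purely for illustration), then solving $\langle g,\xi\rangle_M = Df_a(\xi)$ for all $\xi$ gives $g = M^{-1}\,(\text{Euclidean gradient})$, even though the Fréchet derivative itself is unchanged:

```python
import numpy as np

# Two inner products on R^2, one Frechet derivative, two different gradients.
M = np.array([[2.0, 0.0],
              [0.0, 5.0]])  # <x, y>_M = x @ M @ y, symmetric positive definite

# f(x) = ||x||^2 (Euclidean), so Df_a(xi) = 2 <a, xi> with the dot product.
a = np.array([1.0, 3.0])
euclidean_grad = 2 * a                                # gradient w.r.t. the dot product
weighted_grad = np.linalg.solve(M, euclidean_grad)    # gradient w.r.t. <.,.>_M

# Both vectors represent the SAME functional Df_a:
xi = np.array([0.7, -1.2])
Df_a_xi = 2 * np.dot(a, xi)
assert np.isclose(np.dot(euclidean_grad, xi), Df_a_xi)   # <g, xi> with dot product
assert np.isclose(weighted_grad @ M @ xi, Df_a_xi)       # <g', xi>_M with weighted product
```

The two gradient vectors differ, yet each one pairs with $\xi$ (under its own inner product) to give the same number $Df_a(\xi)$, which is the coordinate-level content of the statement above.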