How to express the Fréchet derivative in terms of a gradient vector.


Is it possible to express the Fréchet derivative in terms of gradient vectors in normed vector spaces? The definition is the following:

Let $V$ and $W$ be normed vector spaces, and $U\subseteq V$ be an open subset of $V$. A function $f:U\to W$ is called Fréchet differentiable at $x\in U$ if there exists a bounded linear operator $A:V\to W$ such that $$\lim _{\|h\|\to 0}{\frac {\|f(x+h)-f(x)-Ah\|_{W}}{\|h\|_{V}}}=0$$
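To make the limit in the definition concrete, here is a small numerical sketch (my own example, not from the question): for $f(x)=\|x\|^2$ on $\mathbb{R}^3$, the candidate bounded linear operator at $x$ is $A(h)=2\langle x,h\rangle$, and the difference quotient shrinks as $\|h\|\to 0$.

```python
import numpy as np

# Numerical check (illustrative example) of the Fréchet limit for
# f(x) = ||x||^2 on R^3, whose derivative at x is the bounded
# linear map A(h) = 2 <x, h>.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)

def f(v):
    return float(np.dot(v, v))

def A(h):
    return 2.0 * float(np.dot(x, h))

# The quotient ||f(x+h) - f(x) - A(h)|| / ||h|| should shrink with ||h||.
quotients = []
for scale in (1e-1, 1e-3, 1e-5):
    h = scale * rng.standard_normal(3)
    q = abs(f(x + h) - f(x) - A(h)) / np.linalg.norm(h)
    quotients.append(q)
    print(scale, q)
```

Here the remainder is exactly $\|h\|^2$, so each quotient equals $\|h\|$ and visibly tends to zero.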

So I was wondering how that definition can be expressed in terms of a gradient vector.



Gradient vectors are a secondary concept. You should think of Fréchet derivatives as the primary concept. Anyway, if you really insist, here's what we can say.

Let $H$ be a real Hilbert space. Let $\flat:H\to H^*$ be the mapping $x\mapsto x^{\flat}=\langle x,\cdot\rangle$. This is a linear injection, and the Riesz representation theorem tells us it is in fact surjective, so this mapping is a linear isomorphism. We denote the inverse by $\sharp:H^*\to H$. If you consider complex Hilbert spaces where $\langle \cdot, \cdot\rangle$ is conjugate-linear in, say, the first slot and linear in the second, then the same definitions apply.

Let $U\subseteq H$ be open and let $f:U\to \Bbb{R}$ be Fréchet differentiable at a point $a\in U$. We define the gradient vector of $f$ at $a$ to be \begin{align} (\text{grad } f)(a):= \left(Df_a\right)^{\sharp} \end{align} In other words, the gradient is the unique vector in $H$ such that for all $\xi\in H$, we have \begin{align} \langle (\text{grad } f)(a), \xi\rangle&= Df_a(\xi). \end{align}
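As a quick worked example (my own, not from the answer): take $f(x)=\tfrac12\langle x,x\rangle$ on a real Hilbert space $H$. Expanding,
$$f(a+h)-f(a)=\langle a,h\rangle+\tfrac12\|h\|^2,$$
and the remainder $\tfrac12\|h\|^2$ is $o(\|h\|)$, so $Df_a(\xi)=\langle a,\xi\rangle$. By the defining property above, $(\text{grad } f)(a)=a$.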

So, the gradient vector is only defined when the domain is a Hilbert space, and the target space is the underlying field. Also, the gradient depends on the choice of inner product on the space. If we modify the inner product, then by definition we'll generically end up with a different vector. On the other hand, the Frechet derivative is defined even in the general Banach space setting.
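The dependence on the inner product is easy to see in coordinates. Here is a sketch (names and numbers are my own) on $\mathbb{R}^2$ with inner products of the form $\langle x,y\rangle_M = x^{\mathsf T} M y$ for a symmetric positive-definite $M$: the Riesz representative of the same functional $Df_a$ solves $M\,\text{grad} = Df_a$, so different $M$ give different gradient vectors.

```python
import numpy as np

# The same linear functional Df(xi) = xi_1 + 2*xi_2 (at every point a),
# represented against two different inner products <x, y>_M = x^T M y.
df = np.array([1.0, 2.0])        # the Fréchet derivative as a covector

M_euclid = np.eye(2)             # standard inner product
M_other = np.diag([1.0, 4.0])    # a different inner product

# grad_M f solves <grad, xi>_M = Df(xi) for all xi, i.e. M @ grad = df.
grad_euclid = np.linalg.solve(M_euclid, df)
grad_other = np.linalg.solve(M_other, df)

print(grad_euclid)  # the familiar (1, 2)
print(grad_other)   # a different vector, (1, 0.5)

# Both represent the same functional under their respective products:
xi = np.array([3.0, -1.0])
assert np.isclose(grad_euclid @ M_euclid @ xi, df @ xi)
assert np.isclose(grad_other @ M_other @ xi, df @ xi)
```

The covector $Df_a$ never changed; only its Riesz representative did, which is exactly the point made above.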


If the Banach spaces involved are finite-dimensional, then you can use the Jacobian matrix to represent the Fréchet derivative, and each row of the Jacobian matrix can be understood as the gradient of a component function.

That is, if $f:\mathbb{R}^n \to \mathbb{R}^m$ is differentiable, then its Jacobian matrix can be written as

$$ [\partial f(x)]=\begin{bmatrix} \nabla f_1(x)\\\vdots\\ \nabla f_m(x) \end{bmatrix} $$

for $f=(f_1,f_2,\ldots ,f_m)$ and $f_k:\mathbb{R}^n \to \mathbb{R}$ for each $k$.
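The row structure can be checked numerically. Below is a sketch (the function and the forward-difference helper are my own, not from the answer) for an $f:\mathbb{R}^2\to\mathbb{R}^3$: each row of the approximated Jacobian matches the hand-computed gradient of the corresponding component $f_k$.

```python
import numpy as np

# f : R^2 -> R^3 with components f1 = x1*x2, f2 = x1 + x2, f3 = x1^2.
def f(x):
    x1, x2 = x
    return np.array([x1 * x2, x1 + x2, x1**2])

def jacobian(fn, x, eps=1e-6):
    # Forward-difference approximation, built column by column.
    fx = fn(x)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        J[:, j] = (fn(x + step) - fx) / eps
    return J

x = np.array([2.0, 3.0])
J = jacobian(f, x)
# Exact rows: grad f1 = (x2, x1), grad f2 = (1, 1), grad f3 = (2*x1, 0),
# i.e. (3, 2), (1, 1), (4, 0) at x = (2, 3).
print(J)
```

Row $k$ of `J` is exactly the (approximate) gradient $\nabla f_k(x)$, matching the block-matrix formula above.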