What is something (non-trivial) that can be done in Hilbert space but not Banach spaces for optimization problems?


Sometimes I encounter a book or a research paper where the author insists on working with Banach spaces instead of Hilbert spaces.

I am curious as to what would be a non-trivial difference between these two setups for optimization related applications.

For example, you can still define a Fréchet derivative on a Banach space. That's fine. And most optimization concepts, such as strong convexity, involve only the norm, not the inner product. Convergence of a sequence of iterates $x(k)$ towards the optimum $x^\star$ also involves only the norm.

Something trivial that can be done in a Hilbert space but not a Banach space would be taking an inner product (obvious). But even the inner product can simply be written as the sum of products of the coordinates of the vectors, so that can be skipped as well.

Is there really any significant difference between working with Hilbert versus Banach spaces?

There are 3 answers below.

BEST ANSWER

Here is a nice result which can be stated for Banach spaces, but whose assumptions hide the fact that the space is already a Hilbert space:

Let $X$ be a Banach space, $f \colon X \to \mathbb R$ be twice Fréchet differentiable at a given point $\bar x \in X$, and suppose that there exists $\alpha > 0$ such that $$f'(\bar x) = 0, \qquad f''(\bar x)[h,h] \ge \alpha \, \|h\|_X^2 \quad \forall h \in X.$$ Then, for all $\varepsilon > 0$ there is $\delta > 0$ such that $$f(x) \ge f(\bar x) + \frac{\alpha-\varepsilon}{2}\,\|x - \bar x\|_X^2 \quad \forall x \in X, \ \|x-\bar x\|_X \le \delta.$$

I hope this counts as a non-trivial result. In infinite-dimensional optimization, this theorem is quite useful, since it implies some stability of the minimizer $\bar x$ w.r.t. perturbations of the problem (e.g., a discretization can be seen as a perturbation).

The proof uses just a second-order Taylor expansion of $f$ at $\bar x$ and does not need an inner product or a Hilbert space structure.
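Written out, that Taylor argument is just (a sketch, using the notation of the theorem):

```latex
f(x) = f(\bar x) + \underbrace{f'(\bar x)(x - \bar x)}_{=0}
     + \tfrac{1}{2}\, f''(\bar x)[x - \bar x, x - \bar x] + o(\|x - \bar x\|_X^2)
     \ge f(\bar x) + \tfrac{\alpha}{2}\, \|x - \bar x\|_X^2 + o(\|x - \bar x\|_X^2),
```

and for $\|x - \bar x\|_X \le \delta$ small enough the remainder term is bounded by $\tfrac{\varepsilon}{2}\|x - \bar x\|_X^2$, which gives the claim.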

However, it can be easily checked that $$ (g,h) \mapsto f''(\bar x) [g,h] $$ defines an inner product on $X$ whose associated norm is equivalent to $\|\cdot\|_X$. Hence, $X$ has to be isomorphic to a Hilbert space, and the theorem is not applicable in genuinely non-Hilbert Banach spaces.

ANSWER

There are closed forms for nearest-point projections in Hilbert spaces that are implementable: e.g., projection onto a hyperplane or onto the unit ball.
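As an illustration, those two closed forms are one-liners in the Euclidean (finite-dimensional Hilbert) setting. This is a minimal numpy sketch, not code from the answer:

```python
import numpy as np

def project_hyperplane(x, a, b):
    """Nearest-point projection of x onto {y : <a, y> = b} (closed form)."""
    return x - (a @ x - b) / (a @ a) * a

def project_unit_ball(x):
    """Nearest-point projection of x onto {y : ||y||_2 <= 1}."""
    n = np.linalg.norm(x)
    return x / n if n > 1.0 else x

x = np.array([3.0, 4.0])
a = np.array([1.0, 1.0])
p = project_hyperplane(x, a, b=1.0)  # lands exactly on the hyperplane
q = project_unit_ball(x)             # lands on the unit sphere, since ||x|| > 1
```

Both formulas rely on the inner product (orthogonality of the residual); in a non-Hilbert norm, already the projection onto a hyperplane generally has no such expression.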

In "nice" Banach spaces, nearest-point projections do exist, but there are typically no closed-form formulas available. This is really no fun when dealing with projection or proximal mappings, even in finite-dimensional settings. So optimization in Banach spaces is much harder if you want implementable algorithms.

ANSWER

In a Hilbert space the squared norm is Fréchet differentiable; in particular, $$ \frac{d}{dx} \| x \|^2 = 2x, $$ after identifying the derivative with an element of the space via the inner product. This fails in general Banach spaces: in $\ell^1$, for example, the norm is nowhere Fréchet differentiable. Smoothness is a very important property for optimization, not just for methods (gradient descent) but also for optimality conditions ($\nabla f(x) = 0$).
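The identity $\nabla \|x\|^2 = 2x$ is easy to sanity-check numerically in $\mathbb R^n$ with the Euclidean norm. A small numpy sketch (the step size $h$ is an arbitrary choice):

```python
import numpy as np

def grad_sq_norm(x):
    """Gradient of f(x) = ||x||_2^2 in a Euclidean (Hilbert) space: 2x."""
    return 2.0 * x

def fd_grad(f, x, h=1e-6):
    """Central finite-difference approximation of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

x = np.array([1.0, -2.0, 0.5])
exact = grad_sq_norm(x)
approx = fd_grad(lambda y: y @ y, x)  # matches 2x up to rounding error
```

Running the same finite-difference check on $f(y) = \|y\|_1^2$ at a point with a zero coordinate would show the one-sided slopes disagreeing, which is the non-smoothness the answer refers to.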