Convergence Result for Reproducing Kernel Hilbert Spaces


There is a very interesting convergence result on reproducing kernel Hilbert spaces in the book by A. Iske ("Approximation Theory and Algorithms for Data Analysis"):

Theorem 8.39. Let $K(x, y) = \Phi(x-y)$ be positive definite, $K\in \mathbf{PD}_{d}$, where $\Phi: \mathbb R^{d}\rightarrow \mathbb R$ is even and Lipschitz continuous with Lipschitz constant $L>0$. Moreover, let $X\subset \Omega$ be a finite subset of $\Omega\subset\mathbb R^{d}$. Then, we have, for any $f\in \mathcal F_{\Omega}$, the error estimate $$\vert\vert s_{f, X}-f\vert\vert_{\infty} \leq \sqrt{2Lh_{X, \Omega}}\cdot\vert\vert f \vert\vert_{K}.$$

For the record: $\mathcal F_{\Omega} := \overline{\text{span}\{ K(\cdot, y) \vert y\in\Omega \}}\subset \mathcal F$ was defined on p. 295, and $\mathcal F$ is our RKHS. The proof is also given, and its first line reads:

Proof: Suppose $y\in\Omega$. Then, there is some $x\in X$ satisfying $\vert\vert y -x\vert\vert_{2}\leq h_{X, \Omega}$.

The so-called fill distance was defined on p. 295 as $$h_{X, \Omega} := \sup_{y\in\Omega} \min_{x\in X}\vert\vert y-x\vert\vert_{2}.$$

QUESTION: I have trouble understanding why $\vert\vert y-x\vert\vert_{2}\leq h_{X, \Omega}$ holds. If we fix $y\in\Omega$, then in my understanding it should be $\vert\vert y-x\vert\vert_{2}\geq h_{X, \Omega}$, simply because of the minimum over $X$ that we take in the fill distance. Many thanks! :)


BEST ANSWER

For every $y\in\Omega$, since $h_{X,\Omega}$ is a supremum over all $y$, you have $$\min_{x\in X}\vert\vert y-x\vert\vert_{2}\leq h_{X,\Omega}.$$ Now choose the $x\in X$ that realizes the minimum; it exists because $X$ is finite. Your reversed inequality $\vert\vert y-x\vert\vert_{2}\geq h_{X,\Omega}$ would only follow if the supremum were attained at your fixed $y$, which need not be the case.
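The argument above can be checked numerically. Below is a minimal sketch (the names `omega`, `X`, and `dist` are my own, and a finite grid stands in for the domain $\Omega$): it computes the fill distance as the max over grid points of the distance to the nearest point of $X$, then verifies that for every $y$ the minimizing $x$ indeed satisfies $\vert\vert y-x\vert\vert_2 \leq h_{X,\Omega}$.

```python
import math
import random

def dist(p, q):
    """Euclidean distance between two points given as tuples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

random.seed(0)

# A dense grid in [0,1]^2 serves as a finite stand-in for Omega,
# so the sup in the fill distance becomes a max.
omega = [(i / 20, j / 20) for i in range(21) for j in range(21)]

# X is a small finite subset of Omega, as in the theorem.
X = random.sample(omega, 10)

# Fill distance: h = sup_{y in Omega} min_{x in X} ||y - x||_2
h = max(min(dist(y, x) for x in X) for y in omega)

# For EVERY y, the nearest x in X satisfies ||y - x||_2 <= h,
# because h is the supremum of exactly these minima over all y.
for y in omega:
    x_nearest = min(X, key=lambda x: dist(y, x))
    assert dist(y, x_nearest) <= h + 1e-12

print("fill distance h =", round(h, 4))
```

The assertions pass for every grid point: each minimum sits below the supremum of the minima, which is the one-line content of the answer.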