Strictly positive definiteness and functions in the RKHS


A (real-valued) kernel $k$ on a set $X$ is a symmetric, positive semi-definite map $X \times X \to \mathbb{R}$. That is, for any $n \in \mathbb{N}$, $x_1,\dots,x_n \in X$ and $\alpha_1,\dots,\alpha_n \in \mathbb{R}$ we have $$\sum_{i,j=1}^n \alpha_i \alpha_j k(x_i,x_j)\geq 0.$$ Now, writing $k_{x_j} = k(x_j,\cdot)$, it's easy to see that $$\sum_{i,j=1}^n \alpha_i \alpha_j k(x_i,x_j) = \left|\left| \sum_j \alpha_j k_{x_j}\right|\right|^2_H,$$ where $H$ is the associated RKHS. This way we see that

$$\sum_{i,j=1}^n \alpha_i \alpha_j k(x_i,x_j) = 0 \quad \text{if and only if} \quad || \sum_j \alpha_j k_{x_j}||_H = 0.$$
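This identity is easy to check numerically. As a small sketch (the Gaussian kernel $k(x,y) = e^{-(x-y)^2}$ and the specific points are my choice, not part of the question): the quadratic form $\sum_{i,j} \alpha_i \alpha_j k(x_i,x_j)$ is just $\alpha^\top K \alpha$ for the Gram matrix $K_{ij} = k(x_i,x_j)$, and positive semi-definiteness is equivalent to all eigenvalues of the symmetric matrix $K$ being nonnegative.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(x, y):
    # Gaussian kernel k(x, y) = exp(-(x - y)^2); an illustrative choice,
    # not taken from the question.
    return np.exp(-(x - y) ** 2)

x = rng.normal(size=8)                       # arbitrary points x_1, ..., x_n
K = gaussian_kernel(x[:, None], x[None, :])  # Gram matrix K_ij = k(x_i, x_j)

# The quadratic form a^T K a is nonnegative for arbitrary coefficients a
# (up to floating-point rounding).
for _ in range(1000):
    a = rng.normal(size=8)
    assert a @ K @ a >= -1e-9

# Equivalently, every eigenvalue of the symmetric matrix K is >= 0.
assert np.linalg.eigvalsh(K).min() >= -1e-9
```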

Hence, if this happens for some coefficients $\alpha_j$ that are not all zero, then for every $f \in H$ we get $\sum_j \alpha_j f(x_j) = \langle f , \sum_j \alpha_j k_{x_j} \rangle_H = 0$: there is a nontrivial equation of linear dependence among the values of every function $f\in H$ at this finite set of points.
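A concrete sketch of this phenomenon, assuming the linear kernel $k(x,y) = xy$ on $\mathbb{R}$ (my choice for illustration): its RKHS is $\{f(x) = cx\}$, which is one-dimensional, so any three $k_{x_j}$ are linearly dependent, and the same dependence relation is inherited by the values of every $f \in H$.

```python
import numpy as np

# Linear kernel k(x, y) = x * y on R; the RKHS is {f(x) = c * x}.
x = np.array([1.0, 2.0, 3.0])
K = np.outer(x, x)                   # Gram matrix K_ij = x_i * x_j, rank 1

# The combination -k_{x_1} + 2 k_{x_2} - k_{x_3} is the zero function,
# so the quadratic form vanishes for alpha = (-1, 2, -1).
alpha = np.array([-1.0, 2.0, -1.0])
assert abs(alpha @ K @ alpha) < 1e-12

# Consequently every f(x) = c * x in H satisfies the same relation
# among its values: -f(1) + 2 f(2) - f(3) = 0.
for c in [-3.0, 0.5, 7.0]:
    assert abs(alpha @ (c * x)) < 1e-12
```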

Now, for instance, if $H$ contains all polynomials $p$, then it's impossible to satisfy an equation of the form $\sum_j \beta_j p(x_j) = 0$ for all such $p$ with the $\beta_j$ not all zero. But what is actually at play here? Is it the fact that the polynomials form a point-separating algebra? I want to identify the "underlying property" of a set of functions that makes it impossible to have an equation of linear dependence for said set evaluated at a finite set of points.
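The contrast can be sketched numerically (the two kernels below are my choices for illustration, not from the question). The Gaussian kernel is strictly positive definite, so its Gram matrix at distinct points is invertible and no nontrivial dependence among the $k_{x_j}$ can occur; the degree-2 polynomial kernel $(1+xy)^2$ on $\mathbb{R}$ has a 3-dimensional RKHS (polynomials of degree $\le 2$), so 4 points must produce a dependence.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # four distinct points

# Gaussian kernel: strictly positive definite, so the Gram matrix at
# distinct points has strictly positive eigenvalues.
K_gauss = np.exp(-(x[:, None] - x[None, :]) ** 2)
assert np.linalg.eigvalsh(K_gauss).min() > 0

# Degree-2 polynomial kernel (1 + x y)^2: feature map (1, sqrt(2) x, x^2)
# spans a 3-dimensional space, so the 4x4 Gram matrix is singular and a
# nontrivial dependence among the k_{x_j} exists.
K_poly = (1 + np.outer(x, x)) ** 2
assert np.linalg.eigvalsh(K_poly).min() < 1e-8
```

The kernels for which no such dependence is ever possible (for distinct points and nontrivial coefficients) are exactly the *strictly* positive definite ones, which is the property the question is probing.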