Reproducing Kernel Hilbert Space norm is measure invariant?


I'm reading chapter 6 of the book "Gaussian Processes for Machine Learning" by Rasmussen and Williams. They claim on page 131 that, given some choice of positive definite kernel $K(x,y)$ defined on some space $X$, the RKHS norm is invariant to the choice of measure on $X$. Is this true? I can't see why it should be.

For instance, suppose we define a positive definite kernel on the interval $[0,1]$, that is, $K:[0,1]^2\rightarrow \mathbb{R}$. Given a measure $\mu$ on $[0,1]$, Mercer's theorem allows us to express $K$ with respect to an orthonormal basis of functions $\sigma_i$, which are eigenfunctions of the associated Hilbert–Schmidt integral operator for $K$; i.e. we have $K(s,t) = \sum_{i=1}^{\infty}\lambda_i\sigma_{i}(s)\sigma_{i}(t)$. Using this, we explicitly construct the associated RKHS for $K$ as the set of functions expressible as a linear combination $\sum_{i=1}^{\infty}f_i\sigma_i(x)$ such that $\sum_{i=1}^{\infty}f_i^2/\lambda_i<\infty$. The inner product is given by $\langle f,g\rangle_H := \sum_{i=1}^{\infty}f_ig_i/\lambda_i$. One may show that this is the unique RKHS associated with the kernel $K$.

Their claim, then, is that this construction does not depend on the measure $\mu$. I contest this: the set of eigenfunctions $\sigma_i$ is highly dependent on the Hilbert–Schmidt operator, which in turn depends on the measure $\mu$. What am I missing here, or is this just a false claim? Thanks in advance.
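To experiment with this numerically, here is a rough sketch of the construction above (all choices here are my own for illustration: a Gaussian kernel with lengthscale $\ell=0.2$, a midpoint-rule discretization of the integral operator, and the test function $f = K(\cdot, x_0)$, whose RKHS norm squared should be $K(x_0,x_0)$ if the reproducing property holds). It builds the Mercer eigendecomposition under two different measures on $[0,1]$ and computes $\sum_i f_i^2/\lambda_i$ under each:

```python
import numpy as np

def rkhs_norm_sq(density, n=300, ell=0.2, x0=0.5):
    """Discretize the Hilbert-Schmidt operator for K under d mu(t) = density(t) dt
    on [0,1], form the Mercer eigendecomposition, and return sum_i f_i^2 / lambda_i
    for f = K(., x0)."""
    x = (np.arange(n) + 0.5) / n                  # midpoint grid on [0,1]
    w = density(x) / n                            # quadrature weights for d mu
    K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * ell**2))  # Gaussian kernel
    # Symmetrized operator M = sqrt(W) K sqrt(W): same eigenvalues as the
    # integral operator f -> K W f, but symmetric, so eigh applies.
    sw = np.sqrt(w)
    M = sw[:, None] * K * sw[None, :]
    lam, V = np.linalg.eigh(M)
    keep = lam > 1e-12 * lam.max()                # drop numerically-zero eigenvalues
    lam, V = lam[keep], V[:, keep]
    # Eigenfunctions: sigma_i(x_j) = V[j, i] / sqrt(w_j), orthonormal in L^2(mu).
    # Coefficients: f_i = <f, sigma_i>_mu = sum_j f(x_j) sigma_i(x_j) w_j.
    j0 = np.argmin(np.abs(x - x0))
    f = K[:, j0]                                  # f = K(., x0) sampled on the grid
    f_coef = V.T @ (sw * f)
    return np.sum(f_coef**2 / lam)                # candidate RKHS norm^2

uniform = lambda t: np.ones_like(t)               # Lebesgue measure on [0,1]
tilted  = lambda t: 2.0 * t                       # a different probability density

print(rkhs_norm_sq(uniform), rkhs_norm_sq(tilted))  # both come out close to 1.0
```

In this discretized setting, both values agree with $K(x_0,x_0)=1$ even though the eigenpairs $(\lambda_i, \sigma_i)$ themselves differ between the two measures, which is at least consistent with the book's claim.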