Structure of RKHS induced by a Gaussian kernel


I am studying Reproducing Kernel Hilbert Spaces (RKHS) in the context of Maximum Mean Discrepancy (MMD). The following points summarize what I've understood so far:

  • If $X$ is a set and $H$ a Hilbert space of functions on $X$, then $k: X \times X \to \mathbb{R}$ is a reproducing kernel for $H$ if $k(\cdot, x) \in H$ for every $x \in X$ and the reproducing property holds, i.e. for every $x \in X$ and every $f \in H$, \begin{equation} \langle f, k(\cdot, x) \rangle_H = f(x). \end{equation}
  • A function $k: X \times X \to \mathbb{R}$ is a kernel iff there exist a Hilbert space $H$ and a feature map $\phi: X \to H$ such that \begin{equation} k(x, y) = \langle \phi(x), \phi(y) \rangle_H. \end{equation}
  • From these definitions, a reproducing kernel of an RKHS $H$ is clearly a kernel, with associated (canonical) feature map $\phi(x) = k(\cdot, x)$. Conversely, every positive definite function $k(x, y)$ is associated with a unique RKHS $H$ that has $k$ as its reproducing kernel (Moore–Aronszajn theorem).
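The reproducing property can be checked numerically for functions in the span of the kernel sections: if $f = \sum_i \alpha_i k(\cdot, x_i)$, then $\langle f, k(\cdot, x) \rangle_H = \sum_i \alpha_i k(x_i, x) = f(x)$. A minimal sketch with the Gaussian kernel (the points, weights, and bandwidth $\sigma = 1$ below are arbitrary choices for illustration):

```python
import numpy as np

sigma = 1.0  # arbitrary bandwidth for this illustration

def k(x, y):
    """Gaussian kernel on the real line."""
    return np.exp(-np.abs(x - y) ** 2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
xs = rng.normal(size=5)      # centers x_i
alphas = rng.normal(size=5)  # coefficients alpha_i

def f(x):
    """A function in the span of kernel sections: f = sum_i alpha_i k(., x_i)."""
    return sum(a * k(xi, x) for a, xi in zip(alphas, xs))

# For such f, the RKHS inner product with k(., x0) reduces to
# sum_i alpha_i k(x_i, x0), which equals the evaluation f(x0).
x0 = 0.3
inner_product = sum(a * k(xi, x0) for a, xi in zip(alphas, xs))
assert np.isclose(inner_product, f(x0))
```

This only verifies the identity on the dense subspace spanned by kernel sections; the full RKHS is its completion.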

So my question is the following. Consider the Gaussian kernel $k: X \times X \to \mathbb{R}$, $k(x_1, x_2) = e^{-\frac{\|x_1 - x_2\|^2}{2\sigma^2}}$, and then the same formula defined on another space $Y$, i.e. $k: Y \times Y \to \mathbb{R}$, $k(y_1, y_2) = e^{-\frac{\|y_1 - y_2\|^2}{2\sigma^2}}$, where $Y$ may a priori be a very different space with different properties (in an ML context, you can think of $X$ as the feature space and $Y$ as the label space).
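Concretely, the same closed-form expression can be evaluated on both domains; only the inputs differ. A sketch computing the two Gram matrices (the dimensions, sample sizes, and $\sigma = 1$ are hypothetical choices, with $X \subset \mathbb{R}^3$ standing in for features and $Y \subset \mathbb{R}$ for labels):

```python
import numpy as np

sigma = 1.0  # arbitrary bandwidth for this illustration

def gaussian_gram(A, B):
    """Gram matrix of the Gaussian kernel between rows of A and rows of B."""
    # Pairwise squared Euclidean distances via the expansion
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 <a, b>
    d2 = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-np.maximum(d2, 0.0) / (2 * sigma ** 2))

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 3))  # points in a feature space, here R^3
Y = rng.normal(size=(4, 1))  # points in a label space, here R

K_X = gaussian_gram(X, X)  # kernel on X x X
K_Y = gaussian_gram(Y, Y)  # same formula, different domain
```

Both Gram matrices are symmetric positive semidefinite with unit diagonal, but they are built from points living in different spaces, which is exactly what the question below is about.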

What is the relationship between the two kernels, and do they induce the same RKHS? Namely, if $H$ and $H'$ are the two induced spaces and I write \begin{equation}\langle \phi(x_1), \phi(x_2) \rangle_H \quad \quad \langle \tilde\phi(y_1), \tilde\phi(y_2) \rangle_{H'}, \end{equation} will $H = H'$? And $\phi = \tilde\phi$?

Maybe it is a trivial question, but I would really appreciate it if someone could clarify these points.