I can't seem to get my head around the behaviour of different kernels in the univariate case when using a Gaussian process regressor.
For example, this is the RBF kernel: $$K(x, x') = \exp\left(-\frac{\|x - x'\|^2}{2\sigma^2}\right)$$ In the univariate case $x' = x$, so the kernel becomes $K(x, x) = \exp\left(-\frac{\|x - x\|^2}{2\sigma^2}\right)$, which resolves to $K(x, x) = e^0 = 1$.
So as far as I can tell, every kernel function yields $1$ for every given value, and yet changing a kernel's parameters changes the outcome of the prediction when using the Gaussian process regressor.
I've tried this with scikit-learn's GP implementation, using various kernels. Every kernel has a Euclidean distance term in its exponent, so it should evaluate to $1$ in the univariate case. Still, changing the various kernel parameters changes the quality of the learned model.
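Here is a stripped-down version of the kind of experiment I mean (a hand-rolled GP posterior mean instead of scikit-learn, with made-up toy data, but it shows the same effect I'm confused about):

```python
import numpy as np

def rbf(a, b, sigma):
    """RBF kernel matrix between two sets of univariate points."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * sigma ** 2))

# Toy univariate training data (made up for illustration).
X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.sin(X)
X_test = np.array([1.5])

noise = 1e-6  # small jitter for numerical stability
means = {}
for sigma in (0.3, 1.0, 3.0):
    K = rbf(X, X, sigma) + noise * np.eye(len(X))
    k_star = rbf(X_test, X, sigma)
    # GP posterior mean: k_* K^{-1} y
    means[sigma] = (k_star @ np.linalg.solve(K, y))[0]
    print(sigma, means[sigma])  # the prediction changes with sigma
```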
Please enlighten me! Thanks in advance!
In the univariate case, $x'$ need not be equal to $x$: the kernel is evaluated at every *pair* of training points, not only at identical points.
For example, the inputs $0$ and $1$ give $$K(0,1)=\exp\left(-\frac{1}{2\sigma^2}\right)$$
Changing $\sigma$ would change the kernel value.
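To make this concrete, here is a small numpy sketch (the point values are made up): the diagonal entries $K(x, x)$ of the kernel matrix are always $1$, but the off-diagonal entries depend on $\sigma$, and those are what drive the regression.

```python
import numpy as np

def rbf_kernel(x, x_prime, sigma=1.0):
    """RBF kernel for univariate inputs (broadcasts over arrays)."""
    return np.exp(-np.abs(x - x_prime) ** 2 / (2 * sigma ** 2))

# Diagonal entries K(x, x) are always 1, regardless of sigma ...
print(rbf_kernel(0.5, 0.5, sigma=0.1))   # 1.0
print(rbf_kernel(0.5, 0.5, sigma=10.0))  # 1.0

# ... but off-diagonal entries K(x, x') with x != x' depend on sigma:
print(rbf_kernel(0.0, 1.0, sigma=1.0))   # exp(-0.5) ≈ 0.6065
print(rbf_kernel(0.0, 1.0, sigma=0.5))   # exp(-2.0) ≈ 0.1353

# The full kernel (Gram) matrix over a univariate training set:
X = np.array([0.0, 0.5, 1.0])
K = rbf_kernel(X[:, None], X[None, :], sigma=1.0)
print(K)  # ones on the diagonal, sigma-dependent values elsewhere
```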