One of the best-known kernels (which in my case is used as the covariance function of a Gaussian process) is the dot product kernel $k\left(x, x'\right)= x \cdot x' =\sum_{i=1}^{n} x_{i} x'_{i}$ with $x, x' \in \mathbb{R}^n$. It is symmetric and positive semidefinite, hence a valid kernel.
For an infinite-dimensional vector, $y \in \mathbb{R}^\infty$, which I would interpret as a function rather than a vector (I'm shaky here), e.g. $y(r) = e^{-\frac{1}{2}r^2}$, would
$$k(y, y') = y \cdot y' = \sum_{i=1}^\infty y_i y'_i = \int y(r)\, y'(r)\, \mathrm{d} r$$
also be a valid kernel? It seems symmetric, but I'm not sure it is positive semidefinite, nor whether my hand-wavy way of going from the sum to the integral is even vaguely correct.
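(As a quick numerical sanity check of that last step: if I sample the functions on a fine grid, the ordinary dot product of the sample vectors, scaled by the grid spacing, should approach the integral. A sketch in Python; the shifted Gaussian $y' = e^{-\frac{1}{2}(r-1)^2}$ is just a made-up second input.)

```python
import numpy as np

# Discretize r on a fine grid, so the "infinite-dimensional vector" becomes
# an ordinary vector of samples and the dot product becomes a Riemann sum.
r = np.linspace(-10, 10, 200001)
dr = r[1] - r[0]

y1 = np.exp(-0.5 * r ** 2)        # the example function y from above
y2 = np.exp(-0.5 * (r - 1) ** 2)  # a hypothetical second function y'

riemann = np.dot(y1, y2) * dr            # sum_i y_i y'_i, scaled by dr
exact = np.sqrt(np.pi) * np.exp(-0.25)   # closed form of the integral of y*y'
print(riemann, exact)                    # the two agree to high precision
```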
EDIT: I found a "counterexample". Negative eigenvalues of the Gram matrix would imply that the kernel is not positive semidefinite (PSD). With a dataset of three data points
$$\vec{y}=(y_1, y_2, y_3)^T =\begin{pmatrix} \mathcal{N}(-1,1.5)+\mathcal{N}(0, 0.1)\\ \mathcal{N}(2,1.5)+\mathcal{N}(3, 0.1)\\ \mathcal{N}(4,1.5)+\mathcal{N}(5, 0.1) \end{pmatrix} \ , $$
where $\mathcal{N}(\mu, \sigma)$ denotes the probability density function of the normal distribution with mean $\mu$ and standard deviation $\sigma$, $\mathcal{N}(\mu, \sigma) = \frac{e^{-\frac{(x-\mu)^{2}}{2\sigma^2}}}{\sigma \sqrt{2 \pi}}$, the following Gram matrix is generated,
$$ K = \int (y_1, y_2, y_3)^T \cdot (y_1, y_2, y_3) \,\mathrm{d}x = \left( \begin{array}{ccc} 3.43442 & 0.186413 & 0.0680187 \\ 0.186413 & 0.613469 & 0.433659 \\ 0.0680187 & 0.433659 & 0.188063 \\ \end{array} \right) \ . $$
(Note, all elements have a tiny imaginary part which I left out for clarity, but it doesn't change the result.) For instance, the top-left element is calculated as $\int_{-\infty }^{\infty } \mathcal{N}(-1,1.5)^2 + 2\, \mathcal{N}(-1,1.5)\, \mathcal{N}(0, 0.1) + \mathcal{N}(0, 0.1)^2 \,\mathrm{d}x = 3.43442$. The eigenvalues of this Gram matrix, $K$, are 3.44958, 0.869092, and -0.0827286. The negative eigenvalue implies that the matrix is not PSD.
I did all this in Mathematica with
Eigenvalues[
 Integrate[
  Table[
    {PDF[NormalDistribution[m, 1.5], x] +
      PDF[NormalDistribution[m + 1, 0.1], x]}, {m, {-1, 2, 4}}
    ] . {Table[
     PDF[NormalDistribution[m, 1.5], x] +
      PDF[NormalDistribution[m + 1, 0.1], x], {m, {-1, 2, 4}}
     ]},
  {x, -Infinity, Infinity}]]
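For reference, the top-left entry can also be checked in closed form via the standard Gaussian integrals
$$\int_{-\infty}^{\infty}\mathcal{N}(\mu,\sigma)^2\,\mathrm{d}x=\frac{1}{2\sigma\sqrt{\pi}} \ , \qquad \int_{-\infty}^{\infty}\mathcal{N}(\mu_1,\sigma_1)\,\mathcal{N}(\mu_2,\sigma_2)\,\mathrm{d}x=\frac{\exp\!\left(-\frac{(\mu_1-\mu_2)^2}{2(\sigma_1^2+\sigma_2^2)}\right)}{\sqrt{2\pi(\sigma_1^2+\sigma_2^2)}} \ ,$$
which give
$$K_{11}=\frac{1}{3\sqrt{\pi}}+\frac{1}{0.2\sqrt{\pi}}+2\cdot\frac{e^{-1/4.52}}{\sqrt{2\pi\cdot 2.26}}\approx 0.18806+2.82095+0.42541\approx 3.43442 \ ,$$
in agreement with the value above.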
First, you are correct about writing the summation as the integral: a sum is simply an integral with respect to the counting measure (look up "Integration with respect to counting measure").
And yes, it is a valid kernel, for the same reason as the finite-dimensional dot product kernel: it is the inner product on the sequence space $\ell^2$ (or, in the integral case, on $L^2$). For any finite set of points $y_1, \dots, y_n$ and coefficients $c_1, \dots, c_n$,
$$\sum_{i=1}^n\sum_{j=1}^n c_i c_j \sum_{m=1}^\infty y_{i,m} y_{j,m} = \sum_{i=1}^n c_i \sum_{m=1}^\infty y_{i,m} \Big(\sum_{j=1}^n c_j y_{j,m}\Big) = \sum_{m=1}^\infty \Big(\sum_{i=1}^n c_i y_{i,m}\Big)\Big(\sum_{j=1}^n c_j y_{j,m}\Big) = \sum_{m=1}^\infty \Big(\sum_{i=1}^n c_i y_{i,m}\Big)^2 \geq 0 \ ,$$
where exchanging the finite sums with the infinite sum (or integral) is allowed. Since every Gram matrix of this kernel is therefore PSD, the negative eigenvalue in your "counterexample" must be a numerical artifact of the integration (the stray imaginary parts you mention are a warning sign).
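As a concrete check, here is a sketch in Python (the helper name `overlap` is mine) that recomputes the Gram matrix from your question using the closed-form integral of a product of two Gaussian densities instead of numerical quadrature; every eigenvalue comes out positive:

```python
import numpy as np

def overlap(mu1, s1, mu2, s2):
    # Closed form for the integral of N(mu1, s1) * N(mu2, s2) over the real
    # line: the N(0, sqrt(s1^2 + s2^2)) density evaluated at mu1 - mu2.
    var = s1 ** 2 + s2 ** 2
    return np.exp(-(mu1 - mu2) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# The three data points from the question, each a sum of two Gaussian pdfs
# given as (mean, standard deviation) pairs.
ys = [[(-1, 1.5), (0, 0.1)],
      [( 2, 1.5), (3, 0.1)],
      [( 4, 1.5), (5, 0.1)]]

# Gram matrix: K[i, j] = integral of y_i * y_j, expanded by bilinearity.
K = np.array([[sum(overlap(m1, s1, m2, s2) for m1, s1 in yi for m2, s2 in yj)
               for yj in ys] for yi in ys])

print(np.round(K, 5))         # the top-left entry matches the 3.43442 above
print(np.linalg.eigvalsh(K))  # all three eigenvalues are positive
```

Note that all diagonal entries come out equal (each point is the same pair of bumps shifted along the axis), which already disagrees with the matrix in the question and points to the symbolic-numeric `Integrate` step as the culprit.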