RKHS for one-dimensional discontinuous jump functions


I would like to do kernel regression on the space of functions from $\mathbb{R}$ to $\mathbb{R}$ that have a countable number of jump discontinuities and are otherwise continuous (in particular, at each point the left and right limits are well-defined and finite).

The usual theory of RKHSs for machine learning is developed for a continuous kernel function $k: \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}$, which generates an RKHS consisting only of continuous functions. Question: is there a way to develop a sound theory of non-parametric regression for functions with jump discontinuities? (I'm interested only in the one-dimensional case.)
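To make the obstruction concrete, here is a minimal numpy sketch (my own illustration, with arbitrarily chosen kernel width and regularization) of kernel ridge regression with a Gaussian kernel fit to noisy samples of a unit step: because the kernel is continuous, the estimator is continuous and smooths through the jump instead of reproducing it.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(x, y, ell=0.1):
    # Gaussian kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2)); ell is an
    # illustrative choice of bandwidth, not prescribed by the question.
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * ell ** 2))

# Target: unit step at 0, sampled with noise on [-1, 1]
X = np.linspace(-1.0, 1.0, 200)
y = (X >= 0).astype(float) + 0.05 * rng.standard_normal(X.size)

lam = 1e-3  # ridge regularization (arbitrary small value)
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(X.size), y)

def predict(x_new):
    # Kernel ridge predictor: a finite linear combination of continuous
    # kernel sections, hence itself a continuous function.
    return rbf(np.asarray(x_new, dtype=float), X) @ alpha

# Away from the jump the fit is close to the two levels, but at the
# discontinuity it interpolates smoothly between them.
print(predict([-0.5, 0.0, 0.5]))
```

The point of the demo is only qualitative: no choice of bandwidth fixes this, since every function in the RKHS of a continuous (bounded) kernel is continuous.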

I have thought of possible workarounds. One could try to fit the empirical integral of such a function (which is continuous and can therefore be handled within standard RKHS theory), or fit the function in a sequence of RKHSs whose kernels are regularizations of a discontinuous kernel and prove $\Gamma$-convergence of the minimizers of the fitting problems. However, I would like to know whether there is an already developed theory that I can use, without having to resort to a workaround.
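For what it's worth, here is a hedged numpy sketch of the first workaround, under my own choice of kernel and data (not an established recipe). A jump in $f$ becomes a kink in its integral $F(x) = \int_0^x f(t)\,dt$, and the Brownian-motion kernel $k(s,t) = \min(s,t)$ on $[0,1]$ has as its RKHS the space of absolutely continuous $F$ with $F(0)=0$ and $F' \in L^2$, which does contain kinked functions; so one can fit $F$ there and differentiate numerically to recover the jump.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples of a unit jump at x = 0.5 on the grid [0, 1]
X = np.linspace(0.0, 1.0, 201)
y = (X >= 0.5).astype(float) + 0.05 * rng.standard_normal(X.size)

# Empirical integral F(x) via the cumulative trapezoid rule; it is
# continuous, with a kink (not a jump) at x = 0.5.
F_emp = np.concatenate([[0.0], np.cumsum((y[1:] + y[:-1]) / 2 * np.diff(X))])

# Kernel ridge regression of F_emp with the min kernel, whose RKHS
# (the Cameron-Martin space of Brownian motion) admits kinks.
K = np.minimum(X[:, None], X[None, :])
lam = 1e-4  # illustrative regularization strength
alpha = np.linalg.solve(K + lam * np.eye(X.size), F_emp)
F_fit = K @ alpha

# Differentiate the fitted integral to recover f, jump included.
f_rec = np.gradient(F_fit, X)
```

The recovered `f_rec` is close to 0 left of the jump and close to 1 right of it; the obvious caveat, which is why I call this a workaround, is that numerical differentiation amplifies whatever noise survives the fit.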