Vapnik–Chervonenkis (VC) classes of functions are important in empirical process theory and in statistical learning theory. However, determining the VC index (or VC dimension) of a given class of functions is not easy.
I would like to compile a list of (useful) function classes that are also VC classes, but it has been surprisingly hard to find many examples, either online or in the classic empirical process texts...
Does anyone know of a reference that gives good (and many) examples of VC classes?
Unfortunately, I don't know of any such comprehensive list, but let me gather some of the disparate resources I've seen; maybe it will be useful. Most of it is from the learning-theory perspective.
Overall, perhaps surprisingly, I get the feeling that such a list will have to be assembled from individual papers in the literature that treat specific examples. Most of the recent VC dimension work I am aware of concerns neural networks (e.g., "Nearly-tight VC-dimension bounds for piecewise linear neural networks", Harvey et al., 2017) or other learners, like decision trees.
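As an aside, for very simple classes you can at least verify a claimed VC dimension by brute force. The sketch below (hypothetical helper names, not from any reference) checks shattering for the class of intervals on the real line, which is the standard example of a class with VC dimension 2: any 2 points are shattered, but no 3 points are, since the labeling that excludes only the middle point is unrealizable.

```python
# Brute-force shattering check for a toy class: indicator functions of
# intervals 1[a <= x <= b] on the real line. Helper names are illustrative.
from itertools import combinations

def intervals_shatter(points):
    """True iff the class {1[a <= x <= b]} realizes all 2^n labelings of `points`."""
    points = sorted(points)
    n = len(points)
    achieved = {(0,) * n}  # the empty interval labels every point 0
    # Intervals with endpoints at sample points suffice to realize
    # every labeling this class can achieve on the sample.
    for i in range(n):
        for j in range(i, n):
            a, b = points[i], points[j]
            achieved.add(tuple(int(a <= x <= b) for x in points))
    return len(achieved) == 2 ** n

def vc_dim_on_sample(sample, shatters):
    """Largest d such that some d-subset of `sample` is shattered.
    This only lower-bounds the true VC dimension (it sees just this sample)."""
    best = 0
    for d in range(1, len(sample) + 1):
        if any(shatters(list(s)) for s in combinations(sample, d)):
            best = d
        else:
            break
    return best

print(vc_dim_on_sample([0.0, 1.0, 2.0, 3.0], intervals_shatter))  # -> 2
```

Of course, this only scales to tiny examples; for realistic classes (halfspaces, neural networks, trees) one needs the combinatorial arguments in the papers above.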
Other related questions:
Resource listing models with known VC dimension [TCS]
Resource / book for recent advances in statistical learning theory [TCS]
Introductory resources on Computational Learning Theory [TCS]