Universality of artificial neural networks - can they approximate functions in $L^2$ or $L^p$ spaces?


I was reading about the Universal Approximation Theorem for neural networks, and it seems to hold only for continuous functions defined on the unit hypercube in $m$-dimensional space: "the space of continuous functions on $I_{m}$ is denoted by $C(I_{m})$".
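To make the setting concrete, here is a minimal sketch (my own illustrative setup, not from the theorem statement) of the kind of approximator the theorem covers: a single hidden layer of sigmoidal units, $\sum_j c_j\,\sigma(w_j x + b_j)$, fit to a continuous target on $I_1 = [0,1]$. For simplicity the hidden weights are random and only the output weights are fit by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigma(z):
    # Sigmoidal activation, as in the classical UAT statement.
    return 1.0 / (1.0 + np.exp(-z))

# Target: a continuous function on I_1 = [0, 1] (illustrative choice).
f = lambda x: np.sin(2 * np.pi * x)
x = np.linspace(0.0, 1.0, 200)

# Single hidden layer with random weights/biases (an assumed, simple scheme;
# the theorem only asserts that SOME choice of parameters works).
n_hidden = 50
w = rng.normal(0.0, 10.0, n_hidden)
b = rng.normal(0.0, 10.0, n_hidden)
H = sigma(np.outer(x, w) + b)  # hidden-layer activations, shape (200, 50)

# Fit the output weights c by least squares.
c, *_ = np.linalg.lstsq(H, f(x), rcond=None)

# Sup-norm error on the grid, i.e. the uniform (C(I_m)) sense of approximation.
err = np.max(np.abs(H @ c - f(x)))
print(err)
```

The theorem's notion of "approximation" is exactly this uniform (sup-norm) error on the compact set, which is why $C(I_m)$ appears in the statement.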

My question is: is this function space $C(I_{m})$ a subspace of $L^2$? If it is a *proper* subspace, does that mean that ANNs are not really "universal" after all, since there would be $L^2$ functions outside the theorem's scope? And can statistics or signal-processing techniques approximate such functions that ANNs cannot?