I know the universal approximation theorem says that a neural network can approximate any continuous function (on a compact domain) to arbitrary precision; however, it does not bound the size of the network. A more useful theorem would answer: what kinds of functions can be approximated by polynomial-sized neural networks? I did some literature search and found Theorem 1 in https://arxiv.org/pdf/1410.1141.pdf, which says that every function computable in polynomial time can be expressed by a poly-sized neural network. However, that theorem seems to be restricted to boolean inputs. I'm wondering whether the analogous statement holds in the real-valued setting: for every real-valued function that can be evaluated in polynomial time, can we always construct a corresponding polynomial-sized neural network that approximates it to arbitrary precision?
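For context on the boolean case: the standard construction behind results like that theorem simulates each gate of a boolean circuit with a constant number of threshold units, so a polynomial-size circuit yields a polynomial-size network. A minimal sketch of that gate-level simulation (the specific weights and biases are one standard choice, not unique):

```python
def threshold_unit(weights, bias):
    """Return a neuron computing step(w . x + b), i.e. 1 iff w . x + b >= 0."""
    def neuron(x):
        return int(sum(w * xi for w, xi in zip(weights, x)) + bias >= 0)
    return neuron

# Boolean gates as single threshold neurons over {0, 1} inputs.
AND = threshold_unit([1, 1], -2)   # fires only when both inputs are 1
OR  = threshold_unit([1, 1], -1)   # fires when at least one input is 1
NOT = threshold_unit([-1], 0)      # fires exactly when the input is 0

# A depth-2 "network" for XOR, composed gate-by-gate:
# XOR(a, b) = AND(OR(a, b), NOT(AND(a, b)))
def XOR(a, b):
    return AND([OR([a, b]), NOT([AND([a, b])])])

for a in (0, 1):
    for b in (0, 1):
        assert XOR(a, b) == (a ^ b)
```

Since each gate costs O(1) neurons and O(1) extra depth, the network size stays polynomial in the circuit size. The open part of my question is whether an analogous, efficiently-sized construction exists once the inputs and outputs are real-valued rather than boolean.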
I'd really appreciate it if someone could point me to relevant literature on this question.