Are there examples or research trends for finding approximating expansions of nonlinear or non-smooth functions that retain some of the nice properties of the Taylor expansion, e.g. the possibility of keeping only the first few terms and still achieving a good approximation?
I have read the previous question Taylor expansion of a non smooth function, but it is very narrow in scope and so is the answer. There is a hint that fractional Taylor expansions may serve this purpose: https://www.sciencedirect.com/science/article/pii/S0898122106000861 ("Modified Riemann-Liouville derivative and fractional Taylor series of nondifferentiable functions further results"). But is this a general theory of nonlinear approximation, and is it the state of the art?
I am trying to think about approximating ReLU networks in the style of the deep learning theory book https://arxiv.org/abs/2106.10165, but I need some guidance on which tools can be applied. That book stops at exactly this question.
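To make the difficulty concrete, here is a minimal sketch (my own illustration, not from the book) of why the classical expansion fails for ReLU and of one common workaround: replacing ReLU with a smooth surrogate such as softplus, which does admit an ordinary Taylor series. The function names and the truncation order are my choices for illustration.

```python
import math

def relu(x):
    # max(0, x); not differentiable at x = 0, so no classical
    # Taylor expansion exists around the kink
    return max(0.0, x)

def softplus(x, beta=1.0):
    # smooth surrogate log(1 + exp(beta*x)) / beta;
    # converges pointwise to ReLU as beta -> infinity
    return math.log1p(math.exp(beta * x)) / beta

def softplus_taylor2(x):
    # second-order Taylor polynomial of softplus (beta = 1) at 0:
    # f(0) = log 2, f'(0) = sigmoid(0) = 1/2, f''(0) = 1/4,
    # so T2(x) = log 2 + x/2 + x^2/8
    return math.log(2.0) + x / 2.0 + x * x / 8.0

# near 0 the truncated series tracks softplus closely,
# while softplus with large beta tracks ReLU
for x in (-0.5, -0.1, 0.0, 0.1, 0.5):
    print(x, relu(x), softplus(x), softplus_taylor2(x))
```

This shows the trade-off I am asking about: the surrogate is smooth and truncatable, but the approximation to ReLU itself degrades near the kink for any fixed beta, which is exactly where a non-smooth expansion theory would be needed.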
I'm not sure how much help this is as an answer, but I recently had the same (or a similar) question and found a math paper that ostensibly works toward this: Piecewise Polynomial Taylor Expansions—The Generalization of Faà di Bruno's Formula.
It appears to generalize Taylor's theorem and series, normally stated for smooth functions, to non-smooth (piecewise smooth) functions, with the help of Faà di Bruno's formula. There are several similar papers by this research group, so hopefully this points you in a good direction. My research background is more on the engineering side, and unfortunately I don't have much skill at deciphering math papers or locating the practical material among the proofs and propositions.