Would non-parametric functional neural networks be possible?


Most neural networks use sigmoid activation functions applied to linear combinations of the previous layer's outputs (which are themselves sigmoids of linear combinations of the layer before, and so on).
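To fix notation, here is a minimal sketch of that standard construction: a forward pass that alternates linear combinations with a fixed sigmoid. The layer sizes and random weights are illustrative choices, not anything from a specific architecture.

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid, the fixed activation in this setup."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(v, weights):
    """Alternate linear combinations (W @ a) with sigmoids, layer by layer."""
    a = v
    for W in weights:
        a = sigmoid(W @ a)
    return a

# toy example: a 3 -> 4 -> 2 network with arbitrary fixed weights
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
out = forward(np.ones(3), weights)
print(out)  # two values, each in (0, 1)
```

In the usual setting, training adjusts the entries of the `weights` matrices while `sigmoid` stays the same everywhere.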

Often one has an explicit algebraic/analytic function and its derivative ready for calculation. But what if, instead of altering a coefficient, we altered the function itself? What would then be unique to each neuron is not a linear weight (on its axon's insulation) strengthening or weakening the signal to the next layer, but the very function it applies to the linear combination of its inputs (dendrites).

Expressed in equations: instead of keeping $\sigma_k$ fixed for all $k$ and optimizing with respect to ${\bf W_k}, \forall k$ in $$\min_{{\bf W_k}}\|\sigma_n({\bf W_{n}}\cdots\sigma_2({\bf W_2}\sigma_1({\bf W_1v}))\cdots) - gt\|$$ we would optimize with respect to the $\sigma_k, \forall k$ instead: $$\min_{\sigma_k}\|\sigma_n({\bf W_{n}}\cdots\sigma_2({\bf W_2}\sigma_1({\bf W_1v}))\cdots) - gt\|$$ where $gt$ denotes the ground truth.
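One way to make the second minimization concrete is to restrict each $\sigma$ to a linear span of fixed basis functions, $\sigma(x) = \sum_j c_j \phi_j(x)$, and optimize the coefficients $c_j$; the "non-parametric" problem then reduces to a tractable one. The sketch below does this for a single layer with frozen weights, where the optimal coefficients come from ordinary least squares. The basis $\{x, \tanh x, x^2\}$ and the toy target are arbitrary choices for illustration.

```python
import numpy as np

def basis(x):
    """Fixed basis functions phi_j; sigma(x) = sum_j c_j * phi_j(x).
    (hypothetical basis choice, for illustration only)"""
    return np.stack([x, np.tanh(x), x**2], axis=-1)

rng = np.random.default_rng(1)
W = rng.standard_normal((1, 2))        # weights W_1, held fixed (not optimized)
X = rng.standard_normal((100, 2))      # a batch of inputs v
target = np.sin(X @ W.T).ravel()       # toy ground truth gt

# With W fixed, min over sigma of ||sigma(W v) - gt|| becomes a linear
# least-squares problem in the basis coefficients c.
z = (X @ W.T).ravel()                  # pre-activations W v
Phi = basis(z)                         # (100, 3) design matrix
c, *_ = np.linalg.lstsq(Phi, target, rcond=None)
residual = np.linalg.norm(Phi @ c - target)
print(c, residual)
```

With several layers this decoupling no longer holds exactly, but the same parametrization lets each $\sigma_k$ be trained by gradient descent alongside (or instead of) the weights.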


Since the field of neural networks is quite mature, having existed for over $50$ years and been very active since at least the $1980$s, I am very open to references to published work.