Why is the universe of a neural network a ring K?


I wish to find an algebraic or category-theoretic approach to describing neural networks, in particular to use algebraic methods to extract the concept of 'synaptic weight'. For the moment I have found this (pages 4-5):

Definition 1 The universe of a neural network is a set that contains all of the possible values in a neural network. The universe of a neural network is a ring K.

Remark 1 We have not defined what the ring K is. Basically, the universe of a neural network is a set that enables the operations between the components, with properties such as closure. Normally the universe of a neural network is the set of real numbers.
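To make the role of closure concrete, here is a minimal sketch of my own (not from the cited text): a "linear neuron" defined purely in terms of the ring operations of a universe K. Because K is closed under + and ×, the weighted sum is guaranteed to land back in K, whether K is ℝ or, say, the ring Z/5Z.

```python
def linear_neuron(weights, inputs, add, mul, zero):
    """Weighted sum computed using only the ring operations of K.

    Closure of K under `add` and `mul` is what makes the result
    another element of the same universe.
    """
    acc = zero
    for w, x in zip(weights, inputs):
        acc = add(acc, mul(w, x))
    return acc

# Universe K = R, the usual choice:
y_real = linear_neuron([0.5, -1.0], [2.0, 3.0],
                       lambda a, b: a + b,
                       lambda a, b: a * b, 0.0)

# Universe K = Z/5Z (integers mod 5) -- a different ring, same definition:
y_mod5 = linear_neuron([2, 3], [4, 1],
                       lambda a, b: (a + b) % 5,
                       lambda a, b: (a * b) % 5, 0)
```

Nothing in the definition of the neuron refers to signals or amplitudes; it only uses the two ring operations, which is exactly the abstraction the quoted Definition 1 seems to be after.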

Instead, the standard definition takes a poor approach:

synaptic weight is the amplitude of a connection between two nodes

the amplitude of a periodic variable is a measure of its change over a single period

I don't like this description in physical or biological terms; I don't like the terms 'amplitude', 'neuron', or 'synaptic weight'. I prefer something like this:

In a computational neural network, a vector or set of inputs x and outputs y, or pre- and post-synaptic neurons respectively, are interconnected with synaptic weights represented by the matrix w, where for a linear neuron
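The quoted passage breaks off where the equation would follow; presumably (my reconstruction, not part of the original quote) it is the usual linear-neuron relation, with the weight matrix w acting on the input vector x:

```latex
y_j = \sum_i w_{ji}\, x_i, \qquad \text{or in matrix form} \quad y = w x .
```

Under this reading, the 'output' is nothing more than the image of x under the linear map determined by w.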

But I don't like the term 'output', because it carries the perception of some 'signal'. In mathematical terms, what can this 'output' be?