How to derive an expression between the input and output layers of a ReLU neural network?


A ReLU DNN maps its input to its output through a piecewise linear relationship: each neuron outputs either 0 or its pre-activation value. The model is therefore not a single affine function over the whole input space, but within each activation region (each fixed pattern of active/inactive neurons) it reduces to an affine function y = Ax + B, where "x" is the input and "y" is the output of the network.

How can I determine the values of "A" and "B" for a given input? Is there a systematic approach to obtain these coefficients? I have trained the model and have access to its weights.
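One way to sketch this: run a forward pass, record which ReLUs are active, and compose the layer weights with the resulting 0/1 masks. For an input x in a fixed activation region, A is the product of the weight matrices with inactive rows zeroed out, and B accumulates the biases through the same masked maps. Below is a minimal NumPy sketch for a plain feedforward net stored as lists of weight matrices and bias vectors; the function name `local_affine` and the data layout are assumptions, not a standard API.

```python
import numpy as np

def local_affine(weights, biases, x):
    """Return (A, B) such that net(x') = A @ x' + B for every x'
    sharing the same ReLU activation pattern as x.
    `weights[i]`/`biases[i]` are the parameters of layer i; ReLU is
    applied after every layer except the last (assumed architecture)."""
    A = np.eye(len(x))            # running linear map, starts as identity
    B = np.zeros(len(x))          # running offset, starts at zero
    z = np.asarray(x, dtype=float)
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = W @ z + b             # pre-activation of layer i
        A = W @ A                 # compose the affine map so far
        B = W @ B + b
        if i < len(weights) - 1:  # ReLU on hidden layers only
            mask = (z > 0).astype(float)
            z = z * mask          # apply ReLU to the forward pass
            A = mask[:, None] * A # zero out rows of inactive neurons
            B = mask * B
    return A, B
```

Any input x' inside the same linear region then satisfies net(x') = A @ x' + B exactly; the coefficients change as soon as x' crosses a region boundary (i.e. some neuron flips between active and inactive).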