Linear Activation Function in Neural Network


[Figure: graph of the output $z$ vs. the weighted sum of the inputs]
The linear activation function $$ z = b+\sum_{i=1}^k x_iw_i $$ is used in simple linear neurons in a neural network. Graphs of $z$ versus the weighted sum of the inputs plus the bias show the line passing through the origin, $(0, 0)$. Is this a strict requirement? Can the line not pass through the origin?

1 Answer
This is not a strict requirement for activation functions in general, but it is for the simple linear neuron. For instance, the sigmoid activation function has a $y$-intercept of $1/2$, whereas for a linear activation an intercept of $0$ is typical.
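To make the intercepts concrete, here is a minimal Python sketch (the function names are my own) that evaluates both activations at $z = 0$ and shows how a nonzero bias shifts the neuron's output even though the linear activation itself maps $0$ to $0$:

```python
import math

def linear(z):
    # Identity activation: the output equals the weighted-sum input.
    return z

def sigmoid(z):
    # Logistic sigmoid: squashes z into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

# Intercepts at z = 0: the linear activation passes through the origin,
# while the sigmoid crosses the y-axis at 1/2.
print(linear(0.0))   # 0.0
print(sigmoid(0.0))  # 0.5

# A nonzero bias b shifts the line away from the origin as a function
# of the inputs, even though the activation itself is unchanged.
b, w, x = 1.5, 2.0, 0.75   # illustrative values, not from the question
z = b + w * x
print(linear(z))     # 3.0
```

So the line through the origin is a property of the identity activation applied to $z$; the bias term $b$ is what lets the overall neuron output have a nonzero intercept with respect to the inputs.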